Format JSON file from 2 columns

I would like to create a JSON file for a Python script to parse.
My data is currently in a text file in the format of:
url1,string1
url2,string2
url3,string3
url4,string4
I would like to manually create a JSON file that I could feed to a Python script that scrapes each URL for its string.
Thank you, I used your example to build something like it and it worked!
{"url": "url1", "string": "string1"}
{"url": "url2", "string": "string2"}
{"url": "url3", "string": "string3"}

Something like the following should work:
import csv
import json

field_names = ("url", "string")
# read each CSV row as a dict and write it out as one JSON object per line
with open('file.csv', 'r') as csv_file, open('file.json', 'w') as json_file:
    reader = csv.DictReader(csv_file, fieldnames=field_names)
    for row in reader:
        json.dump(row, json_file)
        json_file.write('\n')
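The resulting file.json then holds one JSON object per line (the JSON Lines layout), which the scraping script can consume line by line; a minimal sketch of the reading side:
import json

# each line of file.json is an independent JSON object
with open('file.json') as json_file:
    for line in json_file:
        record = json.loads(line)
        print(record["url"], record["string"])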

I may be misunderstanding your question, but if it's about converting this CSV into JSON manually, it would be:
[
  ["url1", "string1"],
  ["url2", "string2"],
  ["url3", "string3"],
  ["url4", "string4"]
]
If you prefer, you can use an online CSV-to-JSON converter.
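For completeness, a minimal Python sketch of consuming the array-of-pairs form above (the filename pairs.json is an assumption):
import json

# load the whole array of [url, string] pairs at once
with open('pairs.json') as f:
    pairs = json.load(f)

for url, string in pairs:
    print(url, string)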


Reading Local JSON data

How do I read the local JSON data if the structure is like this:
{
  "employees": [
    {
      "id": 1,
      "name": "Mike",
      "location": {
        "first-address": "India",
        "Sec-address": "Bangalore"
      }
    }
  ]
}
How do I access location using the Angular framework?
You have not provided much info in the question, but I'm guessing you've got a string and you want to convert it to JSON, which you do with JSON.parse:
const obj = JSON.parse('{ "employees": [ { "id": 1, "name": "Mike", "location": { "first-address": "India", "Sec-address": "Bangalore" } } ] }')
Now you can access as follows:
console.log(obj.employees[0].id)
Or if not in string format:
const obj = { "employees": [ { "id": 1, "name": "Mike", "location": { "first-address": "India", "Sec-address": "Bangalore" } } ] }
console.log(obj.employees[0].id)
The question needs more clarification, but according to my understanding: if you want to convert the string format into JSON, use JSON.parse and store the result in a variable named obj;
you can then access the location attribute with:
console.log(obj.employees[0].location)

How to extract in Python 3 a single value from a JSON key with multiple values?

import json
person = '{"name": "Bob", "languages": ["Italian", "English", "French"], "location": "Naples"}'
person_dict = json.loads(person)
print(list(person_dict.values())[1:1])
Hi guys, I have a little issue handling a JSON value in Python 3.
The question is simple: considering the code above, how can I extract only the 'Italian' value from the 'languages' key?
The code is surely wrong, because it gives nothing:
[]
Any help will be appreciated.
Edit:
The imported packages:
from urllib3 import PoolManager
import certifi
import json
The actual API output in JSON format:
{
  "error": [],
  "result": {
    "AUCTION1": {
      "asks": [
        ["281.00000", "0.163", 1609860353]
      ],
      "bids": [
        ["277.60000", "0.100", 1609860353]
      ]
    }
  }
}
And this is the code where I have the issue:
p_urll = PoolManager(ca_certs=certifi.where())
p_req = "https://api.auctionsite.com/0/public/Depth?pair=AUCTION1&count=1"
p_api_req = p_urll.request('GET', p_req)
p_api_res = json.loads(p_api_req.data.decode('utf-8'))
print(p_api_res['result']['AUCTION1']['asks'][0])
It's not as simple as the first example, my fault.
Using [0], the code gives this result:
['26098.60000', '0.781', 1609861809]
I need only the auction price, i.e. the first "asks" value (the "281.00000" from the example, as a number without quotes).
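Plain indexing answers both cases; a minimal sketch, reusing the person_dict and p_api_res variables from the code above:
# first element of the "languages" list
print(person_dict['languages'][0])  # Italian

# first ask, first field, converted from string to number
ask_price = float(p_api_res['result']['AUCTION1']['asks'][0][0])
print(ask_price)  # 281.0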

Inserting a JSON file into a Cassandra table

I am currently using the Cassandra-Ruby driver to insert data from a JSON file into an existing table in my database.
The JSON file looks like this:
[
  {
    "id": "123",
    "destination": "234",
    "type": "equipment",
    "support": "type 1",
    "test": "test1"
  },
  {
    "id": "234",
    "destination": "123",
    "type": "equipment",
    "support": "type 1",
    "test": "test1"
  }
]
I am reading in the file like this:
file = File.read('itemType.json')
data_hash = JSON.parse(file) #return an array of hashes
Then I iterate through the array and insert each hash into the table:
data_hash.each do |has|
#check the type of each object
#puts has.class #return hash
insert_statement = session.prepare('INSERT INTO keyspace.table JSON ?')
session.execute(insert_statement, [has]) #error occurs here
end
After running this code, I get this error message:
in `assert_instance_of': options must be a Hash
I checked that each object being inserted in the table is a hash, so I'm not sure why I'm getting this issue.
You say that you are inserting JSON, but you are not; you are trying to insert an object. See this example from the documentation:
INSERT INTO cycling.cyclist_category JSON '{
"category" : "Sprint",
"points" : 700,
"id" : "829aa84a-4bba-411f-a4fb-38167a987cda"
}';
You have to give it a JSON string if you do it like that.
Using .to_json adds \ escape characters. This gave me an error:
INSERT INTO organization_metadata JSON '{\"id\":9150,\"destroyed\":false,\"name\":\"ABC\",\"timestamp\":1510541801000000000}';
and the following worked:
INSERT INTO organization_metadata JSON '{"id":9150,"destroyed":false,"name":"ABC","timestamp":1510541801000000000}';
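The underlying point is language-agnostic: the INSERT ... JSON form wants one JSON string. For comparison, a minimal Python sketch showing that a standard serializer (json.dumps here) produces exactly the unescaped form that worked above:
import json

row = {"id": 9150, "destroyed": False, "name": "ABC",
       "timestamp": 1510541801000000000}

# json.dumps emits plain double quotes, not the backslash-escaped form
print(json.dumps(row))
# {"id": 9150, "destroyed": false, "name": "ABC", "timestamp": 1510541801000000000}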

Read JSON from a multiline file in Spark Scala

I'm learning Spark in Scala. I have a JSON file as follows:
[
  {
    "name": "ali",
    "age": "13",
    "phone": "09123455737",
    "sex": "m"
  },
  {
    "name": "amir",
    "age": "24",
    "phone": "09123475737",
    "sex": "m"
  }
]
and there is just this code:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val jsonFile = sqlContext.read.json("path-to-json-file")
I just receive a _corrupt_record: String column and nothing else, but when I put each person (object) on a single line, the code works fine.
How can I read multiline JSON with sqlContext in Spark?
You will have to read it into an RDD yourself and then convert it to a Dataset:
spark.read.json(sparkContext.wholeTextFiles(...).values)
This problem is caused by the JSON rows spanning multiple lines. By default spark.read.json expects each row to be on a single line, but this is configurable: you can set the multiLine option:
spark.read.option("multiLine", true).json("path-to-json-file")
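For reference, the same multiLine option works from PySpark as well; a minimal sketch (file path assumed):
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("multiline-json").getOrCreate()

# multiLine lets Spark parse a JSON array that spans several lines
df = spark.read.option("multiLine", True).json("path-to-json-file")
df.show()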

How to get a particular field from JSON in Ruby

I need to implement a simple shell utility in Ruby which parses JSON from a file and returns a particular field from it.
JSON examples to be parsed:
{"status": "fail", "messages": ["Out of capacity"]}
{"status": "success", "messages": [], "result": {"node": {"ip": "1.2.3.4", "description": "", "id": 974, "name": "VM#3"}}}
Idea is to create a CLI utility with two parameters: JSON file to read and field from JSON to extract:
./get_json_field.rb ~/tmp.XXXXXX 'result.node.ip'
./get_json_field.rb ~/tmp.XXXXXX 'messages.0'
I'm struggling with how to map the 2nd parameter onto the parsed JSON data structure in Ruby. I could certainly write an iterator, splitting the string into an array on dots and going through it item by item, but that doesn't look like an elegant solution.
Any suggestions for a more elegant way?
There is nothing wrong with splitting the string and going through its parts:
require 'json'
data1 = JSON.load('{"status": "fail", "messages": ["Out of capacity"]}')
data2 = JSON.load('{"status": "success", "messages": [], "result": {"node": {"ip": "1.2.3.4", "description": "", "id": 974, "name": "VM#3"}}}')
def get_from_json(data, query)
  query.split('.').inject(data) do |memo, key|
    key = key.to_i if memo.is_a? Array
    memo.fetch(key)
  end
end
get_from_json(data1, 'messages.0') # => "Out of capacity"
get_from_json(data2, 'result.node.ip') # => "1.2.3.4"
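For comparison, the same dot-path reduction translates directly to Python with functools.reduce; a minimal sketch:
import json
from functools import reduce

def get_from_json(data, query):
    # walk one dotted segment at a time, using integer
    # indices whenever the current node is a list
    def step(node, key):
        return node[int(key)] if isinstance(node, list) else node[key]
    return reduce(step, query.split('.'), data)

data1 = json.loads('{"status": "fail", "messages": ["Out of capacity"]}')
print(get_from_json(data1, 'messages.0'))  # Out of capacity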
Take a look at jq; it might already do what you are looking for:
jq '.messages[0]'
jq '.result.node.ip'
See http://stedolan.github.com/jq/