How do I access this JSON API data in Ruby?

I am writing a short Ruby program that is going to take a zipcode and return the names of cities within 2 miles of that zipcode. I successfully called an API and was able to parse the JSON data, but I'm unsure how to access the 'city' key.
require 'net/http'
require 'json'

url = "..." # API call URL (not replicated here since it requires a key)
uri = URI(url)
response = Net::HTTP.get(uri)
JSON.parse(response)
Here's what my JSON looks like.
{
  "results": [
    {
      "zip": "08225",
      "city": "Northfield",
      "county": "Atlantic",
      "state": "NJ",
      "distance": "0.0"
    },
    {
      "zip": "08221",
      "city": "Linwood",
      "county": "Atlantic",
      "state": "NJ",
      "distance": "1.8"
    }
  ]
}
I've been trying to access 'city' like this:
response['result'][0]['city']
This appears to be incorrect. Also tried
response[0][0]['city']
And a couple of other permutations of the same code.
How can I get the value 'Northfield' out of the JSON data?

You're almost there: use results instead of result, and index into the hash returned by JSON.parse(response) rather than into the raw response string:
JSON.parse(response)["results"][0]["city"]
#=> "Northfield"

JSON.parse will create a hash; you can then target results, which is an array of hashes, like so:
hash = JSON.parse(response)
hash['results'].select{|h| h['city'] == 'Northfield'}
Or if you only care about the results:
array = JSON.parse(response)['results']
array.select { |a| a['city'] == 'Northfield' }
To get just a single data point, index one item in the array and then the key whose value you want:
array[0]['city']
For all of the cities:
cities = array.map { |h| h['city'] }
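For reference, a quick sketch of what those calls return with the sample response above:
array = JSON.parse(response)['results']

array.select { |a| a['city'] == 'Northfield' }
#=> [{"zip"=>"08225", "city"=>"Northfield", "county"=>"Atlantic", "state"=>"NJ", "distance"=>"0.0"}]

array.map { |h| h['city'] }
#=> ["Northfield", "Linwood"]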

You have a typo: it should be results, not result, and you need to index into the parsed data rather than into the raw response string.
Note that JSON.parse returns string keys by default; pass symbolize_names: true if you prefer symbol keys.
After parsing the response (with symbolize_names: true), the :results key holds an array of hashes, i.e.
[{:zip=>"08225", :city=>"Northfield", :county=>"Atlantic", :state=>"NJ", :distance=>"0.0"}, {:zip=>"08221", :city=>"Linwood", :county=>"Atlantic", :state=>"NJ", :distance=>"1.8"}]
If you want to fetch the value of the city key from every hash, you can do this:
parsed = JSON.parse(response, symbolize_names: true)
parsed[:results].map{|x| x[:city]}
which will give the result
["Northfield", "Linwood"]

Related

JMeter: combine JSON Extractor variables to pass into request

There's a certain web request which has the following response:
{
  "data": {
    "articles": [
      {
        "id": "1355",
        "slug": "smart-device-connectivity's-impact-on-homes-workplaces",
        "title": "Smart device connectivity's impact on homes, workplaces",
        "published_at": "2022-01-28T21:30:00.000Z",
        "avg_rating": 0,
        "click_count": 60
      },
      {
        "id": "1363",
        "slug": "you-need-to-nurture-and-amplify-human-capabilities",
        "title": "You need to nurture and amplify human capabilities",
        "published_at": "2022-01-28T19:00:00.000Z",
        "avg_rating": 0,
        "click_count": 22
      }
    ]
  }
}
There are currently 702 records in total, and that number may increase or decrease over the coming months. I have successfully extracted id and slug into separate variables. My aim is to pass these two variables into another request in the following format, so that the request eventually runs once per record (i.e. as many times as the id/slug arrays are long):
testurl.com/insight/${id}/${slug}
Example:
testurl.com/insight/1355/smart-device-connectivity's-impact-on-homes-workplaces
testurl.com/insight/1363/you-need-to-nurture-and-amplify-human-capabilities
I made use of a ForEach Controller and was able to pass slug, but id does not work. Does anyone know the solution?
If you're using the ForEach Controller to iterate the slug variable, the id one needs to be handled a little differently:
use the __jm__ForEach Controller__idx pre-defined variable to get the current iteration of the ForEach Controller
use the __intSum() function to increment it by 1, as the above variable is zero-based
use the __V() function to evaluate the value of the id_x variable
putting everything together:
testurl.com/insight/${__V(id_${__intSum(${__jm__ForEach Controller__idx},1,)},)}/${slug}
What error do you get? I was able to emulate the same setup: I saved your JSON in a variable, added a ForEach Controller, put another JSON Extractor inside the ForEach, and then used the extracted values in the request. (Screenshots of the overall JMX structure and the final output omitted.)

Check if a value exists in a json file with python

I have the following JSON file (banneds.json):
{
  "players": [
    {
      "avatar": "https://steamcdn-a.akamaihd.net/steamcommunity/public/images/avatars/07/07aa315f664efa92456569429230bc2c254c3ff8_full.jpg",
      "created": 1595050663,
      "created_by": "<#128152620136267776>",
      "nick": "teste",
      "steam64": 76561198046619692
    },
    {
      "avatar": "https://steamcdn-a.akamaihd.net/steamcommunity/public/images/avatars/21/21fa5c468597e9c890212b2e3bdb0fac781c040c_full.jpg",
      "created": 1595056420,
      "created_by": "<#128152620136267776>",
      "nick": "ingridão",
      "steam64": 76561199058918551
    }
  ]
}
I want to insert new values if the value (supplied by the user) is not already in the JSON. However, when I try to check whether the value is already there, I get a false result. Here is an example of what I'm doing (not the original code, only an example):
import json

check = 76561198046619692

with open('banneds.json', 'r') as file:
    data = json.load(file)

if check in data:
    print(True)
else:
    print(False)
I'm always getting the "False" result, but the value is there. Can someone shed some light on what I'm doing wrong? I spent the entire night trying to find a solution, but nothing works :(
Thanks for the help!
You are checking data as a dictionary object. if check in data checks whether the data object has a key matching the value of the check variable (data.keys() lists all keys).
One quick (if hacky) way would be if str(check) in str(data["players"]), which converts the players list to a string and searches for the value as a substring.
If you want to make sure that check only matches steam64 values, you can write a simple function that iterates over all "players" and checks their "steam64" values. Another solution would be to build a list of the "steam64" values for faster and easier checking.
You can use any() to check if value of steam64 key is there.
For example:
import json

def check_value(data, val):
    return any(player['steam64'] == val for player in data['players'])

with open('banneds.json', 'r') as f_in:
    data = json.load(f_in)

print(check_value(data, 76561198046619692))
Prints:
True

I cannot parse "country" and "name" with PARSE_JSON function in this very simple JSON object with SnowSQL

I'm ingesting a large, simple JSON dataset from Azure Blob and moving the data into a stage called cities_stage with FILE_FORMAT = json, like so.
(The error is "Error parsing JSON: unknown keyword "Hurzuf", pos 7." — the steps to reproduce are below.)
create or replace stage cities_stage
url='azure://XXXXXXX.blob.core.windows.net/xxxx/landing/cities'
credentials=(azure_sas_token='?st=XXXXX&se=XXX&sp=racwdl&sv=XX&sr=c&sig=XXX')
FILE_FORMAT = (type = json);
I then take this stage location and dump it into a table with a single variant column, like so. The file I'm ingesting is larger than 16 MB, so I create individual rows for each object by using type = json strip_outer_array = true.
create or replace table cities_raw_source (
src variant);
copy into cities_raw_source
from @cities_stage
file_format = (type = json strip_outer_array = true)
on_error = continue;
When I select * from cities_raw_source each row looks like the following.
{
  "coord": {
    "lat": 44.549999,
    "lon": 34.283333
  },
  "country": "UA",
  "id": 707860,
  "name": "Hurzuf"
}
When I add a reference to "country" or "name" that's where the issues come in. Here is my query (I did not use country in this one but it produces the same result).
select parse_json(src:id),
parse_json(src:coord:lat),
parse_json(src:coord:lon),
parse_json(src:name)
from cities_raw_source;
ERROR:
Error parsing JSON: unknown keyword "Hurzuf", pos 7.
ID, Lat, and Lon all come back as expected if I remove "src:name"
Any help is appreciated!
It turns out I had everything correct except for the query itself.
When querying a VARIANT column you do not need PARSE_JSON: the data is already parsed, and PARSE_JSON expects a string of JSON, so feeding it the plain string "Hurzuf" fails with the error above. The correct query would look like this.
select src:id,
src:coord:lat,
src:coord:lon,
src:name
from cities_raw_source;

Parsing, Extracting & Returning JSON as Hash

I am trying to make a localized version of this app: SMS Broadcast Ruby App
I have been able to read the JSON data from a local file and sanitize the numbers, but I have been unable to extract the values and pair them into a scrubbed hash. Here's what I have so far.
def data_from_spreadsheet
  file = open(spreadsheet_url).read
  JSON.parse(file)
end

def contacts_from_spreadsheet
  contacts = {}
  data_from_spreadsheet.each do |entry|
    puts entry['name']['number']
    contacts[sanitize(number)] = name
  end
  contacts
end
Here's the JSON data sample I'm working with.
[
  {
    "name": "Michael",
    "number": 9045555555
  },
  {
    "name": "Natalie",
    "number": 7865555555
  }
]
Here's how I would like the JSON to be expressed after the contacts_from_spreadsheet method.
{
  '19045555555' => 'Michael',
  '17865555555' => 'Natalie'
}
Any help would be much appreciated.
You could create an array of single-pair hashes using map and then call reduce to merge them into a single hash.
data = [
  {
    "name": "Michael",
    "number": 9045555555
  },
  {
    "name": "Natalie",
    "number": 7865555555
  }
]
data.map{|e| {e[:number] => e[:name]}}.reduce Hash.new, :merge
Result: {9045555555=>"Michael", 7865555555=>"Natalie"}
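An equivalent way to build the hash (a sketch, not part of the original answer) is to map each entry to a [key, value] pair and call to_h:
data.map { |e| [e[:number], e[:name]] }.to_h
#=> {9045555555=>"Michael", 7865555555=>"Natalie"}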
You don't seem to extract number or name anywhere, so first you'll need to update your code to pull out those details.
That is, since each entry is a hash (a JSON object before parsing), you can do the following:
def contacts_from_spreadsheet
  contacts = {}
  data_from_spreadsheet.each do |entry|
    contacts[sanitize(entry['number'])] = entry['name']
  end
  contacts
end
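Assuming sanitize turns the raw number into a string like "+19045555555" (an assumption here — that method lives elsewhere in the app), calling this against the sample data would give something like:
contacts_from_spreadsheet
#=> {"+19045555555"=>"Michael", "+17865555555"=>"Natalie"}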
Not strictly keeping this within JSON, but I have solved the problem. Here's what I used.
def data_from_spreadsheet
  file = open(spreadsheet_url).read
  YAML.load(file)
end

def contacts_from_spreadsheet
  contacts = {}
  data_from_spreadsheet.each do |entry|
    name = entry['name']
    number = entry['phone_number'].to_s
    contacts[sanitize(number)] = name
  end
  contacts
end
This returned a clean hash:
{"+19045555555"=>"Michael", "+17865555555"=>"Natalie"}
Thanks everyone who added input!

How to parse JSON value of a text column in cassandra

I have a column of text type that contains a JSON value.
{
  "customer": [
    {
      "details": {
        "customer1": {
          "name": "john",
          "addresses": {
            "address1": {
              "line1": "xyz",
              "line2": "pqr"
            },
            "address2": {
              "line1": "abc",
              "line2": "efg"
            }
          }
        },
        "customer2": {
          "name": "robin",
          "addresses": {
            "address1": null
          }
        }
      }
    }
  ]
}
How can I extract the 'address1' JSON field of the column with a query?
First I am trying to fetch the JSON value; then I will move on to parsing.
SELECT JSON customer from text_column;
With my query, I get the following error.
com.datastax.driver.core.exceptions.SyntaxError: line 1:12 no viable
alternative at input 'customer' (SELECT [JSON] customer...)
Cassandra version 2.1.13
You can't use SELECT JSON in Cassandra v2.1.x / CQL v3.2.x.
For Cassandra v2.1.x / CQL v3.2.x:
The only supported operations after SELECT are:
DISTINCT
COUNT (*)
COUNT (1)
column_name AS new_name
WRITETIME (column_name)
TTL (column_name)
dateOf(), now(), minTimeuuid(), maxTimeuuid(), unixTimestampOf(), typeAsBlob() and blobAsType()
Cassandra v2.2.x / CQL v3.3.x introduces SELECT JSON:
With SELECT statements, the new JSON keyword can be used to return each row as a single JSON-encoded map. The remainder of the SELECT statement behavior is the same.
The result map keys are the same as the column names in a normal result set. For example, a statement like "SELECT JSON a, ttl(b) FROM ..." would result in a map with keys "a" and "ttl(b)". However, there is one notable exception: for symmetry with INSERT JSON behavior, case-sensitive column names with upper-case letters will be surrounded with double quotes. For example, "SELECT JSON myColumn FROM ..." would result in a map key "\"myColumn\"" (note the escaped quotes).
The map values will be JSON-encoded representations of the result set values.
If your Cassandra version is 2.1.x or below, you can use a Python-based approach.
Write a Python script using the Cassandra Python driver.
Here you fetch your row first and then use Python's json.loads, which converts the JSON text column value into a Python dict. Then you can work with the dictionary and extract the nested keys you need. See the code snippet below.
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider
import json

if __name__ == '__main__':
    auth_provider = PlainTextAuthProvider(username='xxxx', password='xxxx')
    cluster = Cluster(['0.0.0.0'],
                      port=9042, auth_provider=auth_provider)
    session = cluster.connect("keyspace_name")
    print("session created successfully")

    rows = session.execute('select * from user limit 10')
    for user_row in rows:
        # parse the JSON text column into a Python dict
        customer_dict = json.loads(user_row.customer)
        print(customer_dict.keys())
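        # A sketch of the extraction the question asks about (assuming the JSON
        # structure shown above and that the text column is named "customer"):
        details = customer_dict['customer'][0]['details']
        address1 = details['customer1']['addresses']['address1']
        print(address1)  # e.g. {'line1': 'xyz', 'line2': 'pqr'}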