Removing duplicate elements in a nested list

I have a list like this:
a = [[['0', '1'], 1.0],
     [['0', '1'], 1.0],
     [['3', '11', '8'], 1.0],
     [['7', '5'], 1.0],
     [['7', '5'], 1.0],
     [['3', '11', '8'], 1.0],
     [['3', '11', '8'], 1.0],
     [['29', '30', '27', '28'], 0.95703125],
     [['29', '30', '27', '28'], 0.96875],
     [['29', '30', '27', '28'], 0.98046875],
     [['29', '30', '27', '28'], 0.98046875]]
I want to remove the duplicates by comparing only the inner lists of numbers, irrespective of the float values, so my output should look like:
a = [[['0', '1'], 1.0],
     [['3', '11', '8'], 1.0],
     [['7', '5'], 1.0],
     [['29', '30', '27', '28'], 0.98046875]]
Is this possible?
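One common approach is to key a dict on the (hashable) tuple of each inner list, so later duplicates overwrite earlier values while first-seen order is kept. A minimal sketch, assuming Python 3.7+ insertion-ordered dicts:

```python
a = [[['0', '1'], 1.0],
     [['0', '1'], 1.0],
     [['3', '11', '8'], 1.0],
     [['7', '5'], 1.0],
     [['7', '5'], 1.0],
     [['3', '11', '8'], 1.0],
     [['3', '11', '8'], 1.0],
     [['29', '30', '27', '28'], 0.95703125],
     [['29', '30', '27', '28'], 0.96875],
     [['29', '30', '27', '28'], 0.98046875],
     [['29', '30', '27', '28'], 0.98046875]]

# Lists aren't hashable, so use tuples as dict keys; for each key the last
# value seen wins, which matches keeping the 0.98046875 entry above.
deduped = {tuple(nums): val for nums, val in a}
result = [[list(nums), val] for nums, val in deduped.items()]
print(result)
```

This keeps one entry per distinct inner list, in the order each first appears.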


BigQuery json function - cannot extract all values if json string not well formatted

I have a JSON string stored in a field in BigQuery which has this structure:
{'language': 'Eng', 'date_started': '2021-02-08 16: 56: 55 GMT', 'link_id': '111', 'url_variables': {'touchpoint': {'key': 'touchpoint', 'value': 'phone', 'type': 'url'
}, 'interaction_id': {'key': 'interaction_id', 'value': '111', 'type': 'url'
}
}, 'ip_address': None, 'referer': '', 'user_agent': None, 'response_time': 111, 'data_quality': [], 'longitude': '', 'latitude': '', 'country': '', 'city': '', 'region': '', 'postal': '', 'dma': '', 'survey_data': {'25': {'id': 25, 'type': 'TEXTBOX', 'question': 'feedback_source', 'section_id': 1, 'shown': False
}, '229': {'id': 229, 'type': 'TEXTBOX', 'question': 'recruitment_method', 'section_id': 1, 'shown': False
}, '227': {'id': 227, 'type': 'TEXTBOX', 'question': 'meeting_point', 'section_id': 1, 'answer': 'phone', 'shown': True
}, '221': {'id': 221, 'type': 'TEXTBOX', 'question': 'interaction_id', 'section_id': 1, 'answer': '222', 'shown': True
}, '217': {'id': 217, 'type': 'TEXTBOX', 'question': 'session_id', 'section_id': 1, 'answer': '333', 'shown': True
}, '231': {'id': 231, 'type': 'ESSAY', 'question': 'BlaBla question 4', 'section_id': 3, 'answer': 'Bla Bla answer', 'shown': True
}, '255': {'id': 255, 'type': 'TEXTBOX', 'question': 'tz_offset', 'section_id': 3, 'answer': '-120', 'shown': True
}, '77': {'id': 77, 'type': 'parent', 'question': 'Bla Bla 1', 'section_id': 35, 'options': {'10395': {'id': 10395, 'option': 'Neutraal', 'answer': '3'
}
}, 'shown': True
}, '250': {'id': 250, 'type': 'RADIO', 'question': 'Bla Bla?', 'section_id': 66, 'original_answer': '1', 'answer': '1', 'answer_id': 10860, 'shown': True
}, '251': {'id': 251, 'type': 'RADIO', 'question': 'Bla Bla', 'section_id': 66, 'original_answer': '0', 'answer': '0', 'answer_id': 10863, 'shown': True
}
}
}
I'm able to extract some of the values with the query below, but I cannot extract response_time or any of the values inside the survey_data structure.
They always come out as null.
DECLARE resp STRING
DEFAULT "{'id': '111', 'contact_id': '', 'status': 'Complete', 'is_test_data': '0', 'date_submitted': '2021-07-08 17: 02: 16 GMT', 'session_id': '111', 'language': 'Eng', 'date_started': '2021-02-08 16: 56: 55 GMT', 'link_id': '111', 'url_variables': {'touchpoint': {'key': 'touchpoint', 'value': 'phone', 'type': 'url' }, 'interaction_id': {'key': 'interaction_id', 'value': '111', 'type': 'url' } }, 'ip_address': None, 'referer': '', 'user_agent': None, 'response_time': 111, 'data_quality': [], 'longitude': '', 'latitude': '', 'country': '', 'city': '', 'region': '', 'postal': '', 'dma': '', 'survey_data': {'25': {'id': 25, 'type': 'TEXTBOX', 'question': 'feedback_source', 'section_id': 1, 'shown': False }, '229': {'id': 229, 'type': 'TEXTBOX', 'question': 'recruitment_method', 'section_id': 1, 'shown': False }, '227': {'id': 227, 'type': 'TEXTBOX', 'question': 'meeting_point', 'section_id': 1, 'answer': 'phone', 'shown': True }, '221': {'id': 221, 'type': 'TEXTBOX', 'question': 'interaction_id', 'section_id': 1, 'answer': '222', 'shown': True }, '217': {'id': 217, 'type': 'TEXTBOX', 'question': 'session_id', 'section_id': 1, 'answer': '333', 'shown': True }, '231': {'id': 231, 'type': 'ESSAY', 'question': 'BlaBla question 4', 'section_id': 3, 'answer': 'Bla Bla answer', 'shown': True }, '255': {'id': 255, 'type': 'TEXTBOX', 'question': 'tz_offset', 'section_id': 3, 'answer': '-120', 'shown': True }, '77': {'id': 77, 'type': 'parent', 'question': 'Bla Bla 1', 'section_id': 35, 'options': {'10395': {'id': 10395, 'option': 'Neutraal', 'answer': '3' } }, 'shown': True }, '250': {'id': 250, 'type': 'RADIO', 'question': 'Bla Bla?', 'section_id': 66, 'original_answer': '1', 'answer': '1', 'answer_id': 10860, 'shown': True }, '251': {'id': 251, 'type': 'RADIO', 'question': 'Bla Bla', 'section_id': 66, 'original_answer': '0', 'answer': '0', 'answer_id': 10863, 'shown': True } } }";
SELECT
  JSON_VALUE(resp, '$.url_variables.interaction_id.value') AS url_interaction_id_value,
  JSON_VALUE(resp, '$.url_variables.interaction_id.type') AS url_interaction_id_type,
  JSON_VALUE(resp, '$.language') AS language,
  JSON_QUERY(resp, '$.response_time') AS response_time, -- NOT WORKING
  JSON_QUERY(resp, '$.survey_data') AS survey_data -- NOT WORKING
I tried with jq in bash from the CLI, and it complains that some of the None values are not quoted.
Question:
Does it mean that BigQuery attempts to extract values from the JSON string as far as it can, "until" it encounters something that is not well formatted (e.g. the unquoted None values), and then it just cannot parse further and returns nulls?
NB: In another app, I have been able to parse the json file in Python and extract values from inside the json string.
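The NB above makes sense: the string is really a Python dict repr (single quotes, unquoted None, Python-style True/False) rather than strict JSON, so Python can parse it directly without any string surgery. A sketch on a trimmed sample in the same style as the resp field:

```python
import ast
import json

# Trimmed sample in the same style as the resp field: single quotes,
# unquoted None, and Python-style False.
resp = "{'language': 'Eng', 'ip_address': None, 'response_time': 111, 'survey_data': {'25': {'shown': False}}}"

data = ast.literal_eval(resp)  # safely parses Python-literal syntax
print(data['response_time'])   # 111
print(json.dumps(data))        # re-serialize as strict JSON: None -> null, False -> false
```

This is handy when the cleanup can happen upstream of BigQuery; inside BigQuery itself, string REPLACEs like the answer below are the workaround.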
Looks like you have a few formatting issues with your resp field, which you can fix with a few REPLACEs as in the example below:
SELECT
  JSON_VALUE(resp, '$.url_variables.interaction_id.value') AS url_interaction_id_value,
  JSON_VALUE(resp, '$.url_variables.interaction_id.type') AS url_interaction_id_type,
  JSON_VALUE(resp, '$.language') AS language,
  JSON_QUERY(resp, '$.response_time') AS response_time, -- WORKING NOW
  JSON_QUERY(resp, '$.survey_data') AS survey_data -- WORKING NOW
FROM (
  SELECT REPLACE(REPLACE(REPLACE(resp, "None,", "'None',"), "True", "true"), "False", "false") AS resp
  FROM `project.dataset.table`
)
If applied to the sample data in your question, this now gets you all you need.

Part of Foursquare API response missing when using VSCODE

I'm trying to create a Jupyter Notebook project using VSCode. In this project I'm using the Foursquare API to query venues around Seattle. However, I've been running into a little problem when the notebook outputs the JSON response. I only get the second part of the JSON file when I set the LIMIT parameter to more than 20 venues.
Let me just be clear.
This is only happening with VSCode. When I run the same lines of code in a Jupyter Notebook in the browser and set the LIMIT parameter to more than 20, I get the full JSON response.
Below is the code sample used in VSCode:
Imported libraries:
import pandas as pd
import requests
import json
from geopy.geocoders import Nominatim
from pandas.io.json import json_normalize
pd.set_option('display.max_rows', None)
pd.set_option('display.max_columns', None)
geopy.geocoders to retrieve the Seattle coordinates:
address = 'Seattle, WA'
geolocator = Nominatim(user_agent="foursquare")
location = geolocator.geocode(address)
latitude = location.latitude
longitude = location.longitude
print(latitude, longitude)
Variables with credentials and parameters for the API request (note how the limit is set to 21):
CLIENT_ID = 'client_id' # your Foursquare ID
CLIENT_SECRET = 'client_secret' # your Foursquare Secret
VERSION = '20180604'
LIMIT = 21
RADIUS = 500
URL for the API request.
url = 'https://api.foursquare.com/v2/venues/search?client_id={}&client_secret={}&ll={},{}&v={}&radius={}&limit={}'.format(CLIENT_ID, CLIENT_SECRET, latitude, longitude, VERSION, RADIUS, LIMIT)
The API request.
results = requests.get(url).json()
results
The undesirable output (notice how the metadata and the first part of the file are missing):
['600 4th Ave (5th & Cherry)',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d129941735',
'name': 'City Hall',
'pluralName': 'City Halls',
'shortName': 'City Hall',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/cityhall_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617749016',
'hasPerk': False},
{'id': '4b49098df964a520286326e3',
'name': 'Seattle City Hall',
'location': {'address': '600 4th Ave',
'crossStreet': 'btwn Cherry & James',
'lat': 47.60391791602839,
'lng': -122.32999464587043,
'labeledLatLngs': [{'label': 'display',
'lat': 47.60391791602839,
'lng': -122.32999464587043}],
'distance': 10,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['600 4th Ave (btwn Cherry & James)',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d129941735',
'name': 'City Hall',
'pluralName': 'City Halls',
'shortName': 'City Hall',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/cityhall_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617749016',
'hasPerk': False},
{'id': '4e44510fe4cd394059e89099',
'name': 'Karr Tuttle Campbell',
'location': {'address': '701 5th Ave',
'lat': 47.60440702245942,
'lng': -122.33136024826479,
'labeledLatLngs': [{'label': 'display',
'lat': 47.60440702245942,
'lng': -122.33136024826479}],
'distance': 116,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['701 5th Ave',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d124941735',
'name': 'Office',
'pluralName': 'Offices',
'shortName': 'Office',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/default_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617749016',
'hasPerk': False},
{'id': '4c3f616ed691c9b6d8a6890a',
'name': 'City Hall Plaza',
'location': {'address': '600 4th Ave',
'crossStreet': '4th & Cherry',
'lat': 47.60378595075962,
'lng': -122.33051066366723,
'labeledLatLngs': [{'label': 'display',
'lat': 47.60378595075962,
'lng': -122.33051066366723}],
'distance': 34,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['600 4th Ave (4th & Cherry)',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d129941735',
'name': 'City Hall',
'pluralName': 'City Halls',
'shortName': 'City Hall',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/cityhall_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617749016',
'hasPerk': False},
{'id': '4ddbc954091aae6b185e2968',
'name': 'Seattle Municipal Tower 44th Floor',
'location': {'address': '700 5th Ave',
'crossStreet': 'Cherry Street',
'lat': 47.60509704657094,
'lng': -122.3301267150704,
'labeledLatLngs': [{'label': 'display',
'lat': 47.60509704657094,
'lng': -122.3301267150704}],
'distance': 140,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['700 5th Ave (Cherry Street)',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d124941735',
'name': 'Office',
'pluralName': 'Offices',
'shortName': 'Office',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/default_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617749016',
'hasPerk': False},
{'id': '598b268efc9e9467c3d48589',
'name': 'Bertha Knight Landes Conference Room',
'location': {'lat': 47.603764,
'lng': -122.32945,
'labeledLatLngs': [{'label': 'display',
'lat': 47.603764,
'lng': -122.32945}],
'distance': 46,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['Seattle, WA 98104', 'United States']},
'categories': [{'id': '4bf58dd8d48988d127941735',
'name': 'Conference Room',
'pluralName': 'Conference Rooms',
'shortName': 'Conference room',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/office_conferenceroom_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617749016',
'hasPerk': False},
{'id': '4a9e9437f964a5207c3a20e3',
'name': 'Einstein Bros Bagels',
'location': {'address': '600 4th Ave',
'lat': 47.60389534060459,
'lng': -122.33065690674596,
'labeledLatLngs': [{'label': 'display',
'lat': 47.60389534060459,
'lng': -122.33065690674596}],
'distance': 45,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['600 4th Ave',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d179941735',
'name': 'Bagel Shop',
'pluralName': 'Bagel Shops',
'shortName': 'Bagels',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/food/bagels_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617749016',
'hasPerk': False},
{'id': '427ab380f964a52096211fe3',
'name': 'Columbia Center',
'location': {'address': '701 5th Ave',
'crossStreet': 'at Columbia St',
'lat': 47.60452412230289,
'lng': -122.33075151763909,
'labeledLatLngs': [{'label': 'display',
'lat': 47.60452412230289,
'lng': -122.33075151763909},
{'label': 'entrance', 'lat': 47.604432, 'lng': -122.330763}],
'distance': 92,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['701 5th Ave (at Columbia St)',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d130941735',
'name': 'Building',
'pluralName': 'Buildings',
'shortName': 'Building',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/default_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617749016',
'hasPerk': False}
........ CONTINUED RESULTS INTENTIONALLY DELETED
Running the code with the LIMIT parameter set to 20 gives me the desired output.
CLIENT_ID = 'client_id'
CLIENT_SECRET = 'client_secret'
VERSION = '20180604'
LIMIT = 20
RADIUS = 500
Again, the URL for the request (no changes here):
url = 'https://api.foursquare.com/v2/venues/search?client_id={}&client_secret={}&ll={},{}&v={}&radius={}&limit={}'.format(CLIENT_ID, CLIENT_SECRET, latitude, longitude, VERSION, RADIUS, LIMIT)
url
One more time, the API request (no changes):
results = requests.get(url).json()
results
Finally the desired output with the full JSON file:
{'meta': {'code': 200, 'requestId': '606ce68a749e75020fe96e3a'},
'response': {'venues': [{'id': '4c3b9d165810a593aff7ba3c',
'name': 'City Council Chambers',
'location': {'address': '600 4th Ave',
'crossStreet': '5th & Cherry',
'lat': 47.603861440975066,
'lng': -122.33006802191612,
'labeledLatLngs': [{'label': 'display',
'lat': 47.603861440975066,
'lng': -122.33006802191612},
{'label': 'entrance', 'lat': 47.603626, 'lng': -122.329618}],
'distance': 3,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['600 4th Ave (5th & Cherry)',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d129941735',
'name': 'City Hall',
'pluralName': 'City Halls',
'shortName': 'City Hall',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/cityhall_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617749643',
'hasPerk': False},
{'id': '4b49098df964a520286326e3',
'name': 'Seattle City Hall',
'location': {'address': '600 4th Ave',
'crossStreet': 'btwn Cherry & James',
'lat': 47.60391791602839,
'lng': -122.32999464587043,
'labeledLatLngs': [{'label': 'display',
'lat': 47.60391791602839,
'lng': -122.32999464587043}],
'distance': 10,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['600 4th Ave (btwn Cherry & James)',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d129941735',
'name': 'City Hall',
'pluralName': 'City Halls',
'shortName': 'City Hall',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/cityhall_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617749643',
'hasPerk': False},
............ CONTINUED RESULTS INTENTIONALLY DELETED
Progress update.
Apparently I'm fetching the full response, but VSCode is not displaying the whole output.
Below is sample code from the same request with the LIMIT parameter set to 30. The evidence that the full file is fetched but not displayed is that I can slice the result and see only the first 20 venues.
Somehow I can even increase the slice to 22, yet past this threshold the problem reappears.
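One way to confirm the full response was fetched, regardless of what the notebook renders, is to check the venue count directly. A sketch; `results` stands for the parsed response from the earlier `requests.get(url).json()` call, mocked here with 30 minimal venues for illustration:

```python
# Mock of the parsed API response: 30 venue dicts under response.venues.
results = {'response': {'venues': [{'id': str(i)} for i in range(30)]}}

# len() reports what was actually fetched, even if the display truncates it.
print(len(results['response']['venues']))  # 30

# Slicing lets you inspect any window of the data without relying on the renderer.
print([v['id'] for v in results['response']['venues'][:3]])
```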
Input:
results['response']['venues'][0:22]
Output:
[{'id': '4c3b9d165810a593aff7ba3c',
'name': 'City Council Chambers',
'location': {'address': '600 4th Ave',
'crossStreet': '5th & Cherry',
'lat': 47.603861440975066,
'lng': -122.33006802191612,
'labeledLatLngs': [{'label': 'display',
'lat': 47.603861440975066,
'lng': -122.33006802191612},
{'label': 'entrance', 'lat': 47.603626, 'lng': -122.329618}],
'distance': 3,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['600 4th Ave (5th & Cherry)',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d129941735',
'name': 'City Hall',
'pluralName': 'City Halls',
'shortName': 'City Hall',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/cityhall_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617761189',
'hasPerk': False},
{'id': '4b49098df964a520286326e3',
'name': 'Seattle City Hall',
'location': {'address': '600 4th Ave',
'crossStreet': 'btwn Cherry & James',
'lat': 47.60391791602839,
'lng': -122.32999464587043,
'labeledLatLngs': [{'label': 'display',
'lat': 47.60391791602839,
'lng': -122.32999464587043}],
'distance': 10,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['600 4th Ave (btwn Cherry & James)',
'Seattle, WA 98104',
'United States']},
'categories': [{'id': '4bf58dd8d48988d129941735',
'name': 'City Hall',
'pluralName': 'City Halls',
'shortName': 'City Hall',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/cityhall_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617761189',
'hasPerk': False},
{'id': '4e44510fe4cd394059e89099',
'name': 'Karr Tuttle Campbell',
'location': {'address': '701 5th Ave',
'lat': 47.60440702245942,
'lng': -122.33136024826479,
'labeledLatLngs': [{'label': 'display',
'lat': 47.60440702245942,
'lng': -122.33136024826479}],
'distance': 116,
'postalCode': '98104',
'cc': 'US',
'city': 'Seattle',
'state': 'WA',
'country': 'United States',
'formattedAddress': ['701 5th Ave', 'Seattle, WA 98104', 'United States']},
'categories': [{'id': '4bf58dd8d48988d124941735',
'name': 'Office',
'pluralName': 'Offices',
'shortName': 'Office',
'icon': {'prefix': 'https://ss3.4sqi.net/img/categories_v2/building/default_',
'suffix': '.png'},
'primary': True}],
'referralId': 'v-1617761189',
'hasPerk': False}
It looks like I found the solution to my problem.
Thank you everyone for all your diligent attention in this serious matter.
Here are the instructions:
Go to VSCode Settings > in the search box type 'text output' > set the text output limit to 0.
I hope this helps the next rookie in line.

In Python 3, how can I slice JSON data where objects all start with the same name?

I have a JSON response with device info; if devices are found, they are listed as device0, device1, device2, etc. In the simple code below, how can I discover all devices found in the JSON and then print the info below for each device? I currently look up each device statically, and I want this discovery to be dynamic, printing the results for each one found.
r1 = requests.get(url = url_api, params = PARAMS)
devicedata = r1.json()
if 'device0' in devicedata:
    print('')
    device0Name = devicedata['device0']['device_name']
    print(device0Name)
    print('Temp: {}'.format(devicedata['device0']['obs'][0]['ambient_temp']))
    print('Probe Temp: {}'.format(devicedata['device0']['obs'][0]['probe_temp']))
    print('Humidity: {}%'.format(devicedata['device0']['obs'][0]['humidity']))
    print('')
# JSON info looks like this...
{'device0': {'success': True, 'device_type': 'TX60', 'obs': [{'device_id': '1111', 'device_type': 'TX60', 'u_timestamp': '1580361017', 'ambient_temp': '45.7', 'probe_temp': '45.5', 'humidity': '82', 'linkquality': '100', 'lowbattery': '0', 'success': '9', 's_interval': '99', 'timestamp': '1/29/2020 11:10 PM', 'utctime': 1580361017}], 'alerts': {'miss': {'id': '520831', 'alert_type': 'miss', 's_id': '1111', 'max': '-100', 'min': '30', 'wet': '0', 'alert_id': '1', 'phone': 'yes', 'email': '', 'state': None}, 'batt': {'id': '520832', 'alert_type': 'batt', 's_id': '1111', 'max': '-100', 'min': '-100', 'wet': '0', 'alert_id': '1', 'phone': 'yes', 'email': '', 'state': None}}, 'ispws': 0, 'unit': {'temp': '°F', 'temp2': '°F', 'rh': '%'}, 'device_id': '1111', 'expired': '0', 'interval': '30', 'reg_date': '2020-01-17 22:06:48', 'create_date': 1579298808, 'device_name': 'Back Yard', 'assocGateway': '1', 'problem': False}, 'device1': {'success': True, 'device_type': 'TX60', 'obs': [{'device_id': '2222', 'device_type': 'TX60', 'u_timestamp': '1580360303', 'ambient_temp': '63.6', 'probe_temp': 'N/C', 'humidity': '64', 'linkquality': '100', 'lowbattery': '0', 'success': '9', 's_interval': '99', 'timestamp': '1/29/2020 10:58 PM', 'utctime': 1580360303}], 'alerts': {'miss': {'id': '520220', 'alert_type': 'miss', 's_id': '2222', 'max': '-100', 'min': '30', 'wet': '0', 'alert_id': '1', 'phone': 'yes', 'email': '', 'state': None}, 'batt': {'id': '520221', 'alert_type': 'batt', 's_id': '2222', 'max': '-100', 'min': '-100', 'wet': '0', 'alert_id': '1', 'phone': 'yes', 'email': '', 'state': None}}, 'ispws': 0, 'unit': {'temp': '°F', 'temp2': '°F', 'rh': '%'}, 'device_id': '3333', 'expired': '1', 'interval': '30', 'reg_date': '2016-03-19 01:45:04', 'create_date': 1500868369, 'device_name': 'Crawl Space', 'assocGateway': '1', 'problem': False}, 'device2': {'success': True, 'device_type': 'TX60', 'obs': [{'device_id': '3333', 'device_type': 'TX60', 'u_timestamp': '1580360195', 
'ambient_temp': '70.2', 'probe_temp': 'N/C', 'humidity': '48', 'linkquality': '100', 'lowbattery': '0', 'success': '9', 's_interval': '99', 'timestamp': '1/29/2020 10:56 PM', 'utctime': 1580360195}], 'alerts': None, 'ispws': 0, 'unit': {'temp': '°F', 'temp2': '°F', 'rh': '%'}, 'device_id': '3333', 'expired': '0', 'interval': '15', 'reg_date': '2020-01-30 04:34:00', 'create_date': 1580358840, 'device_name': 'Basement', 'assocGateway': '2', 'problem': False}, 'tz': 'America/Chicago'}
The output for each device looks like this:
Back Yard
Temp: 50.9
Probe Temp: 51.2
Humidity: 92%
Crawl Space
Temp: 65.4
Probe Temp: N/C
Humidity: 55%
Basement
Temp: 70
Probe Temp: N/C
Humidity: 48%
Found it.
for devKey in devicedata.keys():
    if "device" in devKey:
        dev = devicedata[devKey]
        name = dev["device_name"]
        obs = dev["obs"][0]
        temp = obs["ambient_temp"]
        probeTemp = obs["probe_temp"]
        humidity = obs["humidity"]
        print(name)
        print('Temp: {}'.format(temp))
        print('Probe Temp: {}'.format(probeTemp))
        print('Humidity: {}%'.format(humidity))
        print('')
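Since the keys follow a deviceN pattern, the loop can also iterate them in explicit numeric order, which is safer if the server ever returns keys out of sequence. A sketch; `devicedata` here is a minimal mock of the real response:

```python
# Minimal mock of the response: device keys plus an unrelated 'tz' key.
devicedata = {
    'device1': {'device_name': 'Crawl Space'},
    'device0': {'device_name': 'Back Yard'},
    'tz': 'America/Chicago',
}

# Keep only deviceN keys and sort by the numeric suffix.
device_keys = sorted(
    (k for k in devicedata if k.startswith('device')),
    key=lambda k: int(k[len('device'):]),
)
for k in device_keys:
    print(devicedata[k]['device_name'])  # Back Yard, then Crawl Space
```

`startswith('device')` also avoids accidentally matching a key that merely contains the substring "device" somewhere in the middle.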

List of languages in JSON format (language as a key and code as a value) [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 3 years ago.
I would like to get the list of languages in the following JSON format:
{
  "english": "en",
  "german": "de",
  "greek": "el"
}
Thanks.
{'Abkhaz': 'ab',
'Afar': 'aa',
'Afrikaans': 'af',
'Akan': 'ak',
'Albanian': 'sq',
'Amharic': 'am',
'Arabic': 'ar',
'Aragonese': 'an',
'Armenian': 'hy',
'Assamese': 'as',
'Avaric': 'av',
'Avestan': 'ae',
'Aymara': 'ay',
'Azerbaijani': 'az',
'Bambara': 'bm',
'Bashkir': 'ba',
'Basque': 'eu',
'Belarusian': 'be',
'Bengali': 'bn',
'Bihari': 'bh',
'Bislama': 'bi',
'Bosnian': 'bs',
'Breton': 'br',
'Bulgarian': 'bg',
'Burmese': 'my',
'Catalan; Valencian': 'ca',
'Chamorro': 'ch',
'Chechen': 'ce',
'Chichewa; Chewa; Nyanja': 'ny',
'Chinese': 'zh',
'Chuvash': 'cv',
'Cornish': 'kw',
'Corsican': 'co',
'Cree': 'cr',
'Croatian': 'hr',
'Czech': 'cs',
'Danish': 'da',
'Divehi; Dhivehi; Maldivian;': 'dv',
'Dutch': 'nl',
'English': 'en',
'Esperanto': 'eo',
'Estonian': 'et',
'Ewe': 'ee',
'Faroese': 'fo',
'Fijian': 'fj',
'Finnish': 'fi',
'French': 'fr',
'Fula; Fulah; Pulaar; Pular': 'ff',
'Galician': 'gl',
'Georgian': 'ka',
'German': 'de',
'Greek, Modern': 'el',
'Guaraní': 'gn',
'Gujarati': 'gu',
'Haitian; Haitian Creole': 'ht',
'Hausa': 'ha',
'Hebrew (modern)': 'he',
'Herero': 'hz',
'Hindi': 'hi',
'Hiri Motu': 'ho',
'Hungarian': 'hu',
'Interlingua': 'ia',
'Indonesian': 'id',
'Interlingue': 'ie',
'Irish': 'ga',
'Igbo': 'ig',
'Inupiaq': 'ik',
'Ido': 'io',
'Icelandic': 'is',
'Italian': 'it',
'Inuktitut': 'iu',
'Japanese': 'ja',
'Javanese': 'jv',
'Kalaallisut, Greenlandic': 'kl',
'Kannada': 'kn',
'Kanuri': 'kr',
'Kashmiri': 'ks',
'Kazakh': 'kk',
'Khmer': 'km',
'Kikuyu, Gikuyu': 'ki',
'Kinyarwanda': 'rw',
'Kirghiz, Kyrgyz': 'ky',
'Komi': 'kv',
'Kongo': 'kg',
'Korean': 'ko',
'Kurdish': 'ku',
'Kwanyama, Kuanyama': 'kj',
'Latin': 'la',
'Luxembourgish, Letzeburgesch': 'lb',
'Luganda': 'lg',
'Limburgish, Limburgan, Limburger': 'li',
'Lingala': 'ln',
'Lao': 'lo',
'Lithuanian': 'lt',
'Luba-Katanga': 'lu',
'Latvian': 'lv',
'Manx': 'gv',
'Macedonian': 'mk',
'Malagasy': 'mg',
'Malay': 'ms',
'Malayalam': 'ml',
'Maltese': 'mt',
'Māori': 'mi',
'Marathi (Marāṭhī)': 'mr',
'Marshallese': 'mh',
'Mongolian': 'mn',
'Nauru': 'na',
'Navajo, Navaho': 'nv',
'Norwegian Bokmål': 'nb',
'North Ndebele': 'nd',
'Nepali': 'ne',
'Ndonga': 'ng',
'Norwegian Nynorsk': 'nn',
'Norwegian': 'no',
'Nuosu': 'ii',
'South Ndebele': 'nr',
'Occitan': 'oc',
'Ojibwe, Ojibwa': 'oj',
'Old Church Slavonic, Church Slavic, Church Slavonic, Old Bulgarian, Old Slavonic': 'cu',
'Oromo': 'om',
'Oriya': 'or',
'Ossetian, Ossetic': 'os',
'Panjabi, Punjabi': 'pa',
'Pāli': 'pi',
'Persian': 'fa',
'Polish': 'pl',
'Pashto, Pushto': 'ps',
'Portuguese': 'pt',
'Quechua': 'qu',
'Romansh': 'rm',
'Kirundi': 'rn',
'Romanian, Moldavian, Moldovan': 'ro',
'Russian': 'ru',
'Sanskrit (Saṁskṛta)': 'sa',
'Sardinian': 'sc',
'Sindhi': 'sd',
'Northern Sami': 'se',
'Samoan': 'sm',
'Sango': 'sg',
'Serbian': 'sr',
'Scottish Gaelic; Gaelic': 'gd',
'Shona': 'sn',
'Sinhala, Sinhalese': 'si',
'Slovak': 'sk',
'Slovene': 'sl',
'Somali': 'so',
'Southern Sotho': 'st',
'Spanish; Castilian': 'es',
'Sundanese': 'su',
'Swahili': 'sw',
'Swati': 'ss',
'Swedish': 'sv',
'Tamil': 'ta',
'Telugu': 'te',
'Tajik': 'tg',
'Thai': 'th',
'Tigrinya': 'ti',
'Tibetan Standard, Tibetan, Central': 'bo',
'Turkmen': 'tk',
'Tagalog': 'tl',
'Tswana': 'tn',
'Tonga (Tonga Islands)': 'to',
'Turkish': 'tr',
'Tsonga': 'ts',
'Tatar': 'tt',
'Twi': 'tw',
'Tahitian': 'ty',
'Uighur, Uyghur': 'ug',
'Ukrainian': 'uk',
'Urdu': 'ur',
'Uzbek': 'uz',
'Venda': 've',
'Vietnamese': 'vi',
'Volapük': 'vo',
'Walloon': 'wa',
'Welsh': 'cy',
'Wolof': 'wo',
'Western Frisian': 'fy',
'Xhosa': 'xh',
'Yiddish': 'yi',
'Yoruba': 'yo',
'Zhuang, Chuang': 'za'}
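If you need the keys lowercased and serialized as actual JSON (as in the requested format), a short transform does it. A sketch; `languages` stands for the full dict above, trimmed here to three entries (note that compound names like 'Greek, Modern' only get lowercased, not shortened):

```python
import json

languages = {'English': 'en', 'German': 'de', 'Greek, Modern': 'el'}  # trimmed sample

# Lowercase each language name and emit strict JSON.
as_json = json.dumps(
    {name.lower(): code for name, code in languages.items()},
    ensure_ascii=False, indent=2,
)
print(as_json)
```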

PDO insert into MySQL Database

I'm trying to insert a parsed table into a MySQL database, but the table within the database stays empty. None of the values are inserted.
I already checked "PDO with INSERT INTO through prepared statements" and tried to apply the answers to my case, but I'm still having trouble. What am I missing?
This is my code so far:
// Simple Dom
require_once('simple_html_dom.php');
require_once 'config.php';
$host=$config['DB_HOST'];
$dbname=$config['DB_DATABASE'];
// Get Connection to HTML and DATABASE
$html = file_get_html('url');
$pdo = new PDO("mysql:host=$host;dbname=$dbname",$config['DB_USERNAME'],$config['DB_PASSWORD']);
// Find TABLE INSIDE HTML
$table = $html->find('table', 1);
// find CELLS of each ROW, starting at 2nd row
foreach ($table->find('tr') as $rowNumber => $row) {
  if ($rowNumber < 1) continue;
$team = "Timberwolves";
$season = "9";
// Get all columns of the table
$pos = $row->find('td', 0);
$player = $row->find('td', 1)->plaintext;
$age = $row->find('td', 2);
$twoga = $row->find('td', 3);
$twopct = $row->find('td', 4);
$fta = $row->find('td', 5);
$ftpct = $row->find('td', 6);
$threega = $row->find('td', 7);
$threepct = $row->find('td', 8);
$orb = $row->find('td', 9);
$drb = $row->find('td', 10);
$ast = $row->find('td', 11);
$stl = $row->find('td', 12);
$tov = $row->find('td', 13);
$blk = $row->find('td', 14);
$oo = $row->find('td', 15);
$do = $row->find('td', 16);
$po = $row->find('td', 17);
$to = $row->find('td', 18);
$od = $row->find('td', 19);
$dd = $row->find('td', 20);
$pd = $row->find('td', 21);
$td = $row->find('td', 22);
// Echo some of the found values to test the code
echo "'$season', '$team', '$pos', '$player', '$age', '$twoga', '$twopct','$fta','$ftpct','$threega', '$threepct', '$orb','$drb','$ast','$stl','$tov','$blk','$oo','$do','$po','$to','$od','$dd','$pd','$td')<br>";
// INSERT for each row
$statement = $pdo->prepare("INSERT INTO `teams` (season,team,pos,player,age,2ga,2g%,fta,ft%,3ga,3g%,orb,drb,ast,stl,tov,blk,oo,do,po,to,od,dd,pd,td)
VALUES (:season,:team,:pos,:player,:age,:twoga,:twopct,:fta,:ftpct,:threega,:threepct,:orb,:drb,:ast,:stl,:tov,:blk,:oo,:do,:po,:to,:od,:dd,:pd,:td)");
$statement->execute(array(":season"=>$season,":team"=>$team,":pos"=>$pos,":player"=>$player,":age"=>$age,":twoga"=>$twoga,":twopct"=>$twopct,":fta"=>$fta,":ftpct"=>$ftpct,
":threega"=>$threega,":threepct"=>$threepct,":orb"=>$orb,":drb"=>$drb,":ast"=>$ast,":stl"=>$stl,":tov"=>$tov,":blk"=>$blk,":oo"=>$oo,":do"=>$do,":po"=>$po,":to"=>$to,":od"=>$od,":dd"=>$dd,":pd"=>$pd,":td"=>$td));
// END OF LOOP
}
The table parsing works fine as far as I can see; I get this as output:
'9', 'Timberwolves', 'PG', 'Mike Conley ', '29', '47', '50','51','86','61', '41', '8','28','61','56','58','8','8','8','6','7','6','6','2','7')
'9', 'Timberwolves', 'PG', 'Steve Blake ', '35', '15', '46','6','80','47', '34', '5','23','60','31','52','4','7','6','1','4','4','4','1','5')
'9', 'Timberwolves', 'SG', 'Isaiah Thomas ', '27', '59', '53','80','91','83', '38', '10','19','56','37','49','4','7','9','8','9','5','4','2','6')
'9', 'Timberwolves', 'SF', 'Rondae Hollis-Jefferson ', '22', '48', '46','45','75','12', '22', '34','62','27','64','60','25','3','5','7','3','8','6','7','6')
'9', 'Timberwolves', 'SF', 'Tony Snell ', '25', '14', '55','7','81','50', '41', '6','30','13','32','86','6','9','2','3','1','5','4','2','6')
'9', 'Timberwolves', 'PF', 'Joel Embiid ', '22', '77', '50','99','78','41', '37', '49','72','27','48','6','98','3','7','9','6','9','3','9','4')
'9', 'Timberwolves', 'PF', 'Draymond Green ', '26', '28', '49','27','71','35', '31', '25','63','69','87','53','43','5','7','4','4','9','7','7','8')
'9', 'Timberwolves', ' C', 'Marc Gasol ', '32', '64', '48','41','84','35', '39', '15','49','43','37','59','40','7','6','5','4','6','4','5','5')
'9', 'Timberwolves', ' C', 'Marreese Speights ', '29', '35', '52','31','88','65', '37', '40','63','15','23','70','29','7','4','7','1','5','1','6','2')
'9', 'Timberwolves', 'SF', 'Dante Cunningham ', '29', '19', '58','5','59','36', '39', '21','41','7','33','90','17','8','2','4','1','5','4','4','5')
'9', 'Timberwolves', ' C', 'Boban Marjanovic ', '28', '69', '55','53','81','0', '0', '84','74','8','24','82','35','4','5','9','2','9','4','9','4')
'9', 'Timberwolves', 'SF', 'Rasual Butler ', '36', '19', '62','10','69','32', '31', '3','32','15','36','91','46','7','3','4','1','8','6','5','6')
This is what the DB table looks like: