Error trying to query NFIB SBET's REST API using Python's requests.post() - json

Essentially trying to get data as per the procedure outlined here: http://www.nfib-sbet.org/developers/
My code is as follows:
url = "http://open.api.nfib-sbet.org/rest/sbetdb/_proc/getIndicators"
data = {"app_name": "sbet",
"params": [
{"name": "minYear", "param_type": "IN", "value":1974},
{"name": "minMonth", "param_type": "IN", "value": 1},
{"name": "maxYear", "param_type": "IN", "value":datetime.datetime.today().year},
{"name": "maxMonth", "param_type": "IN", "value": datetime.datetime.today().month},
{"name": "indicator", "param_type": "IN","value": "OPT_INDEX,expand_employ,plan_capital,plan_invent,expected_bus_cond,expected_real_sales,invent,job_openings,expected_cred_cond,good_time_expand,past_earn"}
]
}
resp = requests.post(url,data=data)
print(resp.json())
The error I am getting:
{'error': [{'context': None, 'message': 'Failed to call database stored procedure.\nCDbCommand failed to execute the SQL statement: SQLSTATE[42000]: Syntax error or access violation: 1318 Incorrect number of arguments for PROCEDURE cube_survey.getIndicators; expected 5, got 1', 'code': 500}]}
I have tried playing with the format of my data dictionary and passing it to the headers, json, and params arguments of requests.post(), but I always get a 400 error...
Any help appreciated!

The error looks to be from how you've formatted your last "value" entry. A few things I noticed:
Change the single quotes (') to double quotes (")
Remove the slash (/) characters from your string
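As a side note on the "expected 5, got 1" part of the error (separate from the formatting points above): requests form-encodes a dict passed via data=, which flattens the nested params list, while json= sends the whole structure as a JSON body. A rough, untested sketch of the difference, with the indicator list shortened for brevity:

import datetime
import requests

url = "http://open.api.nfib-sbet.org/rest/sbetdb/_proc/getIndicators"
data = {
    "app_name": "sbet",
    "params": [
        {"name": "minYear", "param_type": "IN", "value": 1974},
        {"name": "minMonth", "param_type": "IN", "value": 1},
        {"name": "maxYear", "param_type": "IN", "value": datetime.datetime.today().year},
        {"name": "maxMonth", "param_type": "IN", "value": datetime.datetime.today().month},
        {"name": "indicator", "param_type": "IN", "value": "OPT_INDEX,expand_employ"},
    ],
}

# data= form-encodes the dict; the nested "params" list is not serialized as
# JSON, so the stored procedure may see fewer arguments than the 5 it expects.
resp_form = requests.post(url, data=data)

# json= serializes the whole structure to a JSON body and sets
# Content-Type: application/json, keeping all five parameter entries intact.
resp_json = requests.post(url, json=data)
print(resp_json.status_code, resp_json.text)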

Related

[AutodeskForge] I would like to know why the conversion fails when ifc files contain single quotes at the end of certain attributes

I would like to know why the conversion fails if certain attributes contain trailing single quotes.
An IFC file that contains single quotes (') at the end of certain attributes fails to translate.
The API I used
POST /modelderivative/v2/designdata/job
Target attribute value
EL-G35211AN/6-SP/L''
Data for the lines containing the target attribute value in the IFC file:
#2340 = IFCFLOWTERMINAL('32hCanqHP0QAQmHARnRsKT', #2341, '\X2\30107167660E56685177FF0830E630FC30B630FC90E86750FF093011FF7CFF7DFF83FF9159294E95FF086D45578BFF09\X0\ EL-G35211AN/6-SP/L'', '\X2\6A5F566830FB56685177\X0\. \X2\30E630FC30B630FC90E86750\X0\', $, #2353, #2339, $);
#2346 = IFCLIGHTFIXTURETYPE('2A0TvRw8rCk8EKh4gObqVr', #2341, '\X2\30107167660E56685177FF0830E630FC30B630FC90E86750FF093011FF7CFF7DFF83FF9159294E95FF086D45578BFF09\X0\ EL-G35211AN/6-SP/L'', '\X2\6A5F566830FB56685177\X0\. \X2\30E630FC30B630FC90E86750\X0\', $, $, $, $, $, .NOTDEFINED.);#2390 = IFCPROPERTYSINGLEVALUE('\X2\540D79F0\X0\', '', IFCTEXT('\X2\FF7CFF7DFF83FF9159294E95FF086D45578BFF09\X0\ EL-G35211AN/6-SP/L''), $);
#2378 = IFCPROPERTYLISTVALUE('table_data', '', (IFCTEXT('SPVER,10,\X2\4ED569D866F830D030FC30B830E730F330B330FC30C9\X0\,'), IFCTEXT('CGRYCODE,403000000000000000,\X2\6A5F56685206985E30B330FC30C9\X0\,,'), IFCTEXT('NAME2,\X2\FF7CFF7DFF83FF9159294E95FF086D45578BFF09\X0\,\X2\578B5F0F540D79F0\X0\,'), IFCTEXT('NAME1,EL-G35211AN/6-SP/L'',\X2\30E130FC30AB30FC578B756A\X0\,')), $);
#2390 = IFCPROPERTYSINGLEVALUE('\X2\540D79F0\X0\', '', IFCTEXT('\X2\FF7CFF7DFF83FF9159294E95FF086D45578BFF09\X0\ EL-G35211AN/6-SP/L''), $);
#2398 = IFCPROPERTYSINGLEVALUE('\X2\578B756A\X0\', '', IFCTEXT('EL-G35211AN/6-SP/L''), $);
manifest
{
  "urn": "XXXXXXXXXX",
  "derivatives": [
    {
      "hasThumbnail": "false",
      "name": "XXXXXXXX.ifc",
      "progress": "complete",
      "messages": [
        {
          "type": "error",
          "code": "Navisworks-Internal",
          "message": "Error code: 2 - bec9343c395241f266d3b7af86dee70e041d7687d3b4f0f0c0952b04ff3c39eb.ifc could not be opened because the contents are corrupt or it is currently unavailable.\n\nIt is recommended to re-open your model to avoid data loss.\n"
        },
        {
          "type": "error",
          "message": "Unrecoverable exit code from extractor: -1073741829",
          "code": "TranslationWorker-InternalFailure"
        }
      ],
      "outputType": "svf2",
      "status": "failed"
    }
  ],
  "hasThumbnail": "false",
  "progress": "complete",
  "type": "manifest",
  "region": "US",
  "version": "1.0",
  "status": "failed"
}
I would like to know what is causing the inability to translate.
I'm not closely familiar with the IFC format but the following line from your question seems off:
#2340 = IFCFLOWTERMINAL('32...KT', #2341, '\X2\30...09\X0\ EL-G35211AN/6-SP/L'', '\X2\6A...50\X0\', $, #2353, #2339, $);
Since the format apparently uses single quotes to delimit string values, the two single quotes seem incorrect. Shouldn't the first quote be escaped?
Also, are you able to open this IFC file in any desktop IFC viewer? If you can, then I'd suggest that you send the IFC file to us via forge (dot) help (at) autodesk (dot) com (confidentially - we would not share the file with anyone outside of Autodesk), and we would ask our engineering to debug the conversion.

JSON query is not valid

I have the following query:
[{ "type": "url", "value": ["https://graph.microsoft.com/v1.0/users?$count=true&$search="displayName:room"&$filter=endsWith(mail,'microsoft.com')&$orderBy=displayName&$select=id,displayName,mail"]}]
On running the code below with the above query as input:
JToken.Parse(query);
I get an Invalid JSON query exception.
I updated the query to use single quotes for 'displayName:room':
[{ "type": "url", "value": ["https://graph.microsoft.com/v1.0/users?$count=true&$search='displayName:room'&$filter=endsWith(mail,'microsoft.com')&$orderBy=displayName&$select=id,displayName,mail"]}]
With this change, when I make the Graph call with that URL, I get Message: Syntax error: character ''' is not valid at position 0 in ''displayName:room''. What am I missing?
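A rough sketch of what appears to be the intent, using Python's json module instead of JToken.Parse purely for illustration: escaping the inner double quotes with \" keeps the surrounding JSON string valid, while the Graph $search clause keeps the double quotes it seems to require.

import json

# Escape the inner double quotes around displayName:room with \" so the
# surrounding JSON string stays valid and the quotes survive parsing.
query = (
    '[{ "type": "url", "value": '
    '["https://graph.microsoft.com/v1.0/users?$count=true'
    '&$search=\\"displayName:room\\"'
    "&$filter=endsWith(mail,'microsoft.com')"
    '&$orderBy=displayName&$select=id,displayName,mail"]}]'
)

parsed = json.loads(query)
print(parsed[0]["value"][0])  # URL with "displayName:room" still double-quoted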

Load JSON in BigQuery / JSON parsing error in row starting at position ... : Parser terminated before end of string

I'm trying to load a 350MB JSON file in BigQuery using Airflow GoogleCloudStorageToBigQueryOperator.
The job always stops at some position N (N never changes), with this error:
Error while reading data, error message: JSON parsing error in row starting at position 170468557: Parser terminated before end of string
I've searched for this line in the file, which goes like this:
{"active": true,
"currency": "USD",
"dangerous": "all",
"filing_reference": null,
"is_freight": false,
"max": NaN,
"min": 15.0,
"rate": 15.0,
"rate_unit": "teu",
"rates": [],
"rates_fixed": null,
"shipowner_id": "12",
"thresholds": [],
"transit_time": null,
"updated_at": 1566912641.0,
"validity_end": 1556582400.0,
"validity_start": 1554076800.0,
"via": "UNKNOWN"}
The BigQuery schema is generated from Postgres type values. This error is not clear at all, I'd appreciate some help!
We got the same error a few days ago; it comes from the NaN in your "max" field.
This type of JSON can be parsed with Python, for example, but BigQuery throws an error on it.
=> Be sure to replace the NaNs with null, and it should work.
Hope it helps!
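A minimal sketch of that suggestion in Python, assuming the file is newline-delimited JSON and using made-up file names: json.loads accepts NaN, and its parse_constant hook lets you map it to None so json.dumps writes null.

import json

# Hypothetical local file names; the real file lives in GCS.
with open("rates.json") as src, open("rates_clean.json", "w") as dst:
    for line in src:
        if not line.strip():
            continue
        # parse_constant is called for NaN, Infinity and -Infinity;
        # mapping them to None makes json.dumps emit null instead.
        record = json.loads(line, parse_constant=lambda _: None)
        dst.write(json.dumps(record) + "\n")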

JSON formatting Error when loading into Google Big Query

I am trying to load the following data in Big Query from PUBSUB using the built in dataflow template:
{
"current_speed": "19.09",
"_east": "-87.654561",
"_last_updt": "2018-07-17 15:50:54.0",
"_region_id": "1",
"_north": "42.026444",
"_south": "41.997946",
"region": "Rogers Park - West Ridge",
"_west": "-87.709645",
"_description": "North of Devon. Kedzie to Lake Shore"
}
But I keep getting this error:
"Error while reading data, error message: Failed to parse JSON:
Unexpected end of string; Unexpected end of string; Expected key"
I actually need to load the larger dataset which looks like this:
[{
"current_speed": "19.09",
"_east": "-87.654561",
"_last_updt": "2018-07-17 15:50:54.0",
"_region_id": "1",
"_north": "42.026444",
"_south": "41.997946",
"region": "Rogers Park - West Ridge",
"_west": "-87.709645",
"_description": "North of Devon. Kedzie to Lake Shore"
}, {
"current_speed": "25.23",
"_east": "-87.747456",
"_last_updt": "2018-07-17 15:50:54.0",
"_region_id": "2",
"_north": "42.0190998",
"_south": "41.960669",
"region": "Far North West",
"_west": "-87.84621",
"_description": "North of Montrose. East River to Cicero"
}
]
But with that I get this error:
Error while reading data, error message: Failed to parse JSON: No
object found when new array is started.; BeginArray returned false;
Parser terminated before end of string
What am I doing wrong here?
To convert JSON to new line delimited JSON (which is the format that BigQuery ingests) you can use jq:
$ cat a.json
[{
"key01": "value01",
"key02": "value02",
"keyN": "valueN"
},
{
"key01": "value01",
"key02": "value02",
"keyN": "valueN"
},
{
"key01": "value01",
"key02": "value02",
"keyN": "valueN"
}
]
$ cat a.json | jq -c '.[]'
{"key01":"value01","key02":"value02","keyN":"valueN"}
{"key01":"value01","key02":"value02","keyN":"valueN"}
{"key01":"value01","key02":"value02","keyN":"valueN"}
(see https://stackoverflow.com/a/51301075/132438)
Yes, BigQuery only accepts newline-delimited JSON, which means one complete JSON object per line. Before you merge the object onto one line, BigQuery reads "{", which is the start of an object, and expects to read a key, but the line ends there, so you see the error message "expected key".
For multiple JSON objects, just put one on each line. Don't enclose them in an array. BigQuery expects each line to start with an object, "{". If you put "[" as the first character, you will see the second error message, which means BigQuery started reading an array without having found an enclosing object first.
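If jq isn't handy, roughly the same conversion can be done in Python (a sketch with made-up file names):

import json

# Read the JSON array and write one compact object per line
# (newline-delimited JSON), which is the format BigQuery ingests.
with open("a.json") as src, open("a_ndjson.json", "w") as dst:
    for obj in json.load(src):
        dst.write(json.dumps(obj) + "\n")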

TypeError: list indices must be integers or slices, not str - JSON, Python error

{
    "gameId": 32,
    "participantIdentities": [
        {
            "player": {
                "id": "123",
                "name": "xxx",
            },
            "participantId": 1
        },
        {
            "player": {
                "id": "123",
                "name": "yyyy",
            },
            "participantId": 2
        }
    ]
    "gameDuration": 143,
}
I am trying to print the names in this JSON file in Python 3:
list_id = []
for info in matchinfo['participantIdentities']['player']['name']:
    list_id.append(info)
But I get the following error:
TypeError: list indices must be integers or slices, not str
How do I get the content of 'name'?
There are several issues:
You provided invalid JSON: there are trailing commas after "name": "xxx", "name": "yyyy" and "gameDuration": 143, and a comma is missing between the closing ] of participantIdentities and "gameDuration".
matchinfo['participantIdentities'] is a list, so you should either provide an integer index (matchinfo['participantIdentities'][0]['player']['name'], for example) or iterate over the matchinfo['participantIdentities'] entries.
Indexing that list with the string 'player' is what raises the TypeError: list indices must be integers, exactly as the message says.
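Putting it together, a minimal sketch of the iteration (assuming the JSON has been fixed so it parses into matchinfo):

# Iterate over the list of participant identities and collect each name.
list_id = []
for identity in matchinfo['participantIdentities']:
    list_id.append(identity['player']['name'])

print(list_id)  # ['xxx', 'yyyy']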