How to import Google Maps API JSON into PostgreSQL?

I am trying to transfer data from a JSON file produced by the Google Maps API onto my PostgreSQL database. This is done through cURL and I made sure that the permissions have been correctly set.
The url:
https://maps.googleapis.com/maps/api/distancematrix/json?units=imperial&origins=London&destinations=Paris&key=<my_key>
The query:
copy bookings.import(info) from program 'C:/temp/mycurl/curl "https://maps.googleapis.com/maps/api/distancematrix/json?units=imperial&origins=London&destinations=Paris&key=<my_key>" --insecure'
However, when I try to do this on my table with column 'info' of type 'json', I get the following error:
ERROR: invalid input syntax for type json
SQL state: 22P02
DETAIL: The input string ended unexpectedly.
CONTEXT: JSON data, line 1: {
COPY import, line 1, column info: "{"
I am trying to avoid bringing in PHP or any other external tool for now, but if that turns out to be the only option I would certainly consider it.
What exactly do you guys think I am doing wrong? Is it the syntax, the format or am I missing something?
Thanks!

COPY assumes that each newline indicates a new record. Unfortunately, the Google Maps DistanceMatrix API pretty-prints its response, which means it comes through as 23 rows, none of which is valid JSON on its own.
You can get around this by piping the curl response through something like jq.
copy imports(info) from program 'curl "https://maps.googleapis.com/maps/api/distancematrix/json?units=imperial&origins=London&destinations=Paris&key=<my_key>" --insecure | /usr/local/bin/jq "." -c'
jq has lots of useful features if you want to massage the response a bit more before stashing it in the database.
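For example, a hedged sketch that keeps only the first distance/duration pair before the COPY (the .rows[0].elements[0] path follows the DistanceMatrix response format; adjust the jq filter and binary paths to your setup):
copy imports(info) from program 'curl "https://maps.googleapis.com/maps/api/distancematrix/json?units=imperial&origins=London&destinations=Paris&key=<my_key>" --insecure | /usr/local/bin/jq -c "{distance: .rows[0].elements[0].distance, duration: .rows[0].elements[0].duration}"'
Once the trimmed object is stored you can pull values back out with the JSON operators, e.g. select info->'distance'->>'text' from imports;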

Related

Snowflake throwing error (Error parsing JSON: misplaced { )

I am trying to load JSON files into Snowflake using the COPY command. I have two files of the same structure. However, one file loaded without issue; the other one is throwing the error
"Error parsing JSON: misplaced { "
The simple example select parse_json($1) record from values ('{{'); also errors with "Error parsing JSON: misplaced {, pos 2", so your second file probably does in fact contain invalid JSON.
Try running the statement in validation mode (e.g. copy into mytable validation_mode = 'RETURN_ERRORS';), which will return a table containing useful troubleshooting info such as the line number and character position of the error(s).
The docs cover this here: https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html#validating-staged-files
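For instance, a minimal sketch, assuming the failing file sits in a stage named @mystage and that your target table is mytable:
copy into mytable
  from @mystage/file2.json
  file_format = (type = 'json')
  validation_mode = 'RETURN_ERRORS';
The returned rows include the line and character position of each rejected record, which should point you straight at the stray {.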

Loading multiple JSON records into BigQuery using the console

I'm trying to upload some data into bigquery in JSON format using the BigQuery Console as described here.
If I have a single record in a JSON file, I can upload it successfully. If I put two or more newline-delimited records in a JSON file, then I get this error:
Error while reading data, error message: JSON parsing error in row starting at position 0: Parser terminated before end of string
I tried searching Stack Overflow and Google but didn't have any luck finding any information. The two records that fail when newline-delimited in one file upload successfully as individual records in separate JSON files.
My editor must have been adding some other character on my newlines. I went back to my original json array of records and used:
cat test.json | jq -c '.[]' > testNDJSON.json
This fixed everything.
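To illustrate with made-up records: jq -c '.[]' turns an array such as
[
  {"id": 1, "name": "first"},
  {"id": 2, "name": "second"}
]
into the newline-delimited form BigQuery expects:
{"id":1,"name":"first"}
{"id":2,"name":"second"}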

ARM.Template from bash-script. Unterminated string. Expected delimiter:

I am writing a bash script for uploading a certificate from a Linux server to Azure Key Vault using the "armclient".
I am following this guide on how to use the armclient:
https://blogs.msdn.microsoft.com/appserviceteam/2016/05/24/deploying-azure-web-app-certificate-through-key-vault/
The command I want to run is this:
ARMClient.exe PUT /subscriptions/<Subscription Id>/resourceGroups/<Server Farm Resource Group>/providers/Microsoft.Web/certificates/<User Friendly Resource Name>?api-version=2016-03-01 "{'Location':'<Web App Location>','Properties':{'KeyVaultId':'<Key Vault Resource Id>', 'KeyVaultSecretName':'<Secret Name>', 'serverFarmId':'<Server Farm (App Service Plan) resource Id>'}}"
I have created a string that populates all the required fields:
putparm=$resolved_armapi" \"{'Location':'$resolved_locationid','Properties':{'KeyVaultId':'$resolved_keyvaultid','KeyVaultSecretName':'$certname','serverFarmId':'$resolved_farmid'}}"\"
When I echo the output of the variable putparm, the result looks as expected (names/IDs X-ed out):
/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.Web/certificates/XXXX-XXXXX-XXXXX?api-version=2016-03-01 "{'Location':'Central US','Properties':{'KeyVaultId':'/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.KeyVault/vaults/XXXXXXXX','KeyVaultSecretName':'XXXX-XXXXX-XXXXX','serverFarmId':'/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.Web/serverfarms/ServicePlan59154b1c-XXXX'}}"
When I run armclient put $putparm in the script, I get this error:
"error": {
"code": "InvalidRequestContent",
"message": "The request content was invalid and could not be deserialized: 'Unterminated string. Expected delimiter: \". Path '',
line 1, position 21.'." }
But when I take the output of the $putparm variable and run the command "manually" on the server, it works.
I guess it's something about the way Linux stores the variables and that the API is expecting JSON (or something...).
Happy for any help.
The way you define your variable putparm is wrong.
It is likely interpreted as a literal string and not as an object. Note that a simple string, like "hello", is valid JSON data, but it is probably not what your server is expecting.
You should quote your variable correctly:
putparm="{\"Location\":\"$resolved_locationid\",\"Properties\":{\"KeyVaultId\":\"$resolved_keyvaultid\",\"KeyVaultSecretName\":\"$certname\",\"serverFarmId\":\"$resolved_farmid\"}}"
and use it like this:
armclient put "$resolved_armapi" "$putparm"

How to load OSM (GeoJSON) data to ArangoDB?

How can I load OSM data into ArangoDB?
I downloaded the data set named luxembourg-latest.osm.pbf from OSM, then converted it to JSON with osmtogeojson. After that I tried to load the resulting GeoJSON into ArangoDB with the following command: arangoimp --file out.json --collection lux1 --server.database geodb and got a huge list of errors:
...
2017-03-17T12:44:28Z [7712] WARNING at position 719386: invalid JSON type (expecting object, probably parse error), offending context: ],
2017-03-17T12:44:28Z [7712] WARNING at position 719387: invalid JSON type (expecting object, probably parse error), offending context: [
2017-03-17T12:44:28Z [7712] WARNING at position 719388: invalid JSON type (expecting object, probably parse error), offending context: 5.867441,
...
What I am doing wrong?
Update: it seems the converter should be run with the option osmtogeojson --ndjson, which produces the items not as a single JSON document but line by line.
As @dmitry-bubnenkov already found out, --ndjson is required to produce the right input for arangoimp.
One has to know here that arangoimp expects a JSON subset (since it doesn't parse the JSON on its own), dubbed JSONL.
Thus, each line of the JSON file is expected to become one JSON document in the collection after the import. To maximize performance and simplify the implementation, the JSON is not completely parsed before being sent to the server.
It tries to chop the JSON into chunks of the maximum request size that the server permits, and it leans on the JSONL line endings to isolate possible chunks.
However, the server expects valid JSON. Sending chunks that contain incomplete JSON documents leads to parse errors on the server, which is the error message you saw in your output.
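Putting it together, a sketch of the import pipeline (assuming the .osm extract has already been produced from the .pbf and that osmtogeojson and arangoimp are on the PATH):
osmtogeojson --ndjson luxembourg-latest.osm > out.ndjson
arangoimp --file out.ndjson --collection lux1 --server.database geodb
With --ndjson each feature lands on its own line, so arangoimp can chunk the file safely and every line becomes one document in lux1.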

Trouble following Encrypted Big-Query tutorial document

I wanted to try out the encrypted BigQuery client for Google BigQuery and I've been having some trouble.
I'm following the instructions outlined in this PDF:
https://docs.google.com/file/d/0B-WB8hYCrhZ6cmxfWFpBci1lOVE/edit
I get to the point where I'm running this command:
ebq load --master_key_filename="key_file" testdataset.cars cars.csv cars.schema
And I'm getting an error string which ends with:
raise ValueError("No JSON object could be decoded")
I've tried a few different formats for my .csv and .schema files but none have worked. Here are my latest versions.
cars.schema:
[{"name": "Year", "type": "integer", "mode": "required", "encrypt": "none"}
{"name": "Make", "type": "string", "mode": "required", "encrypt": "pseudonym"}
{"name": "Model", "type": "string", "mode": "required", "encrypt": "probabilistic_searchwords"}
{"name": "Description", "type": "string", "mode": "nullable", "encrypt": "searchwords"}
{"name": "Website", "type": "string", "mode": "nullable", "encrypt": "searchwords","searchwords_separator": "/"}
{"name": "Price", "type": "float", "mode": "required", "encrypt": "probabilistic"}
{"name": "Invoice_Price", "type": "integer", "mode": "required", "encrypt": "homomorphic"}
{"name": "Holdback_Percentage", "type": "float", "mode": "required", "encrypt":"homomorphic"}]
cars.csv:
1997,Ford,E350, "ac\xc4a\x87, abs, moon","www.ford.com",3000.00,2000,1.2
1999,Chevy,"Venture ""Extended Edition""","","www.cheverolet.com",4900.00,3800,2.3
1999,Chevy,"Venture ""Extended Edition, Very Large""","","www.chevrolet.com",5000.00,4300,1.9
1996,Jeep,Grand Cherokee,"MUST SELL! air, moon roof,loaded","www.chrysler.com/jeep/grand­cherokee",4799.00,3950,2.4
I believe the issue may be that you need to move the --master_key_filename argument before the load argument. If that doesn't work, can you send the output of adding --apilog=- as the first argument?
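In other words, something along these lines (a sketch reusing the file names from the question):
ebq --master_key_filename="key_file" load testdataset.cars cars.csv cars.schema
ebq --apilog=- --master_key_filename="key_file" load testdataset.cars cars.csv cars.schema
The first command just reorders the flag; the second also dumps the API traffic so the failing request can be inspected.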
Also, there is an example script file of running ebq here:
https://code.google.com/p/bigquery-e2e/source/browse/#git%2Fsamples%2Fch13