I have some JSON I'm trying to insert into a Postgres database, but I can't manage to properly escape the quotes. Here's my code:
insert into Product_Templates (product) values( '{
"template_id": "OSBSheet",
"name":'Exterior Wall Using 2\"x4\"x96\" Studs, Double Top Plate'
}
I get the error:
invalid command \"x96
How do I work around this?
Look at the JSON syntax: keys and string values in JSON are enclosed in double quotes, and any double quote inside a string value must be escaped with a backslash (\):
select
'{
"template_id": "OSBSheet",
"name": "Exterior Wall Using 2\"x4\"x96\" Studs, Double Top Plate"
}'::jsonb
jsonb
-------------------------------------------------------------------------------------------------
{"name": "Exterior Wall Using 2\"x4\"x96\" Studs, Double Top Plate", "template_id": "OSBSheet"}
(1 row)
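Applied to the INSERT from your question, the same escaping works inside the SQL string literal (a sketch, assuming product is a json or jsonb column; note the single quotes around the value are for SQL, and the double quotes inside are for JSON):
insert into Product_Templates (product) values ('{
  "template_id": "OSBSheet",
  "name": "Exterior Wall Using 2\"x4\"x96\" Studs, Double Top Plate"
}');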
I would like to pretty print a json string I copied from an API call which contains escaped double quotes.
Similar to this:
"{\"name\":\"Hans\", \"Hobbies\":[\"Car\",\"Swimming\"]}"
However, when I execute pbpaste | jq "." the result is unchanged and still on one line.
I guess the problem is the escaped double quotes, but I don't know how to tell jq to remove them if possible, or any other workaround.
What you have is a JSON string which happens to contain a JSON object. You can decode the contents of the string with the fromjson function.
$ pbpaste | jq "."
"{\"name\":\"Hans\", \"Hobbies\":[\"Car\",\"Swimming\"]}"
$ pbpaste | jq "fromjson"
{
  "name": "Hans",
  "Hobbies": [
    "Car",
    "Swimming"
  ]
}
I have a simple file with this test line:
mmm@gmail.com 31460 147557432
My goal is to send it as JSON data.
In my while loop I can echo the variables in the second line of my code example.
However, when I attempt to assign them to jsonstring and echo, the values are not populated.
What do I need to do to pass these values to my json string?
while read emailvar idvar expirevar; do
echo "$emailvar:$expirevar:$idvar"
jsonstring=$idvar $emailvar $expirevar
echo "$jsonstring"
#jsonstring='{"user_id":"$idvar","email":"$emailvar","custom_attributes":{"Program_Expires_at":"$expirevar"}}'
done < "tempdata.txt"
#!/bin/bash
while read -r line; do
    line_array=($line)           # split on whitespace: email, id, expire
    emailvar=${line_array[0]}
    idvar=${line_array[1]}
    expirevar=${line_array[2]}
    jsonstring='{"user_id": "'$idvar'", "email": "'$emailvar'", "custom_attributes": {"Program_Expires_at": "'$expirevar'"}}'
    echo "$jsonstring"
done < 'tempdata.txt'
Output:
{"user_id": "31460", "email": "mmm@gmail.com", "custom_attributes": {"Program_Expires_at": "147557432"}}
You have to escape the whitespace to make it part of the string, rather than creating a simple command with some pre-command assignments.
jsonstring=$idvar\ $emailvar\ $expirevar
more commonly written as
jsonstring="$idvar $emailvar $expirevar"
In your commented assignment, you used single quotes, which prevent parameter expansion. You need to use double quotes, which requires manually escaping the interior double quotes. More robust, though, is to use a tool like jq to generate the JSON for you: it will take care of escaping any characters in your variables to generate valid JSON.
jsonstring=$(jq -n \
--arg id "$idvar" \
--arg email "$emailvar" \
--arg expire "$expirevar" \
'{user_id: $id,
email: $email,
custom_attributes: {Program_Expires_at: $expire}}'
)
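With the sample line from the question, this should print the following (jq pretty-prints by default; pass -c to jq for a single line):
{
  "user_id": "31460",
  "email": "mmm@gmail.com",
  "custom_attributes": {
    "Program_Expires_at": "147557432"
  }
}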
This is essentially a problem with how bash handles variables and parameter expansion, and the solution is basically adding a bunch of double quotes. Double quotes enable parameter expansion while keeping the whole value a single string; for JSON output in this bash script, that means nesting escaped double quotes inside the outer pair.
To fix this, we can:
put double quotes (") surrounding the value for jsonstring
escape double quotes surrounding strings used within the value for jsonstring with \
If you'd like $idvar and $expirevar to be interpreted as numbers instead of strings, you don't need escaped double-quotes around these values.
For example:
#!/bin/bash
while read emailvar idvar expirevar; do
jsonstring="{\"user_id\":$idvar,\"email\":\"$emailvar\",\"custom_attributes\":{\"Program_Expires_at\":$expirevar}}"
echo "$jsonstring"
done < "tempdata.txt"
Example output:
user@pc: bash ./script.sh
{"user_id":31460,"email":"mmm@gmail.com","custom_attributes":{"Program_Expires_at":147557432}}
user@pc: bash ./script.sh | jq .
{
  "user_id": 31460,
  "email": "mmm@gmail.com",
  "custom_attributes": {
    "Program_Expires_at": 147557432
  }
}
I'm using JsonParser.parseString, which passes if I give it a value like "1234" or "abcd" (ignore case). Can anyone explain this?
Your first example is valid JSON for a number. Using JavaScript's parser:
console.log(JSON.parse('1234')); // 1234
It seems unlikely to me that a JSON parser was happy with your second example, because it's not valid:
console.log(JSON.parse('abcd')); // throws SyntaxError
...unless it actually had quotes around it, in which case it's valid JSON for a string:
console.log(JSON.parse('"abcd"'));
But even objects don't have to have whitespace, for instance:
console.log(JSON.parse('{"example":"object"}')); // { example: 'object' }
I'm using JsonParser.parseString, which passes if I give it a value like "1234" or "abcd" (ignore case). Can anyone explain this?
The current specification for application/json is RFC 8259
Highlights from the production rules:
JSON-text = ws value ws
value = false / null / true / object / array / number / string
number = [ minus ] int [ frac ] [ exp ]
int = zero / ( digit1-9 *DIGIT )
string = quotation-mark *char quotation-mark
quotation-mark = %x22 ; "
Therefore:
A sequence of digits (ex: 1234) is a valid JSON-text
A sequence of digits enclosed in double quotes (ex: "1234") is a valid JSON-text
A sequence of "unescaped" characters enclosed in double quotes (ex: "abcd") is a valid JSON-text
A sequence of "unescaped" characters without enclosing double quotes (ex: abcd) is not a valid JSON-text.
cat <<TEST | jq .
"abcd"
TEST
"abcd"
cat <<TEST | jq .
abcd
TEST
parse error: Invalid numeric literal at line 2, column 0
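For completeness, a bare sequence of digits is accepted as well, since it matches the number rule:
cat <<TEST | jq .
1234
TEST
1234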
I have the following JSON:
{
"name": "foo \\uNo bar"
}
I'm trying to load this into Snowflake using a STAGE on S3. This is in a CSV file like:
{"name": "foo \\uNo bar"}
However, when I try to load it, Snowflake breaks with an Error parsing JSON message. If I try to load it directly on Snowflake console, as SELECT PARSE_JSON('{"name": "foo \\uNo bar"}'), I get:
Error parsing JSON: hex digit is expected in \u???? escape sequence, pos 17
The problem is that Snowflake is parsing the string and expecting a Unicode escape sequence where it sees \uNo (which isn't valid). How can I disable this?
The default FILE FORMAT for parsing CSVs in Snowflake interprets the double backslash in '{"name": "foo \\uNo bar"}' as an escape sequence for the character \, which means the character sequence \uNo is what gets passed to PARSE_JSON, and that fails because \uNo is not a valid escape sequence in a JSON string. You can prevent this by overriding the FILE FORMAT escape settings.
Given this CSV file:
JSON
'{"name": "foo \\uNo bar"}'
And the following CREATE TABLE and COPY INTO statements:
CREATE OR REPLACE TABLE JSON_TEST (JSON TEXT);
COPY INTO JSON_TEST
FROM @my_db.public.my_s3_stage/json.csv
FILE_FORMAT = (TYPE = CSV
SKIP_HEADER = 1
FIELD_OPTIONALLY_ENCLOSED_BY = '\''
ESCAPE = NONE
ESCAPE_UNENCLOSED_FIELD = NONE);
I am able to parse the result as JSON:
SELECT PARSE_JSON(JSON) FROM JSON_TEST;
Which returns
+-----------------------------+
| JSON |
+-----------------------------|
| { "name": "foo \\uNo bar" } |
+-----------------------------+
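For the ad-hoc console test, note that the single-quoted SQL literal itself also collapses \\ into \ before PARSE_JSON ever sees the string. A dollar-quoted string constant, in which backslashes are taken literally, avoids that (a sketch of the same test):
SELECT PARSE_JSON($${"name": "foo \\uNo bar"}$$);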
I can't really figure this out: the SQL query that gets output is not valid.
key="test"
payload=$(gzip -ckqd ./temp.json.gz | jq -c . | sed 's/"/\\"/g')
printf 'INSERT INTO my_table VALUES ("%s", "%s")' "$key" "$payload" | sqlite3 ./temp.db
Obviously the $payload variable is a JSON string (it can contain single and double quotes, etc.).
In SQL, strings are delimited not with double quotes but with single quotes.
Inside a string, the only special character is the single quote itself, and it must be escaped not with a backslash, but with another quote:
... sed "s/'/''/g"
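Putting that together with the pipeline from the question (a sketch, keeping the table, key, and file names from the question):
key="test"
payload=$(gzip -ckqd ./temp.json.gz | jq -c . | sed "s/'/''/g")
printf "INSERT INTO my_table VALUES ('%s', '%s');" "$key" "$payload" | sqlite3 ./temp.db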