S3 to Redshift: Unknown boolean format - json

Executing a COPY command from S3 to Redshift, loading JSON files. I have some fields typed as BOOLEAN in a new table I'm inserting into, and I always get the following error: "Unknown boolean format".
My JSON is well formed; I've run a million tests on that already. I've tried passing in the boolean fields as:
false // "false" // "False" // 0 // "0" // null
But always get the same error, when executing:
select * from stl_load_errors;
err_code err_reason
1210 Unknown boolean format
I've seen some comments about using IGNOREHEADER in my statement, but that isn't an option because the files I'm dealing with are in single-row JSON format. Ignoring the header would basically mean not reading the file at all. I have other tables loaded like this that work fine, but those tables don't have any BOOLEAN columns.

The COPY from JSON Format documentation page provides an example that includes a Boolean:
{
  "id": 0,
  "guid": "84512477-fa49-456b-b407-581d0d851c3c",
  "isActive": true,
  "tags": [
    "nisi",
    "culpa",
    "ad",
    "amet",
    "voluptate",
    "reprehenderit",
    "veniam"
  ],
  "friends": [
    {
      "id": 0,
      "name": "Carmella Gonzales"
    },
    {
      "id": 1,
      "name": "Renaldo"
    }
  ]
}
The Boolean documentation page shows values similar to what you have tried.
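Redshift's COPY with JSON format expects bare JSON literals `true`/`false` for BOOLEAN columns, so a quoted `"false"` somewhere in the files is a plausible cause of error 1210. One way to rule this out before uploading is to parse each line and check that the field is a genuine JSON boolean rather than a string or number — a minimal sketch, assuming newline-delimited JSON and a hypothetical field name `isActive`:

```python
import json

def check_booleans(lines, field):
    """Report (line number, value) pairs where `field` is not a real JSON boolean."""
    bad = []
    for n, line in enumerate(lines, 1):
        value = json.loads(line).get(field)
        if not isinstance(value, bool):
            bad.append((n, value))
    return bad

lines = ['{"id": 1, "isActive": true}', '{"id": 2, "isActive": "false"}']
print(check_booleans(lines, "isActive"))  # the quoted "false" on line 2 is flagged
```

Running this over the staged files before the COPY narrows down whether the problem is in the data or in the table/COPY definition.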

Related

Selecting in States.StringToJson function

Is there a way to process the result of the States.StringToJson intrinsic function directly?
Currently, in a step function, I try to handle the error from another synchronous step function call:
"OtherStepFunction": {
"Type": "Task",
"Resource": "arn:aws:states:::states:startExecution.sync:2",
"Parameters": {
"StateMachineArn": "otherstepFunctionCall",
"Input.$": "$"
},
"End": true,
"Catch": [
{
"ErrorEquals": [
"States.ALL"
],
"Comment": "OtherStepFunctionFailed",
"Next": "StatusStepFunctionFailed",
"ResultPath": "$.error"
}
]
},
All errors go to a Pass state named StatusStepFunctionFailed, with the error output in the $.error path.
The $.error object is composed of the error type and the cause as an escaped JSON string:
"error": {
"Error": "States.TaskFailed",
"Cause": "{\"ExecutionArn\":\"otherfunctionarm:executionid\",\"Input\":\"foooooo\"}"
}
Is there any way to extract only the ExecutionArn from this input? In my Pass step, I convert the Cause path to JSON, but I didn't find a way to select the ExecutionArn part directly. The following:
"reason.$": "States.JsonMerge($.error.Cause).ExecutionArn"
returns The value for the field 'reason.$' must be a valid JSONPath or a valid intrinsic function call (at /States/HandleResource/Iterator/States/StatusStepFunctionFailedHandleJSON/Parameters).
My current workaround is to use two Pass states: first converting the output, then formatting it.
I had a similar issue.
What I did was create a task to put the Cause into a new path parameter using StringToJSON. I put that task as the next from the error and then called the subsequent task from that one.
Using your variable names and values:
In the Catch, change the Next from StatusStepFunctionFailed to parseErrorCause
Then parseErrorCause is like this:
"parseErrorCause": {
"Type": "Pass",
"Parameters": {
"Result.$": "States.StringToJson($.error.Cause)"
},
"ResultPath": "$.parsedJSON",
"Next": "StatusStepFunctionFailed"
},
And StatusStepFunctionFailed accesses
"Variable": "$.parsedJSON.Result.Input",
to get foooooo
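The two-state approach works because `States.StringToJson` turns the escaped `Cause` string into a real object that subsequent JSONPaths can traverse; the error above appears because a trailing path like `.ExecutionArn` cannot be chained onto an intrinsic function call in one expression. The transformation itself can be mirrored locally with an ordinary JSON parse — a Python stand-in for the intrinsic, using the Cause payload from the question:

```python
import json

error = {
    "Error": "States.TaskFailed",
    "Cause": "{\"ExecutionArn\":\"otherfunctionarm:executionid\",\"Input\":\"foooooo\"}",
}

# Equivalent of States.StringToJson($.error.Cause): parse the escaped string
# into a real object...
parsed = json.loads(error["Cause"])

# ...after which plain paths (like $.parsedJSON.Result.ExecutionArn) can
# select individual fields.
print(parsed["ExecutionArn"])  # otherfunctionarm:executionid
print(parsed["Input"])         # foooooo
```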

Symfony, Doctrine truncates string in JSON field before storing in database

I'm facing a weird bug when storing some JSON data in my database with Doctrine in a Symfony 4 application.
Some strings in the JSON data are truncated after 27 characters and [...] is added at the end, but not always!
Here's an example of the data I got in my DB :
{
  "tests": {
    "test-1": {
      "label": "Test 1",
      "someData": null,
      "uid": "044e0907-82cc-4f53-a325-e62830e59523"
    },
    "test-2": {
      "label": "Test 2",
      "someData": null,
      "uid": "a204b0a7-0831-4fde-976c-f3a1b0e75655"
    },
    "test-3": {
      "label": "Test 3",
      "someData": null,
      "uid": "d8f457b1-67d6-4ff7-9378-6c0ce5d9de0a"
    },
    "test-4": {
      "label": "Test 4",
      "someData": null,
      "uid": "5ddbd2eb-142c-4fbb-a4bc-d6 [...]" // Here is the bug !!!
    },
    "test-5": {
      "label": "Test 5",
      "someData": null,
      "uid": "e2ee7a1a-e0ae-4f1d-8806-967d94ddb790"
    }
  }
}
I spent time debugging to find where it could come from: before I flush my entity, the data in the property is OK, but after the flush, sometimes some of the uids (those longer than 27 characters) are truncated.
$myEntity->setField($field);
$myEntity->getField(); // Here the data is OK
$this->doctrine->getManagerForClass(MyEntity::class)->flush();
$myEntity->getField(); // Here the data is truncated sometimes
Any idea where this bug could come from?
Doctrine ? Database (I use MySQL) ?
Thanks!
After some digging I finally found the bug, and it was my code's fault ^^
To explain: before the flush, I transformed my array data using a foreach loop with the value taken by reference.
So the data array passed to the flush function kept the last item's value as a reference. When the DbalLogger kicks in to log the query, its normalizeParams() function shortens over-long strings.
And since some param values were passed by reference, they were shortened as well before being stored in the DB!
Conclusion: beware of passing by reference in a foreach loop ;)
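The same class of bug can be reproduced outside PHP: any reference to the last loop element that outlives the loop lets later code mutate the stored data in place. A rough Python analogue (sample data; the loop variable survives the loop, much as PHP's `foreach ($arr as &$v)` leaves `$v` bound to the last item):

```python
data = [
    {"uid": "044e0907-82cc-4f53-a325-e62830e59523"},
    {"uid": "e2ee7a1a-e0ae-4f1d-8806-967d94ddb790"},
]

# After the loop ends, `item` still references the last dict in `data`.
for item in data:
    pass

# A later "logging" step that shortens long strings through that leftover
# reference silently corrupts the stored data too.
item["uid"] = item["uid"][:27] + " [...]"

print(data[1]["uid"])  # the last uid is now truncated inside `data` itself
```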

MySQL JSON Document Store method for inserting data into a node 3 levels deep

I want to take the data from here: https://raw.githubusercontent.com/usnistgov/oscal-content/master/examples/ssp/json/ssp-example.json
which I've pulled into a MySQL database called "ssp_models", into a JSON column called 'json_data'. I need to add a new 'name' and 'type' entry into the 'parties' node, with a new uuid in the same format as the example.
So in my MySQL table, "ssp_models", I have this entry, noting that I should be able to write the data by somehow referencing "66c2a1c8-5830-48bd-8fdd-55a1c3a52888" as the record to modify.
All the examples I've seen online seem to force me to read the entire JSON into a variable, make the addition, and then cram it back into the json_data column, which seems costly, especially with large JSON data sets.
Isn't there a simple way I can say
"INSERT INTO ssp_models JSON_INSERT <somehow burrow down to 'system-security-plan'.metadata.parties (name, type) VALUES ('Raytheon', 'organization') WHERE uuid = '66c2a1c8-5830-48bd-8fdd-55a1c3a52888'
I was looking at this other stackoverflow example for inserting into JSON:
How to create and insert a JSON object using MySQL queries?
However, that's basically useful when you are starting from scratch, vs. needing to add JSON data to data that already exists.
You may want to read https://dev.mysql.com/doc/refman/8.0/en/json-function-reference.html and explore each of the functions, and try them out one by one, if you're going to continue working with JSON data in MySQL.
I was able to do what you describe this way:
update ssp_models set json_data = json_array_append(
    json_data,
    '$."system-security-plan".metadata.parties',
    json_object('name', 'Bingo', 'type', 'farmer')
)
where uuid = '66c2a1c8-5830-48bd-8fdd-55a1c3a52888';
Then I checked the data:
mysql> select uuid, json_pretty(json_data) from ssp_models\G
*************************** 1. row ***************************
uuid: 66c2a1c8-5830-48bd-8fdd-55a1c3a52888
json_pretty(json_data): {
  "system-security-plan": {
    "uuid": "66c2a1c8-5830-48bd-8fdd-55a1c3a52888",
    "metadata": {
      "roles": [
        {
          "id": "legal-officer",
          "title": "Legal Officer"
        }
      ],
      "title": "Enterprise Logging and Auditing System Security Plan",
      "parties": [
        {
          "name": "Enterprise Asset Owners",
          "type": "organization",
          "uuid": "3b2a5599-cc37-403f-ae36-5708fa804b27"
        },
        {
          "name": "Enterprise Asset Administrators",
          "type": "organization",
          "uuid": "833ac398-5c9a-4e6b-acba-2a9c11399da0"
        },
        {
          "name": "Bingo",
          "type": "farmer"
        }
      ]
    }
  }
}
I started with data like yours, but for this test, I truncated everything after the parties array.
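The key point of json_array_append is that the document is modified inside the server, so there is no client-side read-modify-write round trip. What the function does can still be pictured in ordinary terms: resolve the path, then push the new object onto that array. A minimal Python sketch of that semantics, using a trimmed-down copy of the document (an illustration of the path logic, not MySQL itself):

```python
import json

doc = json.loads("""
{"system-security-plan": {"metadata": {"parties": [
  {"name": "Enterprise Asset Owners", "type": "organization"}
]}}}
""")

# What json_array_append(json_data, '$."system-security-plan".metadata.parties', ...)
# does server-side: walk the path segments, then append the new object.
doc["system-security-plan"]["metadata"]["parties"].append(
    {"name": "Bingo", "type": "farmer"}
)

print(json.dumps(doc["system-security-plan"]["metadata"]["parties"], indent=2))
```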

Karate - Error when comparing JSON objects

I am trying to match some parameters against a JSON response.
My actual response is like:
{
  "timestamp": 1595994767386,
  "country": "MH",
  "accessible_device_types": [
    {
      "name": "ESS Client",
      "raw_name": "ABC",
      "permission": 7,
      "permission_bits": {
        "INSTALL_LIMITED_RELEASE_SOFTWARE": true,
        "INSTALL_LATEST_SOFTWARE_ONLY": true,
        "INSTALL_SOFTWARE": true
      }
    },
I used the below statement for comparing:
match response.accessible_device_types contains [{"raw_name": "ABC"}]
Reason for the error from the report: expected: {raw_name=ABC}, reason: actual value does not contain expected
It looks like it's comparing without quotes. Why is it taking out the quotes? Any recommendations?
Also, how do I compare "INSTALL_SOFTWARE": true?
2 options:
* def nameAbc = {"raw_name": "ABC"}
* match response.accessible_device_types contains '#(^nameAbc)'
This works from 0.9.6.RC4 onwards:
* match response.accessible_device_types contains deep {"raw_name": "ABC"}
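The plain contains match fails because each expected item is compared against whole items in the actual list, and the actual entries carry extra keys (name, permission, ...); `^nameAbc` and contains deep instead ask whether some actual item merely includes the expected key/value pairs. A rough Python model of the two semantics (helper names are made up for illustration):

```python
def contains_exact(actual_list, expected_item):
    """Plain list `contains`: some actual item must equal the expected item exactly."""
    return any(item == expected_item for item in actual_list)

def contains_partial(actual_list, expected_item):
    """`^expected` / `contains deep`: some item need only include the expected pairs."""
    return any(
        all(item.get(k) == v for k, v in expected_item.items())
        for item in actual_list
    )

devices = [{"name": "ESS Client", "raw_name": "ABC", "permission": 7}]

print(contains_exact(devices, {"raw_name": "ABC"}))    # False: extra keys break equality
print(contains_partial(devices, {"raw_name": "ABC"}))  # True
```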

Querying nested arrays of hashes in Postgres JSON datatype?

So I've read through the Postgres JSON querying examples and I'm pretty sure there's no way to query our data as structured, but I wanted to make absolutely sure as it will require multiple changes to our code base :)
We are currently storing the following data in a serialized text field in Postgres. We would like to be able to query against the values, which are either nil or arrays of hashes [{a: 1, b: 2}, {c: 3, d: 4}], to see if the values contain a hash which has a certain value (in the example below, facebook or 102).
{ "statuses": nil,
"services_and_accounts": [
{
"id": "facebook",
"enabled": false
},
{
"id": 102,
"enabled": false
}
]
}
Is this possible in pure SQL using the JSON datatype?
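For comparison, as long as the column stays serialized text, the containment test has to happen client-side after parsing. A minimal sketch of the check the question describes (is there any hash in services_and_accounts with a given id), with the Ruby nil mapped to JSON null; this illustrates the desired predicate, not the Postgres-side answer:

```python
import json

row = json.loads("""
{"statuses": null,
 "services_and_accounts": [
   {"id": "facebook", "enabled": false},
   {"id": 102, "enabled": false}
 ]}
""")

def has_service(doc, wanted):
    """True if any hash in services_and_accounts carries the wanted id."""
    return any(entry.get("id") == wanted
               for entry in doc.get("services_and_accounts") or [])

print(has_service(row, "facebook"))  # True
print(has_service(row, 102))         # True
print(has_service(row, "twitter"))   # False
```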