mongoimport - an empty object { } field value imports as null value - json

I want to manually add some data to a new MongoDB database. To do this, I write JSON files and import them with mongoimport.
This is an example of an original file:
{
  "val": {}
}
Yet, in the new database, the value {} is turned into null:
{
  "_id": ObjectId("6023b9a532d713e97f5dc70c"),
  "val": null
}
I don't understand why this is happening. Is there a way to prevent this?
Due to some restrictions I have to use the --legacy flag on mongoimport.
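For reference, the import was run along these lines (database, collection, and file names here are placeholders, not the exact command):
mongoimport --legacy -d mydb -c mycol --file val.json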
My versions are:
$ mongoimport --version
mongoimport version: 100.2.0
$ mongod --version
db version v4.4.1

Related

cbimport not importing file which is extracted from cbq command

I extracted data with the cbq command below, which ran successfully.
cbq -u Administrator -p Administrator -e "http://localhost:8093" --script='SELECT * FROM `sample` WHERE customer.id=="12345"' -q | jq '.results' > temp.json
However, when I try to import the same data in JSON format into the target cluster using the command below, I get an error.
cbimport json -c http://{target-cluster}:8091 -u Administrator -p Administrator -b sample -d file://C:\Users\{myusername}\Desktop\temp.json -f list -g %docId%
JSON import failed: 0 documents were imported, 0 documents failed to be imported
JSON import failed: input json is invalid: ReadArray: expect [ or , or ] or n, but found {, error found in #1 byte of ...|{
"requ|..., bigger context ...|{
"requestID": "2fc34542-4387-4643-8ae3-914e316|...
The extracted temp.json looks like this:
{
  "requestID": "6ef38b8a-8e70-4c3d-b3b4-b73518a09c62",
  "signature": {
    "*": "*"
  },
  "results": [
    {
      "{Bucket-name}": {my-data}
    }
  ],
  "status": "success",
  "metrics": {
    "elapsedTime": "4.517031ms",
    "executionTime": "4.365976ms",
    "resultCount": 1,
    "resultSize": 24926
  }
}
It looks like the file extracted by the cbq command contains control fields such as requestID, metrics, and status, and the JSON is pretty-printed. If I manually remove everything except {my-data}, put it in a JSON file, and make the JSON unpretty, then it works. But I want to automate this in a single run. Is there a way to do it in the cbq command?
I couldn't find any other utility, or a way to use a WHERE condition with cbexport, to do this in Couchbase; documents exported using cbexport can be imported easily using cbimport.
For the cbq command, you can use the --quiet option to disable the startup connection messages and --pretty=false to disable pretty-printing. Then, to extract just the documents in cbimport's JSON lines format, I used jq.
This worked for me -- selecting documents from travel-sample._default._default (for the jq filter, where I have _default, you would put the Bucket-name, based on your example):
cbq --quiet --pretty=false -u Administrator -p password --script='select * from `travel-sample`._default._default' | jq --compact-output '.results|.[]|._default' > docs.json
Then, importing into test-bucket1:
cbimport json -c localhost -u Administrator -p password -b test-bucket1 -d file://./docs.json -f lines -g %type%_%id%
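Before importing, it can help to confirm that docs.json really contains one compact JSON document per line, matching the -f lines format, for example:
head -2 docs.json
wc -l docs.json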
cbq documentation: https://docs.couchbase.com/server/current/tools/cbq-shell.html
cbimport documentation: https://docs.couchbase.com/server/current/tools/cbimport-json.html
jq documentation:
https://stedolan.github.io/jq/manual/#Basicfilters

How to generate a 'label' using a json file in app configuration service?

I'm trying to import a JSON file into the Azure App Configuration service using the CLI command:
az appconfig kv import.
Sample JSON file:
{
  "Pss": {
    "account/getall/get": "read",
    "account/setall/put": "write",
    "account/someendpoint/somevalue": "profile"
  }
}
I can see the following preview in the CLI:
Adding:
{"key": "Pss:account/getall/get", "value": "\"read\""}
{"key": "Pss:account/setall/put", "value": "\"write\""}
{"key": "Pss:account/someendpoint/somevalue", "value": "\"profile\""}
The labels are created as (No label) in the App Configuration service.
Could you please suggest what changes need to be made to the JSON file to generate label values?
Thanks in advance.
The command below will set the label name:
az appconfig kv import --name hkappconfig --label testingLabelName --source file --path /home/hari/Import.json --format json --separator . --content-type "application/json"
By adding the --label labelName attribute to the az CLI import command, you will see the label name in the Configuration explorer of App Configuration. Note that the label is not read from the JSON file itself; it is applied to all key-values imported by that command.
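To verify, you can list the imported key-values filtered by that label (store and label names as in the example above):
az appconfig kv list --name hkappconfig --label testingLabelName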

How can I pass Rundeck variables to a JSON file?

I have a JSON file with key-value pairs, and I want to access the values from Rundeck options dynamically during job execution.
For a shell script, we can use $RD_OPTION_<name>.
Similarly is there some format I can use in a JSON file?
Just use #option.myoption# in an inline-script step.
You need a tool on an inline-script step to manipulate JSON files on Rundeck. I made an example using jq. Alternatively, you can use bash script-fu to reach the same goal.
For example, using this JSON file:
{
  "books": [{
    "fear_of_the_dark": {
      "author": "John Doe",
      "genre": "Mistery"
    }
  }]
}
Update the file with the following jq call:
To test directly in your terminal
jq '.books[].fear_of_the_dark += { "ISBN" : "9999" }' myjson.json
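Against the sample file above, that filter prints the updated document with the new field appended:
{
  "books": [
    {
      "fear_of_the_dark": {
        "author": "John Doe",
        "genre": "Mistery",
        "ISBN": "9999"
      }
    }
  ]
}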
On Rundeck Inline-script
echo "$(jq ''.books[].fear_of_the_dark += { "ISBN" : "#option.isbn#" }'' myjson.json)" > myjson.json
Check how it looks in an inline-script job (see the Rundeck documentation for how to import the job definition into your Rundeck instance).
- defaultTab: nodes
  description: ''
  executionEnabled: true
  id: d8f1c0e7-a7c6-43d4-91d9-25331cc06560
  loglevel: INFO
  name: JQTest
  nodeFilterEditable: false
  options:
  - label: isbn number
    name: isbn
    required: true
  plugins:
    ExecutionLifecycle: null
  scheduleEnabled: true
  sequence:
    commands:
    - description: original file content
      exec: cat myjson.json
    - description: pass the option and save the content to the json file
      fileExtension: .sh
      interpreterArgsQuoted: false
      script: 'echo "$(jq ''.books[].fear_of_the_dark += { "ISBN" : "#option.isbn#"
        }'' myjson.json)" > myjson.json'
      scriptInterpreter: /bin/bash
    - description: modified file content (after jq)
      exec: cat myjson.json
    keepgoing: false
    strategy: node-first
  uuid: d8f1c0e7-a7c6-43d4-91d9-25331cc06560
Finally, check the result.
You can read more about executing scripts on Rundeck and about the jq tool in their respective documentation.

Use a mysql command to Insert a row into a db table using Terraform

I am using Terraform and want to insert a row into a database table. I am using a null_resource with a mysql command, and I was trying to figure out what the command should look like in Terraform.
I think the issue is with the mysql command; it fails with:
missing delimiter for 'u' glob qualifier?
variable "environment" {
default = "prod"
}
resource "null_resource" "example" {
provisioner "local-exec" {
command = "-u${username} -p${password} --port port_number -h 00.0.0.0 INSERT into table (column1,column2) VALUES(${var.values},${var.values});"
interpreter = ["mysql", "-Command"]
environment = "${var.environment}"
}
}
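For comparison, a working invocation usually runs mysql itself as the command and passes the SQL statement with -e, rather than using mysql as the interpreter. A minimal sketch of the shell command (host, port, table, and column names here are illustrative, not from the question):
mysql -u "$DB_USER" -p"$DB_PASS" --port 3306 -h 00.0.0.0 -e "INSERT INTO mytable (column1, column2) VALUES ('value1', 'value2');"
That full string would then go into the command argument of the local-exec provisioner, with the Terraform variables interpolated as ${var.username} and so on.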

Can't upsert JSON with mongoimport

I want to use JSON to batch upsert to a mongo collection.
$ mongoexport -d myDB -c myCollection
connected to: 127.0.0.1
{ "_id" : "john", "age" : 27 }
But using the syntax I would use in the mongo shell yields:
0$ echo '{_id:"john", {$set:{gender:"male"}}' | mongoimport --upsert --upsertFields _id -d myDB -c myCollection
connected to: 127.0.0.1
Fri Jul 27 15:01:32 Assertion: 10340:Failure parsing JSON string near: , {$set:{g
0x581a52 0x528554 0xa9f2e3 0xaa1593 0xa980cd 0xa9c062 0x3e7ca1ec5d 0x4fe239
...
/lib64/libc.so.6(__libc_start_main+0xfd) [0x3e7ca1ec5d] mongoimport(__gxx_personality_v0+0x3c9) [0x4fe239]
exception:Failure parsing JSON string near: , {$set:{g
imported 0 objects
encountered 1 error
When I try it without the curly brackets, it yields no error but doesn't change the table:
0$ echo '{_id:"john", $set:{gender:"male"}}' | mongoimport --upsert --upsertFields _id -d myDB -c myCollection
connected to: 127.0.0.1
imported 1 objects
0$ mongoexport -d myDB -c myCollection
connected to: 127.0.0.1
{ "_id" : "john", "age" : 27 }
exported 1 records
I've searched everywhere but can't find an example using JSON. Please help!
To the best of my knowledge, MongoImport doesn't evaluate commands.
Just to add to Andre's answer.
Mongoimport takes a single file that contains one JSON/CSV/TSV string per line and inserts it. You can pipe documents in from standard input, but not a command like the one above. You can use mongoimport to perform an upsert, as described in the mongoimport documentation.
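As a sketch of what that upsert looks like (database and collection names from the question): since update modifiers are rejected, you supply the complete replacement document and let --upsert match on _id:
echo '{"_id": "john", "age": 27, "gender": "male"}' | mongoimport --upsert --upsertFields _id -d myDB -c myCollection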
You can run mongoimport with the stoponError option, which will force mongoimport to stop when it encounters an error.
Here's the complete manual for mongoimport and, as an FYI, mongoimport doesn't reliably preserve all rich BSON data types.
Mongoimport does not take modifiers such as your $set. You will need to use the mongo --eval command to update.
mongo myDB --eval 'db.myCollection.update({_id: "john"}, {$set:{gender:"male"}}, upsert=true)'
Hope this helps.