When copying a database from one host to another, I get the following error: Missing JSON list of 'docs'
Here is what I do :
source> curl -X GET http://127.0.0.1:5984/cozy/_all_docs?include_docs=true > cozy.dump
destination> curl -X PUT http://127.0.0.1:5984/cozy
{"ok":true}
destination> curl -d @cozy.dump -H "Content-type: application/json" -X POST http://localhost:5984/cozy/_bulk_docs
{"error":"bad_request","reason":"Missing JSON list of 'docs'"}
Any ideas?
Thanks!
This is, indeed, a problem with versions.
Fortunately it is fairly easy to fix: just change the first line of the dump, e.g.
{"total_rows": 8244, "offset": 0, "rows": [
to
{"docs": [
The dump can then be used with later versions.
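If the dump is large, the header line can be rewritten in place with sed. This is only a sketch, assuming (as the answer above says) that the first line is the only thing that needs to change; the dump contents here are a tiny made-up stand-in:

```shell
# Create a miniature stand-in for a real _all_docs dump
printf '%s\n' '{"total_rows": 8244, "offset": 0, "rows": [' \
              '{"_id": "some-doc", "value": 1}' \
              ']}' > cozy.dump
# Replace the header line with the form _bulk_docs expects
sed -i '1s/.*/{"docs": [/' cozy.dump
head -n 1 cozy.dump
# first line is now: {"docs": [
```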
I know this is an old question, but I am still posting an answer in case someone else is looking for the solution. The _bulk_docs API accepts the request body in a specific form:
{"docs": [{}, {}, {}]}
The "docs" key must contain an array of the documents to be bulk inserted. What the OP did with
curl -X GET http://127.0.0.1:5984/cozy/_all_docs?include_docs=true > cozy.dump
was to simply store the CouchDB response, which has the format
{
  "total_rows": 4,
  "offset": 0,
  "rows": [....]
}
into the cozy.dump file. As we have seen above, this file is not in a form that can be consumed by the _bulk_docs API. Hence the error
{"error":"bad_request","reason":"Missing JSON list of 'docs'"}
CouchDB needs a JSON list of docs to perform the bulk insert.
Another point to note here is that if you supply _id and _rev fields, CouchDB performs a bulk update rather than a bulk insert. If you just want to copy one database to another, use replication: http://wiki.apache.org/couchdb/Replication
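For reference, a minimal _bulk_docs payload looks like this (the document contents are made up for illustration, and the final curl line assumes a CouchDB at the usual address, so it is shown commented out):

```shell
# Build a payload with the required top-level "docs" array
cat > payload.json <<'EOF'
{"docs": [{"name": "alice"}, {"name": "bob"}]}
EOF
# Sanity-check that it is well-formed JSON before POSTing
python3 -m json.tool payload.json
# Against a running CouchDB (not executed here):
# curl -d @payload.json -H "Content-Type: application/json" \
#      -X POST http://127.0.0.1:5984/cozy/_bulk_docs
```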
Related
I'm using curl to send some json data. Part of the data is a request counter that needs to be incremented after each call.
I would like to reduce the code below by incrementing it right after evaluating it. I'm not sure how to format the variable within the json string though.
Thank you in advance!
#!/bin/bash
reqcnt=0
curl http://myurl.com --data-binary '{"requestCounter":'${reqcnt}'}'
((reqcnt++))
Expected:
#!/bin/bash
reqcnt=0
curl http://myurl.com --data-binary '{"requestCounter":'${((reqcnt++)}'}'
Edit
Taking into account the great answer by Inian, I noticed there are cases where I need to save the output of curl. For some reason the arithmetic operation is not performed on the variable in that case:
res=$(curl http://myurl.com --data-binary '{"requestCounter":'"$((reqcnt++))"'}')
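One likely explanation (an assumption, since the surrounding script isn't shown): `$( ... )` runs its command in a subshell, so `reqcnt++` increments the subshell's copy of the variable and the parent shell never sees the change. A minimal sketch, with `echo` standing in for the curl call:

```shell
reqcnt=0
# The arithmetic expansion runs inside the command-substitution subshell,
# so only the subshell's copy of reqcnt is incremented.
res=$(echo "$((reqcnt++))")
echo "res=$res reqcnt=$reqcnt"   # res=0 reqcnt=0
# Increment in the parent shell instead, after capturing the output:
reqcnt=$((reqcnt+1))
echo "reqcnt=$reqcnt"            # reqcnt=1
```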
I want to use InfluxDB in a business-intelligence context:
ETL, joining data from other databases, creating live dashboards.
Right now we are using standard BI tools such as Qlik or Microsoft Power BI.
According to the documentation, the HTTP API should be used for querying (https://docs.influxdata.com/influxdb/v1.2/guides/querying_data/).
My problem is that the API seems to output JSON only. That means every analyst first has to figure out how to transform the JSON into table format before joining other data etc.
Is it possible to tell the API to produce CSV-like table output?
Do you have recommendations for tools to build good dashboards? I tried Grafana, but it seemed to fall short when joining other data.
You can use -H "Accept: application/csv" in your curl request to get the response as CSV. For instance:
$ curl -G 'http://localhost:8086/query' --data-urlencode "db=my_db" --data-urlencode "q=SELECT * FROM \"cpu\"" -H "Accept: application/csv"
name,tags,time,host,region,value
cpu,,1493031640435991638,serverA,us_west,0.64
You can use jq to convert the JSON output to CSV as follows, which also allows you to get RFC3339 formatted timestamps:
jq -r '(.results[0].series[0].columns), (.results[0].series[0].values[]) | @csv'
which gives the output
"time","ppm","T"
"2019-01-17T19:45:00Z",864.5,18.54
"2019-01-17T19:50:00Z",861.4,18.545
"2019-01-17T19:55:00Z",866.2,18.5
"2019-01-17T20:00:00Z",863.9,18.47
and works because:
(.results[0].series[0].columns) gets the column names as an array
, concatenates the outputs
(.results[0].series[0].values[]) gets the data values as arrays
| @csv pipes them through the jq CSV formatter
-r is used to get raw output
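To see the filter in action on a canned response (the field names and values below are made up for illustration):

```shell
# A miniature InfluxDB-style JSON response
cat > resp.json <<'EOF'
{"results":[{"series":[{"columns":["time","ppm"],
  "values":[["2019-01-17T19:45:00Z",864.5]]}]}]}
EOF
jq -r '(.results[0].series[0].columns), (.results[0].series[0].values[]) | @csv' resp.json
# "time","ppm"
# "2019-01-17T19:45:00Z",864.5
```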
Further resources:
Use https://jqplay.org/ to build queries
Other examples: Convert JSON array into CSV using jq
https://unix.stackexchange.com/questions/429241/convert-json-to-csv
How to convert arbitrary simple JSON to CSV using jq?
I'm new to couchdb and I want to populate my database using Git Bash. I made a text document with the following:
{
"_id" : "_design/example",
"views" : {
"foo" : {
"map" : "function(doc){ emit(doc._id, doc._rev)}"
}
}
}
I named it design.json and in Git Bash I type curl -X PUT http://<username>:<password>@localhost:5984/testdb/_design/example --data-binary ‘@design.json’
My problem is that instead of adding the file to my database I get a warning:
Warning: Couldn't read data from file "design.json", this makes an empty POST.
followed by an error:
{"error":"bad_request","reason":"invalid_json"}
The JSON you posted plus the curl call work for me – but of course only if I change the ‘...’ smart quotes from your posting to '...'. In the absence of another reason to fail, I suggest this is the problem.
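A quick way to see why the smart quotes break things: the shell treats typographic quotes as ordinary characters, so they travel inside the argument instead of delimiting it. A small demonstration:

```shell
# Straight quotes delimit the argument; smart quotes become part of it.
printf '%s\n' 'design.json'    # prints: design.json
printf '%s\n' ‘design.json’    # prints: ‘design.json’
```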
I've researched this and I still can't quite get it right as it says my POST fields are not set or empty. So at a guess this would be a syntax problem?
I have two fields I'm trying to POST, one called "app_hash" which is a string and one called "data" which is a well formatted JSON array containing the data.
So far I have:
curl -H "Content-Type: application/json" -X POST -d '{"app_hash":"ThisIsAnAppHash123456","data":"{schedule:{schedule_id:"93",round1:"0",round2:"0",round3:"0",round4:"0",start_prompt:"0",notify_taken:"0",notify_missed:"0"}}"}' https://myurl.com/app/save_settings.php --verbose
I have set error messages to be returned in JSON to help me diagnose the issue and it definitely says the PHP script I'm trying to CURL thinks that my POST fields are empty or blank.
Any help would be greatly appreciated and if you could explain why I haven't got it right yet it would justify the amount of time I've spent researching this haha. Thank You.
The problem is most likely your use of the option -d, which doesn't do quite what one might guess.
The -d option is equivalent to --data-ascii: it sends the data with Content-Type application/x-www-form-urlencoded (it does not actually URL-encode anything), and when reading from a file with @ it strips carriage returns and newlines. What you want is --data-binary, which sends its argument unchanged.
Yes, I think the options are unfortunately named; yes, I think it's unfortunate that --data-ascii is the one abbreviated to -d; yes, this has caught me out on more than one occasion before.
So, a very basic question about Elasticsearch that the docs don't answer very clearly (because they go into many advanced details but miss the basic ones!).
Example: range query
http://www.elasticsearch.org/guide/reference/query-dsl/range-query.html
It doesn't say how to PERFORM the range query. Is it via the search endpoint?
And if so, how do I do it via the query string? I mean, I want to do a GET, not a POST (because it's a query, not an insertion/modification). However, the documentation for GET requests doesn't say how to use JSON as in the range sample:
http://www.elasticsearch.org/guide/reference/api/search/uri-request.html
What am I missing?
Thanks
Use the Lucene query syntax:
curl -X GET 'http://localhost:9200/my_index/_search?q=my_field:[0+TO+25]&pretty'
Let's assume we have an index
curl -XPUT localhost:9200/test
And some documents
curl -XPUT localhost:9200/test/range/1 -d '{"age": 9}'
curl -XPUT localhost:9200/test/range/2 -d '{"age": 12}'
curl -XPUT localhost:9200/test/range/3 -d '{"age": 16}'
Now we can query these documents within a certain range via
curl -XGET 'http://localhost:9200/test/range/_search?pretty=true' -d '
{
"query" : {
"range" : {
"age" : {
"from" : "10",
"to" : "20",
"include_lower" : true,
"include_upper": true
}
}
}
}
'
This will return the documents 2 and 3.
I'm not sure if there is a way to perform this kind of complex query via a URI request, though.
Edit: Thanks to karmi, here is the solution without a JSON request body:
curl -XGET --globoff 'localhost:9200/test/range/_search?q=age:["10"+TO+"20"]&pretty=true'
Replying to myself, thanks to @javanna:
In the RequestBody section of the Search docs:
http://www.elasticsearch.org/guide/reference/api/search/request-body.html
At the end, it says:
The rest of the search request should be passed within the body itself. The body content can also be passed as a REST parameter named source.
So I guess that I need to use the search endpoint with the source parameter to pass the JSON.