CouchDB returning invalid JSON

I'm trying to learn views in CouchDB from the book, but I keep coming across this issue: for some reason this is treated as a bad request and invalid JSON:
{
  "_id" : "_design/example",
  "language" : "javascript",
  "views" : {
    "foo" : {
      "map" : "function(doc) { if (doc.date && doc.title) { emit(doc.date, doc.title);}}"
    }
  }
}
Yet this is fine:
{
  "_id": "_design/tyres_used",
  "language": "javascript",
  "views": {
    "tyres": {
      "map": "function(doc) { if(doc.tyres && doc.client) {\n emit(doc.tyres, doc.client);\n} \n}"
    }
  }
}
The only way I can upload the document at the moment is to take the second script and swap in the words from the first; then it goes in. I have no idea what I did differently on the first one, though.
Apologies if this comes out looking wrong; I have never done this block-quote formatting before.

The first error you are receiving is because cURL can't find the file. Make sure the file is in the same directory where you're executing the command, or specify the relative/absolute path to the file.
Next, you'll face a Content-Type error: you need to tell CouchDB what type of data you are sending. To set the JSON Content-Type header with cURL, do the following:
curl -X PUT 127.0.0.1:5984/views_testing/_design/example -d @example.json -H "Content-Type: application/json"
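To verify the upload and exercise the view afterwards (a quick sketch, assuming the same database and design document names as above):
curl -X GET 127.0.0.1:5984/views_testing/_design/example
curl -X GET 127.0.0.1:5984/views_testing/_design/example/_view/foo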

Related

MS Graph batching create list item (SPO) Invalid batch payload format

I did see this same error in an open question about C#, but I'm using PowerShell and POST rather than PATCH, so I've opened a separate question.
I'm having an issue when using JSON batching: when I include the Content-Type header, I receive:
Invoke-RestMethod : {
"error": {
"code": "BadRequest",
"message": "Invalid batch payload format.",
"innerError": {
"date": "2020-10-14T00:25:46",
"request-id": "aa535dbb-efe8-450e-911d-143554ed9027",
"client-request-id": "aa535dbb-efe8-450e-911d-143554ed9027"
}
}
}
At first I had omitted the headers entirely, and was receiving this error:
{
"error": {
"code": "BadRequest",
"message": "Write request id : 2 does not contain Content-Type header or body.",
"innerError": {
"date": "2020-10-14T00:46:58",
"request-id": "3601be6d-a861-4947-936b-451cd9de80c3",
"client-request-id": "3601be6d-a861-4947-936b-451cd9de80c3"
}
}
}
The body of my HTTP request to https://graph.microsoft.com/v1.0/$batch is an array of PSCustomObjects that look like this:
id : 1
method : POST
url : sites/8c3cb1ef-4116-b0e4-6d0b-25d0f333a4ed/lists/a2b2d34e-6d32-df22-d562-472d3d8385d2/items
body : {
"fields": {
"DisplayName": "user1#contoso.com",
"CreatedDateTime": "2019-10-13",
"UserId": "c963d785-59fc-4384-5e7d-d466=2118e3347",
"UserType": "Guest",
}
}
headers : {
"Content-Type": "application/json"
}
I found it odd that when I omit the headers, the payload is seemingly OK, and Graph sees that Content-Type has not been supplied. Once I add it in, suddenly the payload is not OK.
If I perform the requests individually, with all the same data, it's also fine (items are created in the SPO list without issue). I feel like this confirms that the issue is not with the fields in the body.
I have no issues when batching GET requests (e.g. batching hundreds of requests for auditLogs/signIns). Those are essentially the same payload minus the body/headers, so just id, url, and method.
Has anyone experienced this and found a solution?
PS: The GUIDs are all fake.
My issue ended up being that I needed to use ConvertTo-Json's -Depth parameter to successfully capture all of my body's content. It was 5 levels deep (Requests: [ Each Request: { body: { fields: { field names/values } } } ]).
This was only an issue with POST requests, since those need a body; in this case the body was for adding SPO list items, which requires the fields key with child names and values for the list's columns/cells.
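A minimal sketch of the fix (the variable and placeholder names here are illustrative, not taken from the original script): build the request objects, then serialize with an explicit -Depth so the nested body/fields objects survive; ConvertTo-Json's default depth of 2 silently flattens anything deeper.
# Hypothetical request objects; the real script builds these from SPO data
$requests = @(
    @{
        id      = "1"
        method  = "POST"
        url     = "sites/{site-id}/lists/{list-id}/items"
        headers = @{ "Content-Type" = "application/json" }
        body    = @{ fields = @{ DisplayName = "user1@contoso.com"; UserType = "Guest" } }
    }
)

# -Depth must cover requests -> body -> fields -> values (5 levels here),
# otherwise the inner hashtables are rendered as type names instead of JSON
$payload = @{ requests = $requests } | ConvertTo-Json -Depth 5

# $token is assumed to hold a valid Graph access token
Invoke-RestMethod -Method Post -Uri "https://graph.microsoft.com/v1.0/`$batch" `
    -Headers @{ Authorization = "Bearer $token"; "Content-Type" = "application/json" } `
    -Body $payload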

Solr does not split JSON

I'm using Solr v8.0.0 and I'm trying to split a JSON document at indexing time using the method described in the official Solr documentation on transforming JSON, but it is not working as expected, and I'm getting flat documents at the end.
Here is how I'm doing it:
First I create a single core named c2:
bin/solr create_core -c c2
Then its solrconfig.xml is automatically created and left at the defaults.
Then I try to index some data using the example URL; the only difference is that I added ?commit=true to the end of the URL so we can see what's happening:
curl 'http://localhost:8983/solr/c2/update/json/docs'\
'?commit=true'\
'?split=/'\
'&f=first:/first'\
'&f=last:/last'\
'&f=grade:/grade'\
'&f=subject:/exams/subject'\
'&f=test:/exams/test'\
'&f=marks:/exams/marks'\
-H 'Content-type:application/json' -d '
{
"first": "John",
"last": "Doe",
"grade": 8,
"exams": [
{
"subject": "Maths",
"test" : "term1",
"marks" : 90},
{
"subject": "Biology",
"test" : "term1",
"marks" : 86}
]
}'
But in the end, I got this kind of indexing, not the one shown in the example:
What I Got:
{
  "first": ["John"],
  "last": ["Doe"],
  "grade": [8],
  "subject": ["Maths", "Biology"],
  "test": ["term1", "term1"],
  "marks": [90, 86],
  "id": "284878be-1339-43b5-8a1e-adb7a4be95fb",
  "_version_": 1664059760532520960
}
What I was supposed to get:
{
  "first": "John",
  "last": "Doe",
  "marks": 90,
  "test": "term1",
  "subject": "Maths",
  "grade": 8
}
{
  "first": "John",
  "last": "Doe",
  "marks": 86,
  "test": "term1",
  "subject": "Biology",
  "grade": 8
}
My fields were flattened, as would usually happen in a normal indexing run without the ?split=/ parameter in the URL. Can anyone help me figure out why this behavior is happening?
Thanks.
No, that's not the only difference. In your request you have:
'?split=/'\
In the example from the manual it is:
'?split=/exams'\
And since you're not splitting on /exams in your request, the result differs.
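For reference, here is a corrected version of the command (a sketch; note also that only the first query parameter should start with ?, the rest must be joined with &, otherwise the split parameter never reaches Solr). The JSON body from the question is assumed to be saved in a file here, with an illustrative filename:
curl 'http://localhost:8983/solr/c2/update/json/docs'\
'?commit=true'\
'&split=/exams'\
'&f=first:/first'\
'&f=last:/last'\
'&f=grade:/grade'\
'&f=subject:/exams/subject'\
'&f=test:/exams/test'\
'&f=marks:/exams/marks'\
 -H 'Content-type:application/json' -d @exams.json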

Groovy property not recognized in ElasticSearch Watcher Transform Script getting proper JSON

I'm using:
Elasticsearch 2.3
Watcher
Topbeat
The goal is to create a watch that, every x amount of time, runs a query, retrieves some hits, and posts them to a web server. This works fine. However, the response in {{ctx.payload.hits.hits}} isn't JSON, so I can't process it. The same issue seems to appear in some other threads, this being the most similar to mine.
So, this is my watch (the input works fine; the issue is in the action's transform script):
PUT _watcher/watch/running_process_watch
{
  "trigger" : {
    "schedule" : {
      "interval" : "10s"
    }
  },
  "input" : { ... },
  "actions" : {
    "ping_webhook": {
      "transform": {
        "script": "return [ body: groovy.json.JsonOutput.toJson(ctx.payload.hits.hits)]"
      },
      "webhook": {
        "method": "POST",
        "host": "localhost",
        "port": 4567,
        "path": "/register_data",
        "headers": {
          "Content-Type" : "application/json"
        },
        "body" : "data: {{ctx.payload.body}}"
      }
    }
  }
}
The error:
failed to execute [script] transform for [running_process_watch_0-2016-06-08T17:25:14.162Z]
ScriptException[failed to run inline script [return [ body: groovy.json.JsonOutput.toJson(ctx.payload.hits.hits)]] using lang [groovy]]; nested: MissingPropertyException[No such property: groovy for class: 1605d064acb49c10c464b655dacc9193f4e2e484];
at org.elasticsearch.script.groovy.GroovyScriptEngineService$GroovyScript.run(GroovyScriptEngineService.java:320)
at org.elasticsearch.watcher.transform.script.ExecutableScriptTransform.doExecute(ExecutableScriptTransform.java:74)
at org.elasticsearch.watcher.transform.script.ExecutableScriptTransform.execute(ExecutableScriptTransform.java:60)
at org.elasticsearch.watcher.transform.script.ExecutableScriptTransform.execute(ExecutableScriptTransform.java:41)
at org.elasticsearch.watcher.actions.ActionWrapper.execute(ActionWrapper.java:94)
at org.elasticsearch.watcher.execution.ExecutionService.executeInner(ExecutionService.java:388)
at org.elasticsearch.watcher.execution.ExecutionService.execute(ExecutionService.java:273)
at org.elasticsearch.watcher.execution.ExecutionService$WatchExecutionTask.run(ExecutionService.java:438)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Any idea how to make groovy.json usable inside the watcher action script? Or any other way to return proper JSON from ctx.payload.hits.hits?
So, I opened an issue in the Elasticsearch repo. After some discussion, a native toJson function is going to be implemented in the mustache templating engine so that it renders JSON by default.
Here is the pull request.
Hopefully it will be ready in the next release.
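For anyone reading this on a later release: once that change shipped, the payload can be rendered as JSON directly in the webhook body with the toJson mustache extension, roughly like this (a sketch; check the Watcher docs for your version, and no script transform should be needed then):
"body" : "{{#toJson}}ctx.payload.hits.hits{{/toJson}}"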

Getting an empty array back from MongoDB using curl although the database is filled with data

I am following this tutorial: https://thinkster.io/mean-stack-tutorial and adapting it to a side project to get a better understanding of how the MEAN stack works.
I have created a database and imported a JSON file into it using mongoimport: mongoimport --db cmt2 --collection Course --type json --file myFile.json --jsonArray
I have declared my database using: mongoose.connect('mongodb://localhost/cmt2');
When I curl my server after starting it with npm start (curl http://localhost:3000/courses), I only get an empty array ([]) in return.
But I am sure I have imported the data into the db, because when I go to the mongo shell, open the cmt2 database, and run db.Course.find(), I get back all my data.
My GET route for courses is also probably correct, since it works when I point at another db I used for testing before cmt2: mongoose.connect('mongodb://localhost/cmt');
It's probably something stupid about collections that I don't understand, because I name them randomly and I have no idea where in my code I actually declare which collection I want to work with.
For reference, here is the code I use for my get request :
router.get('/courses', function(req, res, next) {
  Course.find(function(err, courses) {
    if (err) { return next(err); }
    res.json(courses);
  });
});
And here is an example of the data in the JSON I have imported into the db:
{
  "code": "ABCD ",
  "name": "ABCDE",
  "list": [
    {
      "code": "ABCDEF ",
      "name": "ABCDEFG"
    },
    {
      "code": "BCDEF ",
      "name": "BCDEFG"
    }
  ]
}
EDIT:
The major difference between cmt (the database I can curl) and cmt2 (the one I can't) is that I used embedded models with cmt and didn't with cmt2.
I was importing to the wrong localhost.
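If anyone else hits this: make sure mongoimport is pointed at the same mongod instance that your mongoose.connect URI targets. A sketch, with the host/port values being illustrative defaults:
mongoimport --host 127.0.0.1 --port 27017 --db cmt2 --collection Course --type json --file myFile.json --jsonArray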

Node.js SOAP client parameter formatting

I'm having trouble properly formatting one particular soap parameter using the node-soap module for node.js as a client, to a 3rd-party SOAP service.
The client.describe() for this method says this particular input should be in the shape of:
params: { 'param[]': {} }
I have tried a bunch of different JSON notations to try to fit my data to that shape.
Examples of formats that do NOT work:
"params": { "param": [ {"myParameterName": "myParameterValue"} ] }
"params": [ "param": { "name": "myParameterName", "_": "myParameterValue"} ]
"params": { "param" : [ {"name": "myParameterName", "_": "myParameterValue"} ] }
"params": { "param[]": {"myParameterName": "myParameterValue" } }
"params": { "param[myParameterName]": {"_": "myParameterValue" } }
I must be overlooking something, and I suspect I'm going to feel like Captain Obvious when some nice person points out what I'm doing wrong.
Here is what DOES work, using other SOAP clients, and how they handle the named-parameter-with-value case:
soapUI for this method successfully accepts this particular input via XML in the shape of:
<ns:params>
<ns:param name="myParameterName">myParameterValue</ns:param>
</ns:params>
Also, using PHP, I can successfully make the call by creating a stdClass of arrays like so:
$parms = new stdClass;
$parms->param = array(
    array(
        "name" => "myParameterName", "_" => "myParameterValue"
    )
);
and then eventually passing
'params' => $parms
to the PHP soap client
Many thanks!
To get a better look at what XML was being generated by node-soap, I added a console.log(message) statement to the node_modules/soap/lib/client.js after the object-to-XML encoding. I then began experimenting with various JSON structures to figure out empirically how they were mapping to XML structures.
I found a JSON structure that gets node-soap to generate the XML in my 3rd party's required named-parameter-with-value format. I was completely unaware of the "$value" special keyword; it looks like this may have been added in the 0.4.6 release from mid-June 2014 (see the change history).
"params": [
{
"param": {
"attributes": {
"name": "myParameterName"
},
$value: "myParameterValue"
}
}
]
(note the outer array, which gives me the luxury of specifying multiple "param" entries, which is sometimes needed by this particular 3rd-party API)
generates this XML:
<tns:params>
<tns:param name="myParameterName">myParameterValue</tns:param>
</tns:params>
which perfectly matches the structure in soapUI (which I already knew worked) of:
<ns:params>
<ns:param name="myParameterName">myParameterValue</ns:param>
</ns:params>
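As a follow-up to the note above about the outer array: a sketch of what multiple named parameters could look like (the parameter names and values here are purely illustrative, not from the actual 3rd-party API):
"params": [
  {
    "param": {
      "attributes": { "name": "firstParameterName" },
      "$value": "firstParameterValue"
    }
  },
  {
    "param": {
      "attributes": { "name": "secondParameterName" },
      "$value": "secondParameterValue"
    }
  }
]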