I'm trying to send time-stamped key-value pairs to the ThingsBoard demo platform (demo.thingsboard.io). The standard way is to send a timestamp with some key-value pairs, like so:
{"ts":1451649600512, "values":{"key1":"value1", "key2":"value2"}}
My problem is that I need to process up to 100 acceleration measurements per second, and I don't want to send an HTTP POST for every x-y-z value package. Is there a way to send one JSON body with, let's say, 100 timestamps and their corresponding measurements?
I tried it:
{
"ts": 1508695100,
"values": {
"key1": 34,
"key2": 26
},
"ts": 1508695200,
"values": {
"key1": 38,
"key2": 29
}
}
There is no error message when pushing this JSON to ThingsBoard with curl, but only the last timestamp/values block seems to be recognized by ThingsBoard.
Any suggestions on how to solve my problem?
You should use the following format (a JSON array):
[{"ts":1451649600512, "values":{"key1":"value1", "key2":"value2"}}, {"ts":1451649600513, "values":{"key1":"value1", "key2":"value2"}}]
or
[
{
"ts":1451649600512,
"values":{
"key1":"value1",
"key2":"value2"
}
},
{
"ts":1451649600513,
"values":{
"key1":"value1",
"key2":"value2"
}
}
]
BTW, the JSON you tried is not a valid document: an object cannot usefully repeat the ts and values keys, and most parsers keep only the last pair, which is why only the last block shows up. Please check the validity of the document before sending.
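To get the 100 samples per second into one request, build that array on the client and send it in a single POST to the device telemetry endpoint (https://demo.thingsboard.io/api/v1/$ACCESS_TOKEN/telemetry). Below is a minimal sketch in TypeScript (Node 18+ with the global fetch); the access token, sample rate, and key names are placeholders for your own values. The same JSON body also works with curl -X POST -H "Content-Type: application/json" against the same URL.

// Batch many timestamped samples into one ThingsBoard telemetry POST.
// ACCESS_TOKEN and the 100 Hz sample generation below are placeholders.
const ACCESS_TOKEN = "YOUR_DEVICE_ACCESS_TOKEN";
const TELEMETRY_URL = `https://demo.thingsboard.io/api/v1/${ACCESS_TOKEN}/telemetry`;

interface Sample { ts: number; values: Record<string, number>; }

// Collect e.g. 100 accelerometer samples, each with its own timestamp.
const batch: Sample[] = [];
for (let i = 0; i < 100; i++) {
  batch.push({
    ts: Date.now() + i * 10, // 10 ms apart -> 100 Hz
    values: { x: Math.random(), y: Math.random(), z: Math.random() },
  });
}

// One HTTP POST carries the whole JSON array.
async function sendBatch(samples: Sample[]): Promise<void> {
  const res = await fetch(TELEMETRY_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(samples),
  });
  if (!res.ok) throw new Error(`ThingsBoard rejected the batch: ${res.status}`);
}

sendBatch(batch).catch(console.error);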
Is there any way to access properties of an object that was transformed into JSON through the jp (JSON parse) filter of Dust.js?
{
"response": {
"services": [
"{
\"prop1\":\"value1\",
\"prop2\":\"value2\",
\"prop3\":10
}"
]
}
}
For example, with the input above, I intend to receive the following output:
[
{
"prop1": "value1"
}
]
Note that the values inside the services array are strings, and because of that, before accessing the object's properties, I need to run the JSON parse (jp) filter.
[
{#response.services}
{
"prop1": "{.|jp}"
}{#sep}, {/sep}
{/response.services}
]
What I've developed so far is the code above, and this code is returning the following output:
[
{
"prop1": "[object Object]"
}
]
In short, what I need is to change this {.|jp} into something that lets me access the properties of the parsed object, without adding new filters.
Thanks in advance to everyone who is willing to help!
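One possible fallback, in case a template-only tweak of {.|jp} turns out not to be possible without a custom helper: parse the services strings in the calling code before the context is handed to dust.render, so the template can read the properties directly. This is only a hedged sketch (it touches the code that builds the context, not the template), and the parseServices name is illustrative:

// The values in response.services are JSON strings, so parse them once up
// front and render the template against the already-parsed context.
interface RawResponse { response: { services: string[] }; }
interface ParsedResponse { response: { services: Array<Record<string, unknown>> }; }

function parseServices(raw: RawResponse): ParsedResponse {
  return {
    response: {
      services: raw.response.services.map(s => JSON.parse(s)),
    },
  };
}

const raw: RawResponse = {
  response: { services: ['{ "prop1":"value1", "prop2":"value2", "prop3":10 }'] },
};

// With the parsed context the template no longer needs the jp filter:
//   [ {#response.services} { "prop1": "{prop1}" }{#sep}, {/sep} {/response.services} ]
console.log(JSON.stringify(parseServices(raw)));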
I have a JSON file temp.json like this:
{
"data": {
"stuff": [
.....
]
},
"time": {
"metrics": 83
}
}
I want to remove this particular block from the above JSON file:
,
"time": {
"metrics": 83
}
After removal, I want to write the new JSON back to the same file, so that the file's content will be:
{
"data": {
"stuff": [
.....
]
}
}
Is this possible to do by any chance?
Note: the number 83 can be any number in general.
Here's an excellent tutorial: Baeldung: Guide to Linux jq Command for JSON Processing.
Maybe you can try something like this: jq 'del(.time)' temp.json > temp2.json.
Note that jq works at the semantic level; it's not just text substitution. So things like the comma separator before the deleted block are removed from the output automatically when you use jq to delete the object.
Experiment, and see what works best for your particular scenario.
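jq itself has no in-place edit flag, so the usual shell pattern is to write to a second file and move it back over temp.json. If you would rather skip jq entirely, here is a minimal sketch of the same del(.time) idea in TypeScript (Node), assuming temp.json is small enough to read into memory:

// Remove the "time" block from temp.json and rewrite the same file.
// Equivalent in spirit to: jq 'del(.time)' temp.json > tmp && mv tmp temp.json
import { readFileSync, writeFileSync } from "fs";

const path = "temp.json";
const doc = JSON.parse(readFileSync(path, "utf8"));

delete doc.time; // works whatever number "metrics" happens to hold

// Re-serialize; the separating comma disappears automatically because we
// operate on the parsed structure, not on the raw text.
writeFileSync(path, JSON.stringify(doc, null, 2) + "\n");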
I have the following flow in NiFi; the JSON has 1000+ objects in it.
InvokeHTTP -> SplitJson -> PutMongo
The flow works fine until I receive keys in the JSON with "." in the name, e.g. "spark.databricks.acl.dfAclsEnabled".
My current solution is not optimal: I have jotted down the bad keys and am using multiple ReplaceText processors to replace "." with "_". I am not using regex, just literal find/replace, so each time the PutMongo processor fails, I insert a new ReplaceText processor.
This is not maintainable, and I am wondering if I can use JOLT for this. A couple of notes regarding the input JSON:
1) There is no set structure; the only thing that is confirmed is that everything will be in the events array, but each event object itself is free-form.
2) The maximum list size is 1000.
3) It is 3rd-party JSON, so I can't ask for a change in format.
Also, keys with "." can appear anywhere, so I am looking for a JOLT spec that can cleanse keys at every level and then rename them. A sample of the input:
{
"events": [
{
"cluster_id": "0717-035521-puny598",
"timestamp": 1531896847915,
"type": "EDITED",
"details": {
"previous_attributes": {
"cluster_name": "Kylo",
"spark_version": "4.1.x-scala2.11",
"spark_conf": {
"spark.databricks.acl.dfAclsEnabled": "true",
"spark.databricks.repl.allowedLanguages": "python,sql"
},
"node_type_id": "Standard_DS3_v2",
"driver_node_type_id": "Standard_DS3_v2",
"autotermination_minutes": 10,
"enable_elastic_disk": true,
"cluster_source": "UI"
},
"attributes": {
"cluster_name": "Kylo",
"spark_version": "4.1.x-scala2.11",
"node_type_id": "Standard_DS3_v2",
"driver_node_type_id": "Standard_DS3_v2",
"autotermination_minutes": 10,
"enable_elastic_disk": true,
"cluster_source": "UI"
},
"previous_cluster_size": {
"autoscale": {
"min_workers": 1,
"max_workers": 8
}
},
"cluster_size": {
"autoscale": {
"min_workers": 1,
"max_workers": 8
}
},
"user": ""
}
},
{
"cluster_id": "0717-035521-puny598",
"timestamp": 1535540053785,
"type": "TERMINATING",
"details": {
"reason": {
"code": "INACTIVITY",
"parameters": {
"inactivity_duration_min": "15"
}
}
}
},
{
"cluster_id": "0717-035521-puny598",
"timestamp": 1535537117300,
"type": "EXPANDED_DISK",
"details": {
"previous_disk_size": 29454626816,
"disk_size": 136828809216,
"free_space": 17151311872,
"instance_id": "6cea5c332af94d7f85aff23e5d8cea37"
}
}
]
}
I created a template using ReplaceText and RouteOnContent to perform this task. The loop is required because the regex only replaces the first . in the JSON key on each pass. You might be able to refine this to perform all substitutions in a single pass, but after fuzzing the regex with the look-ahead and look-behind groups for a few minutes, re-routing was faster. I verified this works with the JSON you provided, and also with JSON where the keys and values are on different lines (with the : on either line):
...
"spark_conf": {
"spark.databricks.acl.dfAclsEnabled":
"true",
"spark.databricks.repl.allowedLanguages"
: "python,sql"
},
...
You could also use an ExecuteScript processor with Groovy to ingest the JSON, quickly filter all JSON keys that contain ., perform a collect operation to do the replacement, and re-insert the keys in the JSON data if you want a single processor to do this in a single pass.
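The core of that single-pass script is just a recursive walk that renames any key containing a dot. Here is that logic sketched in TypeScript rather than Groovy (the function name and the replacement character are illustrative; in NiFi you would wrap the same idea inside ExecuteScript):

// Recursively replace "." with "_" in every object key, at any depth.
// Arrays are traversed too, so keys inside the free-form "events" objects are covered.
function cleanseKeys(node: unknown): unknown {
  if (Array.isArray(node)) {
    return node.map(cleanseKeys);
  }
  if (node !== null && typeof node === "object") {
    const out: Record<string, unknown> = {};
    for (const [key, value] of Object.entries(node as Record<string, unknown>)) {
      out[key.replace(/\./g, "_")] = cleanseKeys(value);
    }
    return out;
  }
  return node; // primitives pass through unchanged
}

// Example run against a trimmed-down event:
const input = JSON.parse(
  '{"events":[{"details":{"previous_attributes":{"spark_conf":{"spark.databricks.acl.dfAclsEnabled":"true"}}}}]}'
);
console.log(JSON.stringify(cleanseKeys(input)));
// -> {"events":[{"details":{"previous_attributes":{"spark_conf":{"spark_databricks_acl_dfAclsEnabled":"true"}}}}]}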
My goal is to retrieve JSON-type fields from a Solr index and also perform search queries on such fields.
I have the following documents in a Solr index, using the auto-generated schema (Solr's schemaless feature).
POST http://localhost:8983/solr/test1/update?commitWithin=1000
[
{"id" : "1", "type_s":"book", "title_t" : "The Way of Kings", "author_s" : "Brandon Sanderson",
"miscinfo": {"provider": "orielly", "site": "US"}
},
{"id" : "2", "type_s":"book", "title_t" : "The Game of Thrones", "author_s" : "James Sanderson",
"miscinfo": {"provider": "pacman", "site": "US"}
}
]
I see that the JSON objects are stored as strings in the schema field type, as seen in the output of the following:
GET http://localhost:8983/solr/test1/schema/fields
{
"name":"miscinfo",
"type":"strings"}
I had tried using srcField as mentioned in this post. However, a query to retrieve the JSON-type field returns an empty response. Below are the GET requests used:
GET http://localhost:8983/solr/test1/select?q=1&fl=miscinfo&wt=json
GET http://localhost:8983/solr/test1/select?q=1&fl=miscinfo,source_s:[json]&wt=json
Also, search queries for values inside the JSON-type fields return an empty response:
http://localhost:8983/solr/test1/select?q=pacman&wt=json
{
"responseHeader": {
"status": 0,
"QTime": 0,
"params": {
"q": "pacman",
"json": "",
"wt": "json"
}
},
"response": {
"numFound": 0,
"start": 0,
"docs": []
}
}
Please help in searching object types in Solr.
Have you checked this: https://cwiki.apache.org/confluence/display/solr/Response+Writers
JSON Response Writer: A very commonly used Response Writer is the JsonResponseWriter, which formats output in JavaScript Object Notation (JSON), a lightweight data-interchange format specified in RFC 4627. Setting the wt parameter to json invokes this Response Writer.
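The quoted documentation only covers how the response is serialized (wt=json), not how the nested miscinfo object gets indexed. One common workaround, offered purely as a hedged sketch rather than a confirmed Solr recipe, is to flatten the nested object into top-level dynamic string fields (*_s) on the client before posting, so each value becomes individually searchable; the miscinfo_provider_s naming below is illustrative:

// Flatten nested JSON objects into prefixed top-level fields before indexing,
// so schemaless Solr stores each leaf as its own searchable dynamic field.
// Field naming (prefix + "_" + key + "_s") is an illustrative convention.
function flatten(doc: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(doc)) {
    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      for (const [k, v] of Object.entries(value as Record<string, unknown>)) {
        out[`${key}_${k}_s`] = v; // e.g. miscinfo_provider_s
      }
    } else {
      out[key] = value;
    }
  }
  return out;
}

const book = {
  id: "2", type_s: "book", title_t: "The Game of Thrones",
  author_s: "James Sanderson",
  miscinfo: { provider: "pacman", site: "US" },
};

// POST this flattened document to /solr/test1/update instead of the nested
// original; a query like q=miscinfo_provider_s:pacman should then match,
// at the cost of no longer storing miscinfo as one JSON blob.
console.log(JSON.stringify(flatten(book), null, 2));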
I am thinking about using JSON as a way to communicate information inside my program, in a way suggested in this talk.
As I read the JavaScript Object Notation, I see no way of noting my object type.
Suppose I communicate the string { "val" : 5 }; how would I know what that string was for?
I would like to send the strings error = { "val" : 5 } and measurement = { "val" : 5 }, but as I read it, this would not be valid JSON notation.
Is the solution always something like { "type" : "error", "val" : 5 }, or am I missing some bigger concept in JavaScript Object Notation?
EDIT: oops, I did not use valid JSON in my examples; fixed that.
{
"type": "error",
"val": 5
}
That's the proper way to format your JSON.
If you have different types of values, then you can have an array looking like this:
[
{
"type": "error",
"val": 5
},
{
"type": "measurement",
"val": 45
}
]
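A small follow-up on why the type field is enough: the receiving side can parse the array and dispatch on it. A minimal TypeScript sketch (the handler logic and messages are just examples):

// Dispatch incoming messages on their "type" field.
type Message =
  | { type: "error"; val: number }
  | { type: "measurement"; val: number };

function handle(msg: Message): void {
  switch (msg.type) {
    case "error":
      console.log(`error code ${msg.val}`);
      break;
    case "measurement":
      console.log(`measured value ${msg.val}`);
      break;
  }
}

// The JSON array from above, parsed and processed one entry at a time.
const incoming: Message[] = JSON.parse(
  '[{"type":"error","val":5},{"type":"measurement","val":45}]'
);
incoming.forEach(handle);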
In JSON (and JavaScript in general), the key name identifies the type of the value. A JSON-like version of your examples:
{
'error': 5,
'measurement': 5
}