I have the JSON data below and need to print it as a table. I have managed to come up with the following jq query in Bash to print the data in the total section, but I am unable to get the headers, including the AN/BN part (what's the technical term for those?). Total is the sum of the sub-values inside it.
required format for output:
Name Total
-----------------------------------
AN xxxxxxx
BN xxxxxxx
My current command:
curl -s 'https://url.json' | jq '.[] | .total | "\(.confirmed)-\(.deceased)-\(.recovered)-\(.tested)"'
DATA:
> { "AN": {
> "delta7": {
> "confirmed": 238,
> "deceased": 2,
> "recovered": 199,
> "tested": 9953,
> "vaccinated": 24243
> },
> "districts": {
> "Unknown": {
> "delta7": {
> "confirmed": 238,
> "deceased": 2,
> "recovered": 199,
> "tested": 9953
> },
> "meta": {
> "tested": {
> "last_updated": "2021-04-21",
> "source": "https://dhs.andaman.gov.in/NewEvents/642.pdf"
> }
> },
> "total": {
> "confirmed": 5527,
> "deceased": 65,
> "recovered": 5309,
> "tested": 357442
> }
> }
> },
> "meta": {
> "last_updated": "2021-04-23T00:10:19+05:30",
> "population": 397000,
> "tested": {
> "last_updated": "2021-04-21",
> "source": "https://dhs.andaman.gov.in/NewEvents/642.pdf"
> }
> },
> "total": {
> "confirmed": 5527,
> "deceased": 65,
> "recovered": 5309,
> "tested": 357442,
> "vaccinated": 91977
> } }, {
> "BN": {
> "delta7": {
> "confirmed": 238,
> "deceased": 2,
> "recovered": 199,
> "tested": 9953,
> "vaccinated": 24243
> },
> "districts": {
> "Unknown": {
> "delta7": {
> "confirmed": 238,
> "deceased": 2,
> "recovered": 199,
> "tested": 9953
> },
> "meta": {
> "tested": {
> "last_updated": "2021-04-21",
> "source": "https://dhs.andaman.gov.in/NewEvents/642.pdf"
> }
> },
> "total": {
> "confirmed": 5527,
> "deceased": 65,
> "recovered": 5309,
> "tested": 357442
> }
> }
> },
> "meta": {
> "last_updated": "2021-04-23T00:10:19+05:30",
> "population": 397000,
> "tested": {
> "last_updated": "2021-04-21",
> "source": "https://dhs.andaman.gov.in/NewEvents/642.pdf"
> }
> },
> "total": {
> "confirmed": 5527,
> "deceased": 65,
> "recovered": 5309,
> "tested": 357442,
> "vaccinated": 91977
> } } }
With your input (once it has been corrected):
jq -r 'to_entries[] |
[.key,
(.value | .total | "\(.confirmed)-\(.deceased)-\(.recovered)-\(.tested)")] | @tsv' input.json
AN 5527-65-5309-357442
BN 5527-65-5309-357442
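If you also want the header row the question asks for, and would rather post-process in Node than extend the jq filter, the same `to_entries`-style walk can be sketched as follows (a trimmed-down sample of the corrected input is hardcoded here for illustration):

```javascript
// Trimmed-down sample of the corrected input: one object keyed by state.
const data = {
  AN: { total: { confirmed: 5527, deceased: 65, recovered: 5309, tested: 357442 } },
  BN: { total: { confirmed: 5527, deceased: 65, recovered: 5309, tested: 357442 } }
};

// Build the table: header, separator, then one row per top-level key.
const lines = ['Name\tTotal', '-----------------------------------'];
for (const [name, value] of Object.entries(data)) {
  const t = value.total;
  lines.push(`${name}\t${t.confirmed}-${t.deceased}-${t.recovered}-${t.tested}`);
}
console.log(lines.join('\n'));
```

`Object.entries` plays the same role here as jq's `to_entries[]`: it turns each top-level key into a `[name, value]` pair.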
I have data in JSON format; a sample is given below. I want to calculate the min and max range for the values inside the "variable" object, i.e. align_X, align_Y, align_Z.
{"id": 0, "variable": {"align_X": 41, "align_Y": 51, "align_Z": 80}}
{"id": 1, "variable": {"align_X": 0}}
{"id": 2, "variable": {"align_Y": 1, "align_Z": 0}}
Desired output is:
"align_X": [
0.0,
41.0
],
"align_Y": [
1.0,
51.0
],
"align_Z": [
0.0,
80.0
]
Any help is appreciated.
Thank you.
You can try this:
var test = [{
"id": 0,
"variable": {
"align_X": 41,
"align_Y": 51,
"align_Z": 80
}
}, {
"id": 1,
"variable": {
"align_X": 0
}
}, {
"id": 2,
"variable": {
"align_Y": 1,
"align_Z": 0
}
}
]
var result = test.reduce((acc, cobj) => {
  for (const key in cobj.variable) {
    const val = parseFloat(cobj.variable[key]);
    if (acc[key]) {
      // widen the existing [min, max] range
      acc[key][0] = Math.min(acc[key][0], val);
      acc[key][1] = Math.max(acc[key][1], val);
    } else {
      // first time we see this key: min and max are both this value
      acc[key] = [val, val];
    }
  }
  return acc;
}, {});
console.log(result);
My solution
#Your input
d = [{"id": 0, "variable": {"align_X": 41, "align_Y": 51, "align_Z": 80}},
{"id": 1, "variable": {"align_X": 0}},
{"id": 2, "variable": {"align_Y": 1, "align_Z": 0}}
]
# To store the result: initialise the min slot to float('inf') and the max slot to float('-inf')
output = {
"align_X" : [float('inf'),float('-inf')],
"align_Y" : [float('inf'),float('-inf')],
"align_Z" : [float('inf'),float('-inf')]
}
for item in d:
    for each in output.keys():
        try:
            if item['variable'][each] < output[each][0]:
                output[each][0] = item['variable'][each]
            if item['variable'][each] > output[each][1]:
                output[each][1] = item['variable'][each]
        except KeyError:
            continue
print(output)
I have a scenario:
I want to connect to my backend APIs by providing the API endpoints as the path.
For example, the APIs look like the following:
/Measure/Test/Calories?q=*
/Measure/Test/Weight
/Food/Test/IntakeAmount/
/v1/Food/Test/Summary
When I provide the absolute path to the API endpoints it works, but when I provide the endpoints via URL template parameters it throws a 404 Not Found error.
Also, when I check the trace, the inbound request cannot find the operation:
> api-inspector (0.008 ms) {
> "configuration": {
> "api": {
> "from": "/testapi",
> "to": {
> "scheme": "http",
> "host": "dev-foodmeasures-summary.com",
> "port": 80,
> "path": "/",
> "queryString": "",
> "query": {},
> "isDefaultPort": true
> },
> "version": null,
> "revision": "1"
> },
> **"operation": "-"**,
> "user": {
> "id": "1",
> "groups": [
> "Administrators",
> "Developers"
> ]
> },
> "product": {
> "id": "unlimited"
> }
> } }
Below is the snapshot of the path parameter.
Thanks!
I want to be able to fetch documents over a certain date range provided by the user and then display the documents. The query I have currently constructed looks like this:
query
{
"aggs": {
"range": {
"date_range": {
"field": "metadata.o2r.temporal.begin",
"ranges": [
{ "from": "2017-03-30T12:35:41.142Z", "to": "2017-08-02T22:00:00.000Z", "key": "quarter_01" }
],
"keyed": true
}
}
}
}
The temporal part of the JSON documents I am trying to fetch looks like the following:
JSON
"temporal": {
"begin": "2017-08-01T22:00:00.000Z",
"end": "2017-03-30T12:35:41.142Z"
},
Currently I can query either "begin" or "end", but I want to modify the query so that begin becomes the value for "from" and end becomes the value for "to". The catch is that I do not want my original JSON to be modified.
Updated Query
curl -XGET 'localhost:9201/test/_search?size=0&pretty' -H 'Content-Type: application/json' -d'
> {
> "query": {
> "bool": {
> "must": [
> {
> "range": {
> "metadata.o2r.temporal.begin": {
> "from": "2016-01-01T12:35:41.142Z"
> }
> }
> } ,
> {
> "range": {
> "metadata.o2r.temporal.end": {
> "to": "2016-12-30T22:00:00.000Z"
> }
> }
> }
> ]
> }
> }
> }
> '
Response
{
"took" : 1678,
"timed_out" : false,
"_shards" : {
"total" : 5,
"successful" : 5,
"failed" : 0
},
"hits" : {
"total" : 1,
"max_score" : 0.0,
"hits" : [ ]
}
}
This might help you:
{
"query": {
"bool": {
"must": [
{
"range": {
"metadata.o2r.temporal.begin": {
"from": "2017-03-30T12:35:41.142Z"
}
}
} ,
{
"range": {
"metadata.o2r.temporal.end": {
"to": "2017-08-02T22:00:00.000Z"
}
}
}
]
}
}
}
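To build that body without touching the original documents, the two range clauses can also be generated programmatically from the user's dates. A minimal Node sketch (the field names come from the question; the helper name is my own invention):

```javascript
// Hypothetical helper: map the user's date range onto temporal.begin ("from")
// and temporal.end ("to") without modifying the stored documents.
function temporalRangeQuery(from, to) {
  return {
    query: {
      bool: {
        must: [
          { range: { 'metadata.o2r.temporal.begin': { from: from } } },
          { range: { 'metadata.o2r.temporal.end': { to: to } } }
        ]
      }
    }
  };
}

const q = temporalRangeQuery('2017-03-30T12:35:41.142Z', '2017-08-02T22:00:00.000Z');
console.log(JSON.stringify(q, null, 2));
```

The resulting JSON can then be posted as the request body, e.g. via curl as in the updated query above.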
When I try to compose a compound bool query that has a fuzzy must requirement and several should requirements, one of them being a wildcard, I run into this error message. So far, no alterations to the syntax have helped me resolve the issue.
The query:
{
"query": {
"bool": {
"must": {
"fuzzy": {
"message": "<fuzzy string>",
"fuzziness": "auto"
}
},
"should": [
{ "query": { "message": "<string>" } },
{ "query": { "message": "<string>" } },
{ "wildcard":
{
"query": { "message": "<partial string*>"}
}
}
],
"minimum_should_match": "50%"
}
}
}
The text inside <> is replaced with my searched string.
You need to replace query with match in your bool/should clause (and note that wildcard, like match, takes the field name directly, without a nested query object):
> { "query": {
> "bool": {
> "must": {
> "fuzzy": {
> "message": "<fuzzy string>",
> "fuzziness": "auto"
> }
> },
> "should": [
> {"match": {"message": "<string>"}}, <-- here
> {"match": {"message": "<string>"}}, <-- and here
> {"wildcard": {"message": "<partial string*>"}}
> ],
> "minimum_should_match": "50%"
> } } }
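For reference, here is the whole corrected body as a plain object (placeholder strings kept as in the question; the fuzzy clause is written in the explicit value/fuzziness form):

```javascript
// Corrected bool query: match for the plain should clauses, wildcard with the
// field name directly, and fuzzy with an explicit value/fuzziness object.
const body = {
  query: {
    bool: {
      must: {
        fuzzy: { message: { value: '<fuzzy string>', fuzziness: 'AUTO' } }
      },
      should: [
        { match: { message: '<string>' } },
        { match: { message: '<string>' } },
        { wildcard: { message: '<partial string*>' } }
      ],
      minimum_should_match: '50%'
    }
  }
};
console.log(JSON.stringify(body, null, 2));
```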
I am having an issue deserializing from a stream in Node (specifically the pricing feed from the Bitcoin GOX exchange). Basically, a chunk arrives that is well-formed, complete, verified JSON. Here is the code:
var gox = require('goxstream');
var fs = require('fs');
var options = {
currency: 'AUD',
ticker: true,
depth: false
};
var goxStream = gox.createStream(options);
goxStream.on('data', function(chunk) {
console.log(JSON.parse(chunk));
});
When trying to parse it, I get the following:
undefined:0
^
SyntaxError: Unexpected end of input
Any ideas? I have included a sample chunk:
> {"channel": "eb6aaa11-99d0-4f64-9e8c-1140872a423d", "channel_name":
> "ticker.BTCAUD", "op": "private", "origin": "broadcast", "private":
> "ticker", "ticker": {
> "high": {
> "value": "121.51941",
> "value_int": "12151941",
> "display": "AU$121.51941",
> "display_short": "AU$121.52",
> "currency": "AUD"
> },
> "low": {
> "value": "118.00001",
> "value_int": "11800001",
> "display": "AU$118.00001",
> "display_short": "AU$118.00",
> "currency": "AUD"
> },
> "avg": {
> "value": "119.58084",
> "value_int": "11958084",
> "display": "AU$119.58084",
> "display_short": "AU$119.58",
> "currency": "AUD"
> },
> "vwap": {
> "value": "119.80280",
> "value_int": "11980280",
> "display": "AU$119.80280",
> "display_short": "AU$119.80",
> "currency": "AUD"
> },
> "vol": {
> "value": "249.73550646",
> "value_int": "24973550646",
> "display": "249.73550646\u00a0BTC",
> "display_short": "249.74\u00a0BTC",
> "currency": "BTC"
> },
> "last_local": {
> "value": "118.50000",
> "value_int": "11850000",
> "display": "AU$118.50000",
> "display_short": "AU$118.50",
> "currency": "AUD"
> },
> "last_orig": {
> "value": "108.99500",
> "value_int": "10899500",
> "display": "$108.99500",
> "display_short": "$109.00",
> "currency": "USD"
> },
> "last_all": {
> "value": "118.79965",
> "value_int": "11879965",
> "display": "AU$118.79965",
> "display_short": "AU$118.80",
> "currency": "AUD"
> },
> "last": {
> "value": "118.50000",
> "value_int": "11850000",
> "display": "AU$118.50000",
> "display_short": "AU$118.50",
> "currency": "AUD"
> },
> "buy": {
> "value": "118.50000",
> "value_int": "11850000",
> "display": "AU$118.50000",
> "display_short": "AU$118.50",
> "currency": "AUD"
> },
> "sell": {
> "value": "119.99939",
> "value_int": "11999939",
> "display": "AU$119.99939",
> "display_short": "AU$120.00",
> "currency": "AUD"
> },
> "item": "BTC",
> "now": "1376715241731341" }}
You can verify it here: http://jsonlint.com
It is probably worth mentioning that I have already tried removing the escaped characters before parsing. I have also tried a couple of different parsers, with the same result.
You are getting the data chunk by chunk, and the chunks themselves may not be complete JSON objects. Either buffer all of the data, use something that does the buffering for you (say, the request module), or, if you need to parse a long stream, take a look at the JSONparse module.
You are getting two separate chunks (or at least: that's what I am getting when re-creating your issue). One (the first) is a valid JSON object, while the other (the second) is "almost empty": it is a 1-byte string containing just an LF (ASCII 0x0a).
The second one fails parsing, of course.
Read my first answer: this is exactly such a case. If you concatenate the two chunks you get a complete JSON object with a trailing LF, which passes JSON.parse() easily. If you try to parse the chunks separately, however, the first one succeeds (a trailing LF is not mandatory) while the second one fails (an LF by itself is not a valid JSON object).
For your case, you would have to:
1) Either assume Mt.Gox always sends data "this way", ignore those "almost empty" chunks, and parse only the "non empty" chunks.
2) Or use JSONparse which parses JSON streams.
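Since the feed appears to be newline-delimited, option 1 can be sketched as buffering incoming chunks and parsing one complete line at a time. The stream is simulated here with an array of chunks that reproduces the situation above (one JSON document split across two chunks, followed by a lone LF):

```javascript
// Simulated chunks: a JSON document split in two, then an "almost empty" chunk.
const chunks = ['{"ticker": {"last": ', '"118.50000"}}\n', '\n'];

let buffer = '';
const messages = [];

// In real code this would be goxStream.on('data', onData).
function onData(chunk) {
  buffer += chunk;
  let idx;
  // Parse every complete line currently in the buffer; skip empty ones.
  while ((idx = buffer.indexOf('\n')) >= 0) {
    const line = buffer.slice(0, idx).trim();
    buffer = buffer.slice(idx + 1);
    if (line) messages.push(JSON.parse(line));
  }
}

chunks.forEach(onData);
console.log(messages);
```

This parses exactly one message from the three chunks: the split document is reassembled, and the bare LF is ignored rather than fed to JSON.parse().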