Select value from array of objects Cosmos DB - json

I have the following JSON:
{
  "car-id": "54-38ncv",
  "cars": [
    {
      "name": "Ferrari",
      "horse-powers": 400
    },
    {
      "name": "BMW",
      "horse-powers": 200
    },
    {
      "name": "Audi",
      "horse-powers": 145
    }
  ]
}
The id is custom-set by me. Imagine that there are hundreds of other documents in my Azure Cosmos DB collection. I want to create a query that selects the first document that has a car in the cars array named e.g. "Ferrari". I know there may be duplicates, but I want the first one. Is there a query for this?

You can do this (note that with a JOIN, Cosmos DB requires an explicit select list such as VALUE c to return the whole document):
SELECT TOP 1 VALUE c
FROM c
JOIN cc IN c.cars
WHERE cc.name = "Ferrari"
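If you don't need the joined array element itself, ARRAY_CONTAINS with its partial-match flag set to true is an alternative; it matches on a subset of each array object's properties:
SELECT TOP 1 VALUE c
FROM c
WHERE ARRAY_CONTAINS(c.cars, {"name": "Ferrari"}, true)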

JSON query matching

For the given input JSON:
{
  "person": {
    "name": "John",
    "age": 25
  },
  "status": {
    "title": "assigned",
    "type": 3
  }
}
I need to build a string query that I can use to answer whether a given JSON matches it or not. For example: the given person's name is "John", his age is in the range 20..30, and his status type is not 4.
I need the query to be presented as a string, plus a commonly known library that can run it, on multiple platforms (iOS, Android, Xamarin). I've tried JSON Path and JSON Schema, but couldn't really figure out whether they can achieve this. JSON Path seems to be aimed at finding a single value in the JSON by a certain condition, and JSON Schema mostly checks the data structure and types.
OK, found the solution. If I format the whole input object as a single-element array like this:
[
  {
    "person": {
      "name": "John",
      "age": 25
    },
    "status": {
      "title": "assigned",
      "type": 3
    }
  }
]
that will allow me to use a JSON Path filter expression (@ is JSON Path's current-node symbol):
$[?(@.person.name == 'John' && @.person.age >= 20 && @.person.age <= 30 && @.status.type != 4)]
Basically, if the object doesn't match, the result is simply empty.
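To actually run it, a commonly known library such as Jayway's JsonPath covers the Java/Android side (Json.NET's SelectTokens understands the same filter syntax for Xamarin). A minimal sketch, assuming the single-element-array input above:
import com.jayway.jsonpath.JsonPath;
import java.util.List;

public class MatchDemo {
    public static void main(String[] args) {
        String json = "[{\"person\":{\"name\":\"John\",\"age\":25},"
                    + "\"status\":{\"title\":\"assigned\",\"type\":3}}]";
        String query = "$[?(@.person.name == 'John' && @.person.age >= 20"
                     + " && @.person.age <= 30 && @.status.type != 4)]";

        // JsonPath.read returns the matching elements; an empty list means no match.
        List<Object> matches = JsonPath.read(json, query);
        System.out.println(matches.isEmpty() ? "no match" : "match");
    }
}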

Extracting multiple values having same path in json using json map in sas

Can anyone help me get multiple values that share the same path in a JSON file, using a JSON map? Any help is appreciated. Thank you.
JSON
{
  "totalCount": 2,
  "facets": {},
  "content": [
    [
      {
        "name": "customer_ID",
        "value": "1"
      },
      {
        "name": "customer_name",
        "value": "John"
      }
    ]
  ]
}
JSON MAP
{
  "DATASETS": [
    {
      "DSNAME": "customers",
      "TABLEPATH": "/root/content",
      "VARIABLES": [
        {
          "NAME": "name",
          "TYPE": "CHARACTER",
          "PATH": "/root/content/name" /* output as customer_ID */
        },
        {
          "NAME": "name",
          "TYPE": "CHARACTER",
          "PATH": "/root/content/name" /* output as customer_name */
        },
        {
          "NAME": "value",
          "TYPE": "CHARACTER",
          "PATH": "/root/content/value" /* output as 1 */
        },
        {
          "NAME": "value",
          "TYPE": "CHARACTER",
          "PATH": "/root/content/value" /* output as John */
        }
      ]
    }
  ]
}
When I use the above JSON map, I get the output for name as only "customer_name", but I need both "customer_ID" and "customer_name" in the output.
Similarly, I need both values of "value".
JSON is a hierarchy of name-value pairs. The JSON engine in SAS takes the "name" and assigns it as a variable name, then populates it with the value. In your JSON there are two sets of name-value pairs: one holding the name of an intended variable, and another holding its value. This is a common output scheme in GraphQL responses, and these require a little manipulation to turn into 2-D data sets.
For your example, you could use PROC TRANSPOSE:
filename test 'customers.json'; /* point this fileref at the JSON shown above */
libname j json fileref=test;

proc transpose data=j.content out=want;
  id name;   /* values of name become the new variable names */
  var value; /* values of value populate those variables */
run;
Output:

Obs    _NAME_    customer_ID    customer_name
  1    value          1             John
You can also do more manual seek-and-assign work by using a DATA step to process what you see in the ALLDATA member of the JSON libname. In your example, SAS sees that as:
Obs  P  P1          P2     V  Value
  1  1  totalCount         1  2
  2  1  facets             0
  3  1  content            0
  4  1  content            0
  5  2  content     name   1  customer_ID
  6  2  content     value  1  1
  7  1  content            0
  8  2  content     name   1  customer_name
  9  2  content     value  1  John
Processing the ALLDATA member is not as friendly as using the relational data that the JSON engine can create, but I find with GraphQL responses that's what you need to do to get more control over the name, length, and type/format for output variables.
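As a rough illustration, a DATA step over the ALLDATA member could pair the name rows with the value rows. This is a sketch only; it assumes, as in the listing above, that each name row is immediately followed by its matching value row:
data pairs;
  set j.alldata;
  where P = 2 and V = 1;               /* keep only leaf rows that carry a value */
  length varname $ 32;
  retain varname;
  if P2 = 'name' then varname = Value; /* remember the intended variable name */
  else if P2 = 'value' then output;    /* emit one observation per name/value pair */
  keep varname Value;
run;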

How to retrieve multiple data from jsonb column in postgres?

In a PostgreSQL database I have a jsonb column called json. The data inside looks like this:
{
  "Version": "0.0.0.1",
  "Items": [
    {
      "Id": "40000000-0000-0000-0000-000000141146",
      "Name": "apple",
      "Score": 64,
      "Value": 1430000
    },
    {
      "Id": "40000000-0000-0000-0000-000000141147",
      "Name": "grapefruit",
      "Score": 58,
      "Value": 1190000
    },
    {
      "Id": "40000000-0000-0000-0000-000000141148",
      "Name": "mango",
      "Score": 41,
      "Value": 170000
    }
  ]
}
What I would like to do is retrieve all the Score values from the Items elements.
I was trying to use this SQL code:
select
substring(json ->> 'Items' from '"Score": (\d*),') as score
from vegetables;
However, that returns just the score from the first element instead of all 3. I tried the 'g' flag, which is supposed to find all matches globally, but the code did not work.
Could anyone advise how to do this properly? Thanks in advance!
Considering that the data type of the json field is jsonb, there is no need to use substring or a regex; a simple lateral join with jsonb_array_elements will do what you need. Try the query below.
select x->>'Score' as "Score"
from vegetables
cross join lateral jsonb_array_elements(json->'Items') x;
DEMO
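Note that the ->> operator returns text; if you want the scores as numbers (and, say, the names alongside), a cast does it. A small variant of the same query, assuming the table and column names above:
select x->>'Name' as name,
       (x->>'Score')::int as score
from vegetables
cross join lateral jsonb_array_elements(json->'Items') x;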

Pentaho Kettle: How to dynamically fetch JSON file columns

Background: I work for a company that basically sells passes. Every order placed by a customer will contain N passes.
Issue: I have JSON event-transaction files coming into an S3 bucket on a daily basis from DocumentDB (MongoDB). Each JSON file is associated with the relevant type of event (insert, modify or delete) for every document key (an order, in my case). The example below illustrates an "insert" event that came through to the S3 bucket:
{
  "_id": {
    "_data": "11111111111111"
  },
  "operationType": "insert",
  "clusterTime": {
    "$timestamp": {
      "t": 11111111,
      "i": 1
    }
  },
  "ns": {
    "db": "abc",
    "coll": "abc"
  },
  "documentKey": {
    "_id": {
      "$uuid": "abcabcabcabcabcabc"
    }
  },
  "fullDocument": {
    "_id": {
      "$uuid": "abcabcabcabcabcabc"
    },
    "orderNumber": "1234567",
    "externalOrderId": "12345678",
    "orderDateTime": "2020-09-11T08:06:26Z[UTC]",
    "attraction": "abc",
    "entryDate": {
      "$date": "2020-09-13"
    },
    "entryTime": {
      "$date": "04000000"
    },
    "requestId": "abc",
    "ticketUrl": "abc",
    "tickets": [
      {
        "passId": "1111111",
        "externalTicketId": "1234567"
      },
      {
        "passId": "222222222",
        "externalTicketId": "122442492"
      }
    ],
    "_class": "abc"
  }
}
As we see above, every JSON file might contain N passes, and every pass is, in turn, associated with an external ticket id, which is a different column. I want to use Pentaho Kettle to read these JSON files and load the data into the DW. I am aware of the Json input step and the Row Normalizer step, which could transpose the "PassID 1", "PassID 2", "PassID 3"..."PassID N" columns into one "Pass" column, and I would have to apply similar logic to the other column, "External ticket id". The problem with that approach is that it is quite static: I need to "tell" Pentaho in the Json input step, in advance, how many passes are coming. But what if tomorrow I have an order with 10 different passes? How can I do this dynamically to ensure the job will not break?
If you want a tabular output like

TicketUrl   Pass            ExternalTicketID
---------   -------------   ----------------
abc         PassID1Value1   ExTicketIDvalue1
abc         PassID1Value2   ExTicketIDvalue2
abc         PassID1Value3   ExTicketIDvalue3
and want to make the incoming values dynamic based on the JSON input file, then you can download this transformation: Updated Link
I found everything works dynamically in the Json input step.
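For reference, the key point is that the Json input step takes JSONPath expressions, and a [*] wildcard yields one output row per array element, so you never have to state the pass count up front. A field configuration along these lines should work (paths assume the document structure shown in the question):

Field name          Path
-----------------   -------------------------------------------
ticketUrl           $.fullDocument.ticketUrl
passId              $.fullDocument.tickets[*].passId
externalTicketId    $.fullDocument.tickets[*].externalTicketId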

Obtain a different JSON object structure in AngularJS

I'm working on AngularJS.
In this part of the project my goal is to obtain a JSON structure after filling a form with some particular values.
Here's the fiddle of my simple form: Fiddle
With the form I will query KairosDB, my NoSQL database; I query data from it with a JSON object. The form is structured in this way:
a Name
a certain Number of Tags, with Tag Id ("ch" for example) and tag value ("932" for example)
a certain Number of Aggregators to manipulate data coming from DB
Start Timestamp and End Timestamp (now they are static and only included in the final JSON Object)
After filling this form, with my code I'll obtain for example this JSON object:
{
  "metrics": [
    {
      "tags": [
        {
          "id": "ch",
          "value": "932"
        },
        {
          "id": "ch",
          "value": "931"
        }
      ],
      "aggregators": {
        "name": "sum",
        "sampling": [
          {
            "value": "1",
            "unit": "milliseconds",
            "type": "SUM"
          }
        ]
      }
    }
  ],
  "cache_time": 0,
  "start_absolute": 123,
  "end_absolute": 1234
}
Unfortunately, KairosDB accepts a different structure: as you can see below, the tag id "ch" is not preceded by an "id" key, and tag values coming from the same tag id are grouped together.
{
  "metrics": [
    {
      "tags": {
        "ch": [
          "932",
          "931"
        ]
      },
      "name": "AIENR",
      "aggregators": [
        {
          "name": "sum",
          "sampling": {
            "value": "1",
            "unit": "milliseconds"
          }
        }
      ]
    }
  ],
  "cache_time": 0,
  "start_absolute": 1367359200000,
  "end_absolute": 1386025200000
}
My question is: is there a way to obtain a JSON structure like the one accepted by KairosDB with an AngularJS form? Thanks to everyone.
I've seen this topic as the one most similar to mine, but it isn't in AngularJS.
Personally, I'd do the refactoring work in the backend: have whatever server interface sends and receives the data do the manipulation. Otherwise you'll end up needing to refactor your data inside Angular anywhere you want to use that dataset.
Doing it in the backend puts it in a single access point.
Of course, you could do it in Angular: just replace userString in the submitData method with a copy of the array, replace the tags section with data in the new format, and likewise refactor the returned result back when you get a reply.
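For completeness, a minimal sketch of that reshaping in plain JavaScript (buildKairosQuery is a hypothetical helper name; it assumes the form model has the shape of the first JSON above, with the metric name supplied separately by the form):

// Convert the form's output into the structure KairosDB expects:
// group tag values by tag id and wrap the aggregator in an array.
function buildKairosQuery(formData, metricName) {
  var metric = formData.metrics[0];

  // [{id: "ch", value: "932"}, ...]  ->  {ch: ["932", ...]}
  var tags = {};
  metric.tags.forEach(function (t) {
    (tags[t.id] = tags[t.id] || []).push(t.value);
  });

  var sampling = metric.aggregators.sampling[0];
  return {
    metrics: [{
      name: metricName, // e.g. "AIENR", taken from the form
      tags: tags,
      aggregators: [{
        name: metric.aggregators.name,
        sampling: { value: sampling.value, unit: sampling.unit } // "type" is dropped
      }]
    }],
    cache_time: formData.cache_time,
    start_absolute: formData.start_absolute,
    end_absolute: formData.end_absolute
  };
}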