Proper Mapping for dynamic fields - JSON

I have the following document structure:
{
  "some_field": "some_data",
  "entries": {
    {"id": "some_id", "type": "some_type", "value": "some_value"},
    {"id": "another_id", "type": "another_type", "value": {"foo": 1, "bar": "two"}}
  }
}
So I would like to map entries based on the "type" field.
Which mapping type or flag should I use?
Or do I need to change my document structure?

Could you use this one?
{
  "some_field": "some_data",
  "entries": [
    {
      "id": "some_id",
      "type": "some_type",
      "value": "some_value"
    },
    {
      "id": "another_id",
      "type": "another_type",
      "value": {
        "foo": 1,
        "bar": "two"
      }
    }
  ]
}
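If you go with the array form, a nested mapping is one way to keep each entry's id/type/value paired together so a search on type and value matches within the same entry. A minimal sketch (the index and type names are placeholders; note that value itself must still map to one consistent type, so mixing a plain string with an object as in the example will cause a mapping conflict unless you normalize it, e.g. always wrap it in an object):

PUT my_index
{
  "mappings": {
    "my_type": {
      "properties": {
        "some_field": { "type": "string" },
        "entries": {
          "type": "nested",
          "properties": {
            "id": { "type": "string" },
            "type": { "type": "string" }
          }
        }
      }
    }
  }
}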

Change subelement with jq

I have a structure that looks like this:
[
  [
    { "ID": "grp1-001" },
    { "ID": "grp1-002" },
    { "ID": "grp1-003" },
    { "ID": "grp1-004" },
    { "ID": "grp1-005" },
    { "ID": "grp1-006" }
  ],
  [
    { "ID": "grp2-001" },
    { "ID": "grp2-002" },
    { "ID": "grp2-003" },
    { "ID": "grp2-004" },
    { "ID": "grp2-005" },
    { "ID": "grp2-006" }
  ]
.......
What I need to get as a result of the modification is this:
[
  [
    ["1", "grp1-001"],
    ["2", "grp1-002"],
    ["3", "grp1-003"],
    ["4", "grp1-004"],
    ["5", "grp1-005"],
    ["6", "grp1-006"]
  ],
  [
    ["1", "grp2-001"],
    ["2", "grp2-002"],
    ["3", "grp2-003"],
    ["4", "grp2-004"],
    ["5", "grp2-005"],
    ["6", "grp2-006"]
  ],
Which means I need to keep the external structure (the outer array and the internal grouping) but convert each inner dict to an array and replace the "ID" key with a value (that will come from an external source like --argjson). I am not even sure how to start - any ideas/resources are highly appreciated.
Assuming you're just taking the objects and transforming them to pairs of the index in the array and the ID value, you could do this:
map([to_entries[] | [.key + 1, .value.ID | tostring]])
https://jqplay.org/s/RBac7SPfdG
Using to_entries/0 on an array gives you an array of key/value (index/value) pairs. You could then shift the indices by 1 and convert to strings.
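For example, run with jq's compact output flag on a two-element array, to_entries yields index/value pairs like this:

$ echo '[{"ID":"grp1-001"},{"ID":"grp1-002"}]' | jq -c 'to_entries'
[{"key":0,"value":{"ID":"grp1-001"}},{"key":1,"value":{"ID":"grp1-002"}}]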

Query Nested MongoDB info with Variable Nested Field names

I have a MongoDB collection that is structured as below:
[
  {
    "subject_id": "1",
    "name": "Maria",
    "dob": "1/1/00",
    "gender": "F",
    "visits": {
      "1/1/18": {
        "date_entered": "1/2/18",
        "entered_by": "Sally"
      },
      "1/2/18": {
        "date_entered": "1/2/18",
        "entered_by": "Tim"
      }
    },
    "samples": {
      "XXX123": {
        "collected_by": "Sally",
        "collection_date": "1/3/18"
      }
    }
  },
  {
    "subject_id": "2",
    "name": "Bob",
    "dob": "1/2/00",
    "gender": "M",
    "visits": {
      "1/3/18": {
        "date_entered": "1/4/18",
        "entered_by": "Tim"
      }
    },
    "samples": {
      "YYY456": {
        "collected_by": "Sally",
        "collection_date": "1/5/18"
      },
      "ZZZ789": {
        "collected_by": "Tim",
        "collection_date": "1/6/18"
      },
      "AAA123": {
        "collected_by": "Sally",
        "collection_date": "1/7/18"
      }
    }
  }
]
If I wanted to query the database to find all samples collected by Sally or all visits entered by Tim, what would be the best way of doing that?
I'm new to MongoDB and my attempts with various regexes haven't produced results. Any advice would be greatly appreciated.
I first used $project with $objectToArray on the required fields, then $unwind to create a separate record for each element of the arrays produced in $project.
The results are then filtered using $match.
This works for the data provided in the question:
db.so.aggregate([
  { $project: { visits: { $objectToArray: "$visits" }, samples: { $objectToArray: "$samples" } } },
  { $unwind: "$visits" },
  { $unwind: "$samples" },
  { $match: {
      $or: [
        { "visits.v.entered_by": "Tim" },
        { "samples.v.collected_by": "Sally" }
      ]
    }
  }
])
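The $objectToArray stage is what makes the variable field names queryable: it converts each sub-document into an array of { k, v } pairs, so the date keys and sample IDs become ordinary values that $unwind and the "visits.v"/"samples.v" paths in $match can reach. For the first document above, the projected visits field would contain entries such as:

{ "k": "1/1/18", "v": { "date_entered": "1/2/18", "entered_by": "Sally" } }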

Sending the JSON response as array or normal object

I am implementing a RESTful service where I am getting the PDF names and their IDs from the database in JSON format. Which of the two options below is the more convenient JSON RESTful service response?
First Option:
{
  "results": {
    "documentNames": [
      "test.pdf",
      "ireport-ultimate-guide.pdf",
      "sending report.pdf",
      "Motor Hour.pdf"
    ],
    "documentIds": [
      21116,
      21117,
      21118,
      21119
    ]
  }
}
Second Option:
{
  "results": {
    "21116": "test.pdf",
    "21117": "ireport-ultimate-guide.pdf",
    "21118": "sending report.pdf",
    "21119": "Motor Hour.pdf"
  }
}
I would use this "third option": the result is a list of objects.
{
  "result": [
    {
      "id": "21116",
      "filename": "test.pdf"
    },
    {
      "id": "21117",
      "filename": "ireport-ultimate-guide.pdf"
    },
    {
      "id": "21118",
      "filename": "sending report.pdf"
    },
    {
      "id": "21119",
      "filename": "Motor Hour.pdf"
    }
  ]
}
because it better models the object structure.
I would create an entity for each document that contains both name and id:
[
{"name": "doc_1", "id": 123},
{"name": "doc_2", "id": 456}
]

How to append a JSON object to an already created object in a MySQL JSON document

My object is:
{
  "name": "Testing",
  "id": "hcig_3fe7cb00-e936-11e6-af69-a748c8cc89ad",
  "belongsTo": {
    "id": "69616d26-c3bb-405c-8c84-c51c091524b2",
    "name": "test"
  },
  "locatedAt": {
    "id": "49616d26-c3bb-405c-8c84-c51c091524b2",
    "name": "Test"
  }
}
I want to merge in one more object, like:
"obj": [{
  "a": 123
}]
With the help of JSON_MERGE in the MySQL document store I am able to add the object.
But it ends up looking like this:
{
  "name": "Tester",
  "id": "hcig_3fe7cb00-e936-11e6-af69-a748c8cc89ad",
  "belongsTo": {
    "id": "69616d26-c3bb-405c-8c84-c51c091524b2",
    "name": "test"
  },
  "locatedAt": {
    "id": "49616d26-c3bb-405c-8c84-c51c091524b2",
    "name": "Test"
  },
  {
    "obj": [{
      "a": 123
    }]
  }
}
I want my object to be as
{
"name": "Tester",
"id": "hcig_3fe7cb00-e936-11e6-af69-a748c8cc89ad",
"belongsTo": {
"id": "69616d26-c3bb-405c-8c84-c51c091524b2",
"name": "test"
},
"locatedAt": {
"id": "49616d26-c3bb-405c-8c84-c51c091524b2",
"name": "Test"
},
"obj": [{
"a": 123
}]}
Any idea how to add the object in the above manner using JSON functions in MySQL?
Use lodash for a recursive deep merge - https://lodash.com/
lodash.merge(targetObj, sourceObj);
Or if you have programmatic access:
targetObj.obj = sourceObj;
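If you would rather do this inside MySQL itself, a sketch using its JSON functions could look like the following (assuming the document is stored in a JSON column named doc in a table named docs; both names are placeholders):

-- JSON_SET writes "obj" at the top level of the document,
-- avoiding the extra nesting that JSON_MERGE produces
UPDATE docs
SET doc = JSON_SET(doc, '$.obj', JSON_ARRAY(JSON_OBJECT('a', 123)))
WHERE doc->>'$.id' = 'hcig_3fe7cb00-e936-11e6-af69-a748c8cc89ad';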

Elasticsearch: how to search a user_defined field?

I use MySQL to store user data and search the data with Elasticsearch. In MySQL I have a user-defined field that can store data in JSON format.
Example data looks like this:
data1 =
{
  "name": "test1",
  "age": 10,
  "user_defined": {
    "a": "aaa",
    "b": "bbb",
    "c": "ccc",
    .....
  }
}
data2 =
{
  "name": "test2",
  "age": 20,
  "user_defined": {
    "d": "ddd",
    "e": "eee",
    "f": "fff",
    .....
  }
}
For the user_defined field, the number of keys is not fixed and the values are all strings. I want each key to be searchable. How should I define the mapping, and how can I search this kind of data with Elasticsearch?
Does anyone have a good idea?
You can define the mapping of the user_defined as "type": "object", like this:
PUT your_index
{
  "mappings": {
    "your_type": {
      "properties": {
        "name": {
          "type": "string"
        },
        "age": {
          "type": "integer"
        },
        "user_defined": {
          "type": "object"
        }
      }
    }
  }
}
Thereafter, you can index your documents and search them easily with the Query DSL:
POST your_index/_search
{
  "query": {
    "match": {
      "user_defined.a": "aaa"
    }
  }
}
Every field in a document in Elasticsearch is indexed and searchable by default.
Elasticsearch provides a Search Lite API and a Query DSL for searching.
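The Search Lite API is the query-string form of the same search, for example:

GET your_index/_search?q=user_defined.a:aaa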