I have a JSON file that I need to reformat: wrap some of the nested objects in arrays.
It looks like this:
{
"test": {
"UserData": "test",
"password": "123"
},
"test2": {
"UserData": "test2",
"password": "123"
},
"test3": {
"UserData": "test3",
"password": "123"
}
}
And I need it to look like this:
{
"test": [{
"UserData": "test",
"password": "123"
}],
"test2": [{
"UserData": "test2",
"password": "123"
}],
"test3": [{
"UserData": "test3",
"password": "123"
}]
}
I'm using Ansible, jq and bash.
Given the data
d1:
test:
UserData: test
password: '123'
test2:
UserData: test2
password: '123'
test3:
UserData: test3
password: '123'
Convert the dictionaries to the first items in the lists. For example
d2_query: '[].[key, [value]]'
d2: "{{ dict(d1|dict2items|json_query(d2_query)) }}"
gives
d2:
test:
- UserData: test
password: '123'
test2:
- UserData: test2
password: '123'
test3:
- UserData: test3
password: '123'
Example of a complete playbook for testing
- hosts: localhost
vars:
d1:
test:
UserData: test
password: '123'
test2:
UserData: test2
password: '123'
test3:
UserData: test3
password: '123'
d2_query: '[].[key, [value]]'
d2: "{{ dict(d1|dict2items|json_query(d2_query)) }}"
tasks:
- debug:
var: d2
Using jq: select each value of the object with .[], and update it (|=) to itself wrapped in an array ([.]).
jq '.[] |= [.]' file.json
{
"test": [
{
"UserData": "test",
"password": "123"
}
],
"test2": [
{
"UserData": "test2",
"password": "123"
}
],
"test3": [
{
"UserData": "test3",
"password": "123"
}
]
}
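jq writes the result to stdout, so to update the file in place you can redirect to a temporary file first (a minimal bash sketch, assuming the file is named file.json):
jq '.[] |= [.]' file.json > file.json.tmp && mv file.json.tmp file.json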
You could use this playbook (using dict2items and combine):
- name: "tips4"
hosts: localhost
vars:
json1:
test:
UserData: test
password: '123'
test2:
UserData: test2
password: '123'
test3:
UserData: test3
password: '123'
tasks:
- name: set var
set_fact:
newjson: "{{newjson|d({})|combine({item.key:[item.value]}) }}"
loop: "{{ json1|dict2items }}"
- name: disp result
debug:
msg: "{{ newjson }}"
result:
ok: [localhost] => {
"msg": {
"test": [
{
"UserData": "test",
"password": "123"
}
],
"test2": [
{
"UserData": "test2",
"password": "123"
}
],
"test3": [
{
"UserData": "test3",
"password": "123"
}
]
}
}
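For reference, the same wrapping can also be done in a single expression, without a loop (a sketch, assuming the json_query filter and the jmespath Python library are available; items2dict needs Ansible 2.7+):
    - name: set var in one expression
      set_fact:
        newjson: "{{ json1 | dict2items | json_query('[].{key: key, value: [value]}') | items2dict }}"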
Here is my JSON
[
{
"?xml": {
"attributes": {
"encoding": "UTF_8",
"version": "1.0"
}
}
},
{
"jdbc_data_source": [
{
"attributes": {
"xmlns": "http://xmlns.oracle.com/weblogic/jdbc_data_source"
}
},
{
"name": "canwebds"
},
{
"jdbc_driver_params": [
{
"url": "jdbc:oracle:thin:#//myhost.mrshmc.com:1521/OLTT206"
},
{
"driver_name": "oracle.jdbc.OracleDriver"
},
{
"properties": {
"property": [
{
"name": "user"
},
{
"value": "WEB_USER"
}
]
}
},
{
"password_encrypted": "{AES}BcqmURyYoCkLvC5MmREXsfpRMO93KPIubqUAbb95+nE="
}
]
},
{
"jdbc_connection_pool_params": [
{
"initial_capacity": "1"
},
{
"statement_cache_type": "LRU"
}
]
},
{
"jdbc_data_source_params": {
"jndi_name": "canwebds"
}
}
]
},
{
"?xml": {
"attributes": {
"encoding": "UTF_8",
"version": "1.0"
}
}
},
{
"jdbc_data_source": [
{
"attributes": {
"xmlns": "http://xmlns.oracle.com/weblogic/jdbc_data_source"
}
},
{
"name": "dsARSVelocity"
},
{
"jdbc_driver_params": [
{
"url": "jdbc:oracle:thin:#myhost:1521:DB01"
},
{
"driver_name": "oracle.jdbc.OracleDriver"
},
{
"properties": {
"property": [
{
"name": "user"
},
{
"value": "AP05"
}
]
}
},
{
"password_encrypted": "{AES}wP5Se+OQdR21hKiC2fDw1WPEaTMU5Sc17Ax0+rmjmPI="
}
]
},
{
"jdbc_connection_pool_params": [
{
"initial_capacity": "1"
},
{
"statement_cache_type": "LRU"
}
]
},
{
"jdbc_data_source_params": [
{
"jndi_name": "dsARSVel"
},
{
"global_transactions_protocol": "OnePhaseCommit"
}
]
}
]
}
]
I need to print the following for every jdbc_data_source found.
Expected output:
jdbc_data_source name is <name> has username <username> and jndi name <jndi_name>
which would translate to, for example:
jdbc_data_source name is cwds has username CAN_USER and jndi name cwdsjndi
Below is something I tried, but it does not work:
- name: create YML for server name with DB
debug:
msg: "{{ dsname.0.name }} has jndi {{ dsurl[0]['jdbc_driver_params'][2]['properties][0]['property'][1]['value'] }}"
loop: "{{ jsondata[1] }}"
vars:
dsname: "{{ item.jdbc_data_source| selectattr('name', 'defined') | list }}"
dsurl: "{{ item.jdbc_data_source| selectattr('jdbc_driver_params', 'defined') | list }}"
However, it does not get me the desired output. Below is the error I get:
fatal: [localhost]: FAILED! => {"msg": "Invalid data passed to 'loop',
it requires a list, got this instead: {'jdbc_data_source':
[{'attributes': {'xmlns':
'http://xmlns.oracle.com/weblogic/jdbc_data_source', 'xml
If I use loop: "{{ jsondata }}" instead, the task runs, but the desired values still do not get printed.
Kindly suggest.
This playbook does the job:
- hosts: localhost
gather_facts: no
vars:
json: "{{ lookup('file', 'file.json') | from_json }}"
tasks:
- name: display datas
debug:
msg: "jdbc_data_source name is {{ name }} has username: {{ user }} and jndi name: {{ jndiname }}"
loop: "{{ json }}"
when: item.jdbc_data_source is defined
vars:
datasource1: "{{ item.jdbc_data_source | selectattr('jdbc_driver_params', 'defined') }}"
properties: "{{ (datasource1.0.jdbc_driver_params | selectattr('properties', 'defined')).0.properties }}"
name: "{{ (item.jdbc_data_source | selectattr('jdbc_data_source_params', 'defined')).0.jdbc_data_source_params.jndi_name }}"
user: "{{ (properties.property | selectattr('value', 'defined')).0.value }}"
jndiname: "{{ (item.jdbc_data_source | selectattr('name', 'defined') ).0.name}}"
result:
skipping: [localhost] => (item={'?xml': {'attributes': {'encoding': 'UTF_8', 'version': '1.0'}}})
ok: [localhost] => {
    "msg": "jdbc_data_source name is cwds has username: CAN_USER and jndi name: cwdsjndi"
}
skipping: [localhost] => (item={'?xml': {'attributes': {'encoding': 'UTF_8', 'version': '1.0'}}})
ok: [localhost] => {
    "msg": "jdbc_data_source name is dsvelcw has username: WEB_USER and jndi name: dsvelcw"
}
If you have a mix of lists and dictionaries, change the vars to:
vars:
datasource1: "{{ item.jdbc_data_source | selectattr('jdbc_driver_params', 'defined') }}"
properties: "{{ (datasource1.0.jdbc_driver_params | selectattr('properties', 'defined')).0.properties }}"
params: "{{ (item.jdbc_data_source | selectattr('jdbc_data_source_params', 'defined')).0.jdbc_data_source_params }}"
name: "{{ params.jndi_name if params is mapping else (params | selectattr('jndi_name', 'defined')).0.jndi_name }}"
user: "{{ (properties.property | selectattr('value', 'defined')).0.value }}"
jndiname: "{{ (item.jdbc_data_source | selectattr('name', 'defined') ).0.name}}"
I test whether it is a dict; otherwise it is treated as a list.
Result with the new JSON:
skipping: [localhost] => (item={'?xml': {'attributes': {'encoding': 'UTF_8', 'version': '1.0'}}})
ok: [localhost] => {
    "msg": "jdbc_data_source name is canwebds has username: WEB_USER and jndi name: canwebds"
}
skipping: [localhost] => (item={'?xml': {'attributes': {'encoding': 'UTF_8', 'version': '1.0'}}})
ok: [localhost] => {
    "msg": "jdbc_data_source name is dsARSVel has username: AP05 and jndi name: dsARSVelocity"
}
Some explanations:
(item.jdbc_data_source | selectattr('jdbc_data_source_params', 'defined')) creates a list of all the elements of jdbc_data_source that have the key jdbc_data_source_params; here there is only one such element, so .0.jdbc_data_source_params selects it.
Then we check whether params is a dict; otherwise it is a list.
If you want to understand how it works, I suggest you decompose it into many small tasks and display each intermediate result with debug, as in the sketch below.
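For example, a minimal decomposition sketch (reusing the same json variable and loop as in the playbook above):
    - name: inspect one intermediate expression
      debug:
        msg: "{{ item.jdbc_data_source | selectattr('jdbc_driver_params', 'defined') | list }}"
      loop: "{{ json }}"
      when: item.jdbc_data_source is defined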
FYI, the selectattr equivalent with json_query:
selectattr('something', 'defined') = json_query("[?something]")
I am trying to loop over the following variable structure using Ansible, and what I want is to loop over each subnet.
For subnets-1 I want to get network, cidr and vlan, and similarly loop over the other keys subnets-2 and subnets-3.
flavor.json
{
"subnets-1": [
{
"network": "test1",
"cidr": 21,
"vlan": 123
},
{
"network": "test2",
"cidr": 22,
"vlan": 234
}
],
"subnets-2": [
{
"network": "test3",
"cidr": 43,
"vlan": 879
},
{
"network": "test4",
"cidr": 21,
"vlan": "12fsd"
},
{
"network": "test5",
"cidr": "22sdf",
"vlan": "234sdfd"
}
],
"subnets-3": [
{
"network": "test44",
"cidr": "fg",
"vlan": "dsfsd"
}
]
}
I have tried something like the below, using with_dict.
playbook.yml (ansible 2.9.7)
---
- hosts: local
gather_facts: no
tasks:
- name: lookup yaml
set_fact:
flavors: "{{ lookup('file','flavor.json') | from_json }}"
- name: json out
debug:
msg: "{{ flavors }}"
- name: check keys
shell: |
echo "s_name = {{ item.network }}"
echo "s_cidr = {{ item.cidr }}"
echo "s_vlan = {{ item.vlan }}"
with_dict:
"{{ flavors }}"
This is the error I get when the playbook is executed:
fatal: [192.168.110.128]: FAILED! => {
"msg": "The task includes an option with an undefined variable. The error was: 'dict object' has no attribute 'network'\n\nThe error appears to be in '/root/fla.yml': line 19, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n - name: check keys\n ^ here\n"
}
Use include_vars. It's simpler than lookup with from_json, e.g.
- include_vars:
file: flavor.json
name: flavors
gives
flavors:
subnets-1:
- {cidr: 21, network: test1, vlan: 123}
- {cidr: 22, network: test2, vlan: 234}
subnets-2:
- {cidr: 43, network: test3, vlan: 879}
- {cidr: 21, network: test4, vlan: 12fsd}
- {cidr: 22sdf, network: test5, vlan: 234sdfd}
subnets-3:
- {cidr: fg, network: test44, vlan: dsfsd}
Then convert the dictionary to a list and iterate with with_subelements, e.g.
- debug:
msg: "{{ item.0.key }}
{{ item.1.network }}
{{ item.1.cidr }}
{{ item.1.vlan }}"
with_subelements:
- "{{ flavors|dict2items }}"
- value
gives
msg: subnets-1 test1 21 123
msg: subnets-1 test2 22 234
msg: subnets-2 test3 43 879
msg: subnets-2 test4 21 12fsd
msg: subnets-2 test5 22sdf 234sdfd
msg: subnets-3 test44 fg dsfsd
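With a recent Ansible (2.7 or later), the same iteration can also be written with loop and the subelements filter (a sketch equivalent to the with_subelements task above):
    - debug:
        msg: "{{ item.0.key }} {{ item.1.network }} {{ item.1.cidr }} {{ item.1.vlan }}"
      loop: "{{ flavors | dict2items | subelements('value') }}"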
Ansible does not do nested loops well. In this case, you know you have a maximum of three array elements per key, so you can loop like this:
- name: loop over flavors
debug:
msg: "{{ flavors[item.0][item.1] }}"
with_nested:
- "{{ flavors }}"
- [ 0, 1, 2 ]
when: flavors[item.0][item.1] is defined
Which results in something like this:
TASK [loop over flavors] *********************************************************************************************
ok: [localhost] => (item=['subnets-1', 0]) => {
"msg": {
"cidr": 21,
"network": "test1",
"vlan": 123
}
}
ok: [localhost] => (item=['subnets-1', 1]) => {
"msg": {
"cidr": 22,
"network": "test2",
"vlan": 234
}
}
skipping: [localhost] => (item=['subnets-1', 2])
ok: [localhost] => (item=['subnets-2', 0]) => {
"msg": {
"cidr": 43,
"network": "test3",
"vlan": 879
}
}
ok: [localhost] => (item=['subnets-2', 1]) => {
"msg": {
"cidr": 21,
"network": "test4",
"vlan": "12fsd"
}
}
ok: [localhost] => (item=['subnets-2', 2]) => {
"msg": {
"cidr": "22sdf",
"network": "test5",
"vlan": "234sdfd"
}
}
ok: [localhost] => (item=['subnets-3', 0]) => {
"msg": {
"cidr": "fg",
"network": "test44",
"vlan": "dsfsd"
}
}
skipping: [localhost] => (item=['subnets-3', 1])
skipping: [localhost] => (item=['subnets-3', 2])
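If you would rather not hardcode the maximum number of elements, you can also flatten the values and loop over every subnet directly (a sketch; note that the subnets-N key is not available inside this loop):
    - name: loop over all subnets
      debug:
        msg: "{{ item.network }} {{ item.cidr }} {{ item.vlan }}"
      loop: "{{ flavors.values() | list | flatten(levels=1) }}"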
In my React app, I am converting a CSV file to a JSON object so that I can show it in a table in the UI. But I want to group the JSON data based on a certain field, "IP Address". All the areas should be grouped into an "AreaList" field, which is an array built from the original "area" field.
Here is my source code where I am parsing the CSV to JSON:
React.useEffect(() => {
Papa.parse(csvFile, {
download: true,
header: true,
skipEmptyLines: true,
complete: data => {
console.log(data.data);
}
});
}, []);
Following is the console log output of the above code:
[
{
IP Address: "192.168.0.1:61000",
area: "JA1_JA2", job: "test",
flow: "PartsServe",
Component: "1",
…
},
{
IP Address: "192.168.0.1:61000",
area: "JA1_JA2",
job: "test1",
flow: "PartsServe",
Component: "1",
…
},
{
IP Address: "192.168.0.1:63000",
area: "JA1_JA2",
job: "test",
flow: "PartsServe",
Component: "1",
…
},
{
IP Address: "192.168.0.1:63000",
area: "JA1_JA3",
job: "test2",
flow: "PartsServe",
Component: "1",
…
}
]
I want to group the JSON based on the IP Address field. The resulting JSON should look like this:
projects: [
{
IP Address: "192.168.0.1:61000",
AreaList: [
{
area: "JA1_JA2",
jobList: [
{job: "test", flow: "PartsServe", Component: "1"},
{job: "test1", flow: "PartsServe", Component: "1"}
]
},
]
},
{
IP Address: "192.168.0.1:63000",
AreaList: [
{
area: "JA1_JA2",
jobList: [
{job: "test", flow: "PartsServe", Component: "1"},
{job: "test1", flow: "PartsServe", Component: "1"}
]
},
{
area: "JA1_JA3",
jobList: [
{job: "test", flow: "PartsServe", Component: "1"},
{job: "test2", flow: "PartsServe", Component: "1"}
]
},
]
}
]
Here is the logic I came up with. It may not be the most optimised way, but I wrote it in a hurry.
const data = [
{
"IP Address": "192.168.0.1:61000",
area: "JA1_JA2", job: "test",
flow: "PartsServe",
Component: "1",
},
{
"IP Address": "192.168.0.1:61000",
area: "JA1_JA2",
job: "test1",
flow: "PartsServe",
Component: "1",
},
{
"IP Address": "192.168.0.1:63000",
area: "JA1_JA2",
job: "test",
flow: "PartsServe",
Component: "1",
},
{
"IP Address": "192.168.0.1:63000",
area: "JA1_JA3",
job: "test2",
flow: "PartsServe",
Component: "1",
}
];
const uniqueIPs = Array.from(new Set(data.map((item) => item["IP Address"])));
// one entry per unique IP, each with one AreaList entry per unique area
const projects = uniqueIPs.map((ip) => {
    const currentIpData = data.filter((item) => item["IP Address"] === ip);
    const uniqueAreas = Array.from(new Set(currentIpData.map((item) => item.area)));
    return ({
        "IP Address": ip,
        AreaList: uniqueAreas.map((area) => {
            const currentJobData = currentIpData.filter((item) => item.area === area);
            return ({
                area: area,
                jobList: currentJobData.map((item) => ({
                    job: item.job,
                    flow: item.flow,
                    Component: item.Component
                }))
            });
        })
    });
});
console.log(projects);
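For reference, the same grouping can also be done in a single pass with reduce (a sketch; the field names are taken from the sample data above):
const grouped = Object.values(
    data.reduce((acc, row) => {
        const ip = row["IP Address"];
        // create the project entry for this IP the first time it is seen
        acc[ip] = acc[ip] || { "IP Address": ip, AreaList: [] };
        let areaEntry = acc[ip].AreaList.find((a) => a.area === row.area);
        if (!areaEntry) {
            areaEntry = { area: row.area, jobList: [] };
            acc[ip].AreaList.push(areaEntry);
        }
        areaEntry.jobList.push({ job: row.job, flow: row.flow, Component: row.Component });
        return acc;
    }, {})
);
console.log(grouped);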
Here I need to explain my problem clearly: how can I write a query that retrieves every subdocument in an array that matches a condition, using Mongoose and Node.js?
This is my existing JSON:
[
{
_id: "568ccd6e646489f4106470ec",
area_name: "Padi",
warehouse_name: "Hapserve Online Water Service",
name: "Ganesh",
email: "ganesh#excrin.com",
mobile_no: "9042391491",
otp: "4466",
__v: 0,
date: "06-01-2016",
booking:
[
{
can_quantity: "4",
delivery_date: "06-01-2016",
delivery_timeslot: "10am-3pm",
subscription: "true",
subscription_type: "Weekly",
total_cost: "240",
order_id: "S13833",
can_name: "Tata Waters",
address: "15/A,Ramanrajan street,,Padi,Chennai",
can_cost: "240",
_id: "568ccd6e646489f4106470ee",
ordered_at: "2016-01-06T08:16:46.825Z",
status: "UnderProcess"
},
{
can_name: "Bisleri",
can_quantity: "4",
can_cost: "200",
delivery_date: "11-01-2016",
delivery_timeslot: "3pm-8pm",
order_id: "11537",
address: "27,Main Street,Padi,Chennai",
_id: "5693860edb988e241102d196",
ordered_at: "2016-01-11T10:38:06.749Z",
status: "UnderProcess"
}
]
},
{
_id: "56937fb8920629a0164604d8",
area_name: "Poonamallee",
warehouse_name: "Tata Waters",
name: "M.Kalaiselvan",
email: "kalai131192#gmail.com",
mobile_no: "9003321521",
otp: "2256",
__v: 0,
date: "2016-01-11T10:11:04.266Z",
booking:
[
{
can_quantity: "4",
delivery_date: "06-01-2016",
delivery_timeslot: "10am-3pm",
subscription: "true",
subscription_type: "Alternate",
total_cost: "640",
order_id: "S13406",
can_name: "Kinley",
address: "133,Bajanai koil street, Melmanagar,Poonamallee,Chennai",
can_cost: "160",
_id: "56937fb8920629a0164604da",
ordered_at: "11-01-2016",
status: "UnderProcess"
},
{
can_name: "Tata Waters",
can_quantity: "2",
can_cost: "120",
delivery_date: "11-01-2016",
delivery_timeslot: "10am-3pm",
order_id: "11387",
address: "140,Bajanai koil street, Melmanagar,Poonamallee,Chennai",
_id: "56937ff7920629a0164604dc",
ordered_at: "2016-01-11T10:12:07.719Z",
status: "UnderProcess"
},
{
can_name: "Bisleri",
can_quantity: "4",
can_cost: "200",
delivery_date: "12-01-2016",
delivery_timeslot: "10am-3pm",
order_id: "16853",
address: "140,Bajanai koil street, Melmanagar,Poonamallee,Chennai",
_id: "56938584db988e241102d194",
ordered_at: "2016-01-11T10:35:48.911Z",
status: "UnderProcess"
},
{
can_name: "Hapserve",
can_quantity: "6",
can_cost: "150",
delivery_date: "11-01-2016",
delivery_timeslot: "10am-3pm",
order_id: "17397",
address: "133,Bajanai koil street, Melmanagar,Poonamallee,Chennai",
_id: "569385bbdb988e241102d195",
ordered_at: "2016-01-11T10:36:43.918Z",
status: "UnderProcess"
},
{
can_name: "Bisleri",
can_quantity: "5",
can_cost: "250",
delivery_date: "11-01-2016",
delivery_timeslot: "10am-3pm",
order_id: "14218",
address: "133,Bajanai koil street, Melmanagar,Poonamallee,Chennai",
_id: "56939a13c898ef7c0cc882b0",
ordered_at: "2016-01-11T12:03:31.324Z",
status: "Cancelled"
}
]
}
]
Here I need to retrieve every subdocument whose delivery date is today.
So this is my Node.js route:
router.get('/booking-date/:date', function(req, res){
var id = req.params.date;
RegisterList.find({'booking.delivery_date':id}, {'booking.$':1}, function(err, docs){
if(err)
throw err;
res.json(docs);
});
});
When I use this, I am not able to get all the data; only two subdocuments are retrieved from the collection.
For example, if I search for the date 11-01-2016, I get only one subdocument per parent document, but in the JSON above, for 11-01-2016, one parent has 2 subdocuments for that date and the other parent has 1 subdocument for that date.
I am not able to write a Mongoose query that retrieves every matching subdocument.
Help will be appreciated.
Sounds like you may want to try the aggregation framework, where you can $project the booking array with a filter made possible using the $setDifference, $map and $cond operators.
The $map operator inspects each element of the booking array, and the $cond operator returns the element only when the condition is true; otherwise it returns false in place of the array element. The $setDifference operator then removes all the false values from the array by comparing it against the set [false], so the final result contains only the matching elements:
router.get('/booking-date/:date', function(req, res){
var id = req.params.date,
pipeline = [
{
"$match": { 'booking.delivery_date': id }
},
{
"$project": {
"booking": {
"$setDifference": [
{
"$map": {
"input": "$booking",
"as": "el",
"in": {
"$cond": [
{ "$eq": [ "$$el.delivery_date", id ] },
"$$el",
false
]
}
}
},
[false]
]
}
}
}
];
RegisterList.aggregate(pipeline, function(err, docs){
if(err) throw err;
res.json(docs);
});
});
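If your MongoDB version is 3.2 or later, the $filter operator expresses the same projection more directly (a sketch using the same route and variables as above):
RegisterList.aggregate([
    { "$match": { "booking.delivery_date": id } },
    {
        "$project": {
            "booking": {
                "$filter": {
                    "input": "$booking",
                    "as": "el",
                    "cond": { "$eq": [ "$$el.delivery_date", id ] }
                }
            }
        }
    }
], function(err, docs){
    if(err) throw err;
    res.json(docs);
});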
The $ projection operator projects only the first matching element; see the MongoDB documentation on the positional $ projection operator.
Alternatively, project all subdocuments with {booking: 1}, then filter the subdocuments within your application, as in the sketch below.
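A minimal sketch of that approach (same route as above, filtering in Node.js after the query returns):
RegisterList.find({'booking.delivery_date': id}, {booking: 1}, function(err, docs){
    if(err) throw err;
    // keep only the bookings whose delivery_date matches the requested date
    docs.forEach(function(doc){
        doc.booking = doc.booking.filter(function(b){
            return b.delivery_date === id;
        });
    });
    res.json(docs);
});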
Similar to this question, I can't get my DataView to actually show data. I tried to restructure my store, but I think I'm missing something. Here's what I've got so far:
App, Model, Store
Ext.regApplication({
name: 'TestApp',
launch: function() {
this.viewport = new TestApp.views.Viewport();
}
});
TestApp.models.StoreMe = Ext.regModel('TestApp.models.StoreMe', {
fields: [
'id',
'name',
'age'
]
});
TestApp.stores.storeMe = new Ext.data.Store({
model: 'TestApp.models.StoreMe',
proxy: {
type: 'ajax',
url: 'data.json',
reader: {
type: 'json'
}
},
autoLoad: true
});
Viewport and DataView
TestApp.views.Viewport = Ext.extend(Ext.Panel, {
fullscreen: true,
layout: 'card',
items: [
{
id: 'dataView',
xtype: 'dataview',
store: TestApp.stores.storeMe,
itemSelector: 'div.dataViewItem',
emptyText: 'NO DATA',
tpl: '<tpl for "."><div class="dataViewItem">ID: {id}<br />Name: {name}<br />Age: {age}</div></tpl>'
}
]
});
JSON
[
{
"id": "1",
"name": "sam",
"age": "4"
},
{
"id": "2",
"name": "jack",
"age": "3"
},
{
"id": "3",
"name": "danny",
"age": "12"
}
]
Any ideas? All of the other questions that are similar to this use Ext.JsonStore, but the Sencha API docs say to do it this way.
UPDATE
The store is working fine. Here's what TestApp.stores.storeMe.data looks like:
Ext.util.MixedCollection
...
items: Array[3]
0: c
data: Object
age: "4"
id: "1"
name: "sam"
1: c
2: c
length: 3
__proto__: Array[0]
keys: Array[3]
length: 3
...
It seems your JSON structure doesn't have a root called "data". Try changing your JSON to:
{
"data": [ {
"id": "1",
"name": "sam",
"age": "4"
}, {
"id": "2",
"name": "jack",
"age": "3"
}, {
"id": "3",
"name": "danny",
"age": "12"
} ]
}
And add root: 'data' to your reader, as shown below.
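The reader config would then look like:
reader: {
    type: 'json',
    root: 'data'
}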
I'm an idiot. I had:
tpl: '<tpl for "."><div class="dataViewItem">ID: {id}<br />Name: {name}<br />Age: {age}</div></tpl>'
I needed:
tpl: '<tpl for=".">...</tpl>'
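i.e. the full, corrected template line is:
tpl: '<tpl for="."><div class="dataViewItem">ID: {id}<br />Name: {name}<br />Age: {age}</div></tpl>'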