Combining JSON items using JMESPath and/or Ansible - json

I have an Ansible playbook that queries a device inventory API and gets back a JSON result that contains a lot of records following this format:
{
    "service_level": "Test",
    "tags": [
        "Application:MyApp1"
    ],
    "fqdn": "matestsvcapp1.vipcustomers.com",
    "ip": "172.20.11.237",
    "name": "matestsvcapp1.vipcustomers.com"
}
I then loop through these ansible tasks to query the JSON result for each of the IP addresses I care about:
- name: Set JMESQuery
  set_fact:
    jmesquery: "Devices[?ip_addresses[?ip.contains(@,'{{ ip_to_query }}')]].{ip: '{{ ip_to_query }}', tags: tags[], service_level: service_level}"
- name: Store values
  set_fact:
    inven_results: "{{ (inven_results | default([])) + (existing_device_info.json | to_json | from_json | json_query(jmesquery)) }}"
I then go on to do other tasks in ansible, pushing this data into other systems, and everything works fine.
However, I just got a request from management that they would like to see the 'service level' represented as a tag in some of the systems I push this data into. Therefore I need to combine the 'tags' and 'service_level' items resulting in something that looks like this:
{
    "tags": [
        "Application:MyApp1",
        "service_level:Test"
    ],
    "fqdn": "matestsvcapp1.vipcustomers.com",
    "ip": "172.20.11.237",
    "name": "matestsvcapp1.vipcustomers.com"
}
I've tried modifying the JMESPath query to join the results together using the join function, and I've tried doing it the 'Ansible' way using combine or map, but I couldn't get either approach to work.
Any thoughts on the correct way to handle this? Thanks in advance!
Note: 'tags' is a list of strings, and even though it's written in key:value format, it's really just a string.

To add two arrays, you use the + operator like this:
ansible localhost -m debug -a 'msg="{{ b + ["String3"] }}"' -e '{"b":["String1", "String2"]}'
result:
localhost | SUCCESS => {
    "msg": [
        "String1",
        "String2",
        "String3"
    ]
}
So if I save your JSON as test.json, you could run
ansible localhost -m debug -a 'msg="{{ tags + ["service_level:" ~ service_level ] }}"' -e @test.json
Result:
localhost | SUCCESS => {
    "msg": [
        "Application:MyApp1",
        "service_level:Test"
    ]
}
With this knowledge you can use set_fact to put this new array in a variable for later use.
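The same transformation can be sketched in plain Python to make the logic explicit (field names taken from the sample record above; this mirrors what the Jinja2 expression does per record):

```python
import json

# Sample record, as returned by the inventory API in the question.
record = {
    "service_level": "Test",
    "tags": ["Application:MyApp1"],
    "fqdn": "matestsvcapp1.vipcustomers.com",
    "ip": "172.20.11.237",
    "name": "matestsvcapp1.vipcustomers.com",
}

# Drop the original service_level field, then append it to the tags
# list as an extra "key:value" string, matching the desired output.
merged = {k: v for k, v in record.items() if k != "service_level"}
merged["tags"] = record["tags"] + ["service_level:" + record["service_level"]]

print(json.dumps(merged, indent=2))
```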

Related

jq query to find nested value and return parent values

Having trouble finding this; maybe it's just my search terms, or who knows.
Basically, I have a series of arrays mapping keyspaces to destination DBs for a large NoSQL migration, so that we can more easily script data movement. I'll include sample JSON below.
It's nested basically like: environment >> { [ target DB ] >> [ list of keyspaces ] }, { [ target DB ] >> [ list of keyspaces ] }
My intent was to update my migration script to more intelligently determine where things go based on which environment is specified, and require less user input or "figuring things out".
here's sample JSON:
{
    "Prod": [
        {
            "prod1": [
                "prod_db1",
                "prod_db2",
                "prod_d31",
                "prod_db4"
            ]
        },
        {
            "prod2": [
                "prod_db5",
                "prod_db6",
                "prod_db7",
                "prod_db8"
            ]
        }
    ]
}
Assuming I'm able to provide the keyspace and environment to the script, and use those as variables in my jq query, is there a way to search for the keyspace and return the value one level up? I.e., I know I can do something like:
#!/bin/bash
ENV="Prod"
jq ".${ENV}[][]" env.json
to just get the DBs in the Prod environment. But if I'm searching for 'prod_db6', how can I return the value 'prod2'?
Use to_entries to decompose an object into an array of key-value pairs, then IN to search in the value's array, and finally return the key:
jq -r --arg env "Prod" --arg ksp "prod_db6" '
.[$env][] | to_entries[] | select(IN(.value[]; $ksp)).key
' env.json
prod2
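The jq filter's logic, restated in plain Python for illustration (same sample data as above): iterate over each {db: keyspaces} entry under the environment and return the key whose list contains the keyspace.

```python
data = {
    "Prod": [
        {"prod1": ["prod_db1", "prod_db2", "prod_d31", "prod_db4"]},
        {"prod2": ["prod_db5", "prod_db6", "prod_db7", "prod_db8"]},
    ]
}

def find_target_db(data, env, keyspace):
    """Return the DB name (parent key) whose keyspace list contains keyspace."""
    for entry in data[env]:  # each entry is a {db_name: [keyspaces]} object
        for db_name, keyspaces in entry.items():
            if keyspace in keyspaces:
                return db_name
    return None

print(find_target_db(data, "Prod", "prod_db6"))  # prod2
```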

How to extract values from MySQL query in Ansible play

In an Ansible play, I'm running a successful SQL query on a MySQL database which returns:
"result": [
    {
        "account_profile": "sbx"
    },
    {
        "account_profile": "dev"
    }
]
That result is saved into a variable called query_output. I know that I can display the results array in Ansible via
- debug:
    var: query_output.result
But for the life of me I cannot figure out how to extract the 2 account_profile values.
My end goal is to extract them into a fact which is an array. Something like:
"aws_account_profiles": [ "sbx", "dev" ]
I know that I'm missing something really obvious.
Suggestions?
The thing you want is the map filter's attribute= usage:
{{ query_output.result | map(attribute="account_profile") | list }}
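For illustration, the map filter here is doing nothing more than pulling one attribute out of each row; the equivalent in plain Python is a list comprehension over the result rows:

```python
# The registered query result, as shown in the question.
query_output = {
    "result": [
        {"account_profile": "sbx"},
        {"account_profile": "dev"},
    ]
}

# What map(attribute="account_profile") | list does, expressed as a
# list comprehension:
aws_account_profiles = [row["account_profile"] for row in query_output["result"]]
print(aws_account_profiles)  # ['sbx', 'dev']
```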

Ansible win_shell with JSON to powershell, works fine other way

I cannot figure out how to get win_shell to send JSON to a PowerShell script on a Windows server. If I run this:
vm:
  annotation: This is a VM
  cdrom:
    type: client
    iso_path: ''
    controller_type: ''
    state: ''
  cluster: ''
- name: "Run script '.ps1'"
  win_shell: D:\scripts\outputValues.ps1 -vm "{{ vm }}"
  register: result
- name: Process win_shell output
  set_fact:
    fullResult: "{{ result.stdout | from_json }}"
I get this string in PowerShell, which I cannot even convert to JSON:
{u'cdrom': {u'controller_type': u'', u'state': u'', u'type': u'client', u'iso_path': u''}, u'cluster': u'', u'annotation': u'This is a VM'}
If I run the script with this:
win_shell: D:\scripts\outputValues.ps1 -vm "{{vm | to_json}}"
All I get is an open curly bracket '{' into powershell.
It does work the other way. As a test, if I call that same powershell script from win_shell and ignore the input in powershell and simply create an object like this and send it back to ansible, it works fine. Why can I send JSON one way and not the other? I have tried vm|to_json etc.
#Powershell
$vmAttribs = @{"type" = "client"; "iso_path" = ""; "controller_type" = ""; "state" = ""}
$vmObject = New-Object psobject -Property @{
    annotation = 'This is a VM'
    cdrom = $vmAttribs
    cluster = ""
}
$result = $vmObject | ConvertTo-Json -Depth 8
Write-Output $result
ansible gets:
{
    "msg": {
        "cdrom": {
            "controller_type": "",
            "state": "",
            "type": "client",
            "iso_path": ""
        },
        "cluster": "",
        "annotation": "This is a VM"
    },
    "changed": false,
    "_ansible_verbose_always": true,
    "_ansible_no_log": false
}
What you're seeing is the Python repr of the data structure, with Python 2's Unicode string markers (e.g. u'my string value'). Obviously, that's not portable to other runtimes like PowerShell, and it results in an invalid JSON string.
Reading up on Ansible filters, you will want to use {{ vm | to_json }} to ensure that the data structure is stringified to JSON correctly.
Workaround
I want to stress that this is not the ideal way of getting a JSON object out of Ansible, and a proper solution that doesn't involve hacking the Unicode markers out of the JSON string is preferable to this.
I don't know enough about Ansible to know why the string becomes { when piping vm to to_json, but here's a workaround that might get you going until someone else can chime in with what is going on with Ansible's filtering here.
In your .ps1 script, you can use the following -replace to remove those u characters before the strings:
$vm = $vm -replace "u(?=([`"']).*\1)"
You can test this by running the following in an interactive terminal:
"{u'cdrom': {u'controller_type': u'', u'state': u'', u'type': u'client', u'iso_path': u''}, u'cluster': u'', u'annotation': u'This is a VM'}" -replace "u(?=([`"']).*\1)"
The pattern matches on the u character that comes immediately before a single-quote or double-quote followed by any number of characters (including none) followed by another single-quote or double quote, whichever was matched the first time. -replace does not require a second argument if you are replacing the pattern with an empty string.
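As a side note on why any repair works at all: the garbled string is a valid Python literal, not JSON. Outside of PowerShell, the same repair can be sketched in Python, which can parse its own repr directly and re-emit proper JSON (an illustration, not part of the original answer):

```python
import ast
import json

# The string PowerShell received: a Python 2 dict repr with u'...'
# Unicode markers, which no JSON parser will accept.
garbled = ("{u'cdrom': {u'controller_type': u'', u'state': u'', u'type': u'client', "
           "u'iso_path': u''}, u'cluster': u'', u'annotation': u'This is a VM'}")

# ast.literal_eval safely evaluates the literal back into a dict
# (the u'' prefix is still accepted by Python 3)...
data = ast.literal_eval(garbled)

# ...which json.dumps can then serialize as valid JSON.
print(json.dumps(data))
```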

Extracting volume_id from an EC2 creation register

I need to extract the EBS volume ids from the register returned from an EC2 creation call. I've already got it down to a chunk which holds the data I want, but the last step eludes me.
I've tried to do it with:
- set_fact:
    volume_id_list: "{{ devices | json_query('[*].volume_id') }}"
- debug: var=volume_id_list
And it returns an empty string.
"devices": {
    "/dev/sdf": {
        "delete_on_termination": true,
        "status": "attached",
        "volume_id": "vol-0b2c92cdcblah"
    },
    "/dev/xvda": {
        "delete_on_termination": true,
        "status": "attached",
        "volume_id": "vol-086a722c4blah"
    }
}
What I wanted to see was something like:
"vol-0b2c92cdcblah"
"vol-086a722c4blah"
Your JMESPath expression in json_query does not match anything in your data structure, so the empty result is totally correct :)
The [*] projection only works on lists, but devices is an object keyed by device name. To get what you want from your current data structure, you need to change your query to: json_query('*.volume_id')
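For illustration, the difference between the two queries comes down to whether you iterate a list or an object's values; in plain Python, the working '*.volume_id' projection corresponds to a comprehension over devices.values():

```python
# The registered data from the question: an object keyed by device
# name, not a list (which is why '[*]' matched nothing).
devices = {
    "/dev/sdf": {"delete_on_termination": True, "status": "attached",
                 "volume_id": "vol-0b2c92cdcblah"},
    "/dev/xvda": {"delete_on_termination": True, "status": "attached",
                  "volume_id": "vol-086a722c4blah"},
}

# '*.volume_id' projects over the object's *values*, i.e.:
volume_id_list = [d["volume_id"] for d in devices.values()]
print(sorted(volume_id_list))
```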

Retrieve one (last) value from influxdb

I'm trying to retrieve the last value inserted into a table in influxdb. What I need to do is then post it to another system via HTTP.
I'd like to do all this in a bash script, but I'm open to Python also.
$ curl -sG 'https://influx.server:8086/query' --data-urlencode "db=iotaWatt" --data-urlencode "q=SELECT LAST(\"value\") FROM \"grid\" ORDER BY time DESC" | jq -r
{
    "results": [
        {
            "statement_id": 0,
            "series": [
                {
                    "name": "grid",
                    "columns": [
                        "time",
                        "last"
                    ],
                    "values": [
                        [
                            "2018-01-17T04:15:30Z",
                            690.1
                        ]
                    ]
                }
            ]
        }
    ]
}
What I'm struggling with is getting this value into a clean format I can use. I don't really want to use sed, and I've tried jq but it complains the data is a string and not an index:
jq: error (at <stdin>:1): Cannot index array with string "series"
Anyone have a good suggestion?
Pipe that curl to the jq below
$ your_curl_stuff_here | jq '.results[].series[]|.name,.values[0][]'
"grid"
"2018-01-17T04:15:30Z"
690.1
The results could be stored into a bash array and used later.
$ results=( $(your_curl_stuff_here | jq '.results[].series[]|.name,.values[0][]') )
$ echo "${results[@]}"
"grid" "2018-01-17T04:15:30Z" 690.1
# Individual values could be accessed using "${results[0]}" and so on, mind the quotes
All good :-)
Given the JSON shown, the jq query:
.results[].series[].values[]
produces:
[
    "2018-01-17T04:15:30Z",
    690.1
]
This seems to be the output you want, but from the point of view of someone who is not familiar with influxdb, the requirements seem very opaque, so you might want to consider a variant, such as:
.results[-1].series[-1].values[-1]
which in this case produces the same result, as it happens.
If you just want the atomic values, you could simply append [] to either of the queries above.
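Since the question mentioned being open to Python: the same navigation is just nested indexing against the response shape shown above (a sketch; the response dict stands in for json.loads of the curl output):

```python
# Influx response, as shown earlier in the thread.
response = {
    "results": [
        {
            "statement_id": 0,
            "series": [
                {
                    "name": "grid",
                    "columns": ["time", "last"],
                    "values": [["2018-01-17T04:15:30Z", 690.1]],
                }
            ],
        }
    ]
}

# Equivalent of jq's .results[-1].series[-1].values[-1]: take the last
# entry at each level, then unpack it into timestamp and reading.
timestamp, last_value = response["results"][-1]["series"][-1]["values"][-1]
print(timestamp, last_value)  # 2018-01-17T04:15:30Z 690.1
```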