How to populate value with input - json

From a composer.lock I am creating json output of some fields, like this:
<composer.lock jq '.packages[] | {site: "foo", name: .name, version: .version, type: .type}' | jq -s
It works fine:
{
"site": "foo",
"name": "webmozart/path-util",
"version": "2.3.0",
"type": "library"
},
{
"site": "foo",
"name": "webonyx/graphql-php",
"version": "v14.9.0",
"type": "library"
}
but now I want to add a generated value, using uuidgen, but I can't figure out how to do that.
End result would be something like:
{
"site": "foo",
"name": "webmozart/path-util",
"version": "2.3.0",
"type": "library",
"uuid": "c4e97c3c-147d-4360-a601-6b4f6f5e71bb"
},
{
"site": "foo",
"name": "webonyx/graphql-php",
"version": "v14.9.0",
"type": "library",
"uuid": "6fbe472b-49fe-4064-93f0-09a18a7e1c24"
}
I think I should use input, but everything I have tried so far has failed.

If your shell supports process substitution, you can avoid a two-pass solution as follows:
jq -nR --argfile input composer.lock '
$input
| .packages[]
| {site: "foo", name: .name, version: .version, type: .type, id: input}
' < <(while true; do uuidgen; done)
The point here is that by using process substitution as above, the termination of the "consumer" (jq) will also terminate the "producer" of UUIDs.
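If, as in your original pipeline, you want the objects gathered into a single array, here is a sketch (same assumptions as above) that builds the array inside the program instead of re-running jq -s:
jq -nR --argfile input composer.lock '
  [ $input.packages[]
    | {site: "foo", name: .name, version: .version, type: .type, uuid: input} ]
' < <(while true; do uuidgen; done)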

Here's one possibility that does not require your shell to support "process substitution" but which entails two passes, the first for determining the number of UUIDs that are required:
# Syntax: generate N
function generate {
  for ((i=0; i<$1; i++)); do
    uuidgen
  done
}
generate $( <composer.lock jq '.packages|length') |
jq -nR --argfile input composer.lock '
  $input
  | .packages[]
  | {site: "foo", name: .name, version: .version, type: .type, uuid: input}
'
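Note that newer jq releases deprecate --argfile; if yours warns about it, an equivalent sketch uses --slurpfile, which wraps the file's contents in an array (hence the $input[0]):
generate $( <composer.lock jq '.packages|length') |
jq -nR --slurpfile input composer.lock '
  $input[0].packages[]
  | {site: "foo", name: .name, version: .version, type: .type, uuid: input}
'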

Remove last character from json output using JQ

I have a json that looks like this:
{
"HostedZones": [
{
"ResourceRecordSetCount": 2,
"CallerReference": "test20150527-2",
"Config": {
"Comment": "test2",
"PrivateZone": true
},
"Id": "/hostedzone/Z119WBBTVP5WFX",
"Name": "dev.devx.company.services."
},
{
"ResourceRecordSetCount": 2,
"CallerReference": "test20150527-1",
"Config": {
"Comment": "test",
"PrivateZone": true
},
"Id": "/hostedzone/Z3P5QSUBK4POTI",
"Name": "test.devx.company.services."
}
],
"IsTruncated": false,
"MaxItems": "100"
}
And my goal is to fetch a specific Name (in my case it's the test.devx.company.services), however the Name field contains an extra "." at the end that I'd like to remove from the output.
This is what I have so far:
jq --raw-output '.HostedZones[] | select(.Name | test("test")?) | (.Name[:-1] | sub("."; ""))'
The problem is that it also removes the first character from the output.
So the output currently is: est.devx.company.services (JQ play snippet)
Not sure what I'm doing wrong :/
In your attempt, sub("."; "") treats "." as a regular expression matching any single character, which is why the first character of the already-truncated string is removed as well. To always remove the last character, if the name contains "test":
jq '(.HostedZones[].Name | select(contains("test"))) |= .[:-1]'
To remove it only if it is a dot:
jq '(.HostedZones[].Name | select(contains("test"))) |= sub("[.]$"; "")'
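If you only want the name itself (as in your example), here is a sketch that combines the selection and the cleanup in one pass:
jq --raw-output '.HostedZones[].Name | select(test("test")) | sub("[.]$"; "")'
which prints:
test.devx.company.services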

Combine files in jq based on similar ID object and reform data

Preface: If the following is not possible with jq, then I completely accept that as an answer and will try to force this with bash.
I have two files that contain some IDs that, with some massaging, should be able to be combined into a single file. I have some content that I'll add to that as well (as seen in the output). Essentially "mitre_test" should get compared to "sys_id". When compared, the "mitreid" from in1.json becomes "techniqueID" in the output (and is generally the unifying field of each output object).
Caveats:
There are some junk "desc" values placed in in2.json that are there to make sure this is as programmatic as possible, and there are actually numerous junk inputs in the true input file I am using.
Some of the mitre_test values are comma-separated pairs rather than a real array. I can split on those and break them out, but I find myself losing the other information from in1.json when I do.
Notice in the "metadata" for the output that is contains the "number" values from in1.json, and stored in a weird way (but the way that the receiving tool requires).
in1.json
[
{
"test": "Execution",
"mitreid": "T1204.001",
"mitre_test": "90b"
},
{
"test": "Defense Evasion",
"mitreid": "T1070.001",
"mitre_test": "afa"
},
{
"test": "Credential Access",
"mitreid": "T1556.004",
"mitre_test": "14b"
},
{
"test": "Initial Access",
"mitreid": "T1200",
"mitre_test": "f22"
},
{
"test": "Impact",
"mitreid": "T1489",
"mitre_test": "fa2"
}
]
in2.json
[
{
"number": "REL0001346",
"desc": "apple",
"mitre_test": "afa"
},
{
"number": "REL0001343",
"desc": "pear",
"mitre_test": "90b"
},
{
"number": "REL0001366",
"desc": "orange",
"mitre_test": "14b,f22"
},
{
"number": "REL0001378",
"desc": "pineapple",
"mitre_test": "90b"
}
]
The output:
[{
"techniqueID": "T1070.001",
"tactic": "defense-evasion",
"score": 1,
"color": "",
"comment": "",
"enabled": true,
"metadata": [{
"name": "DET_ID",
"value": "REL0001346"
}],
"showSubtechniques": true
},
{
"techniqueID": "T1204.001",
"tactic": "execution",
"score": 1,
"color": "",
"comment": "",
"enabled": true,
"metadata": [{
"name": "DET_ID",
"value": "REL0001343"
},
{
"name": "DET_ID",
"value": "REL0001378"
}],
"showSubtechniques": true
},
{
"techniqueID": "T1556.004",
"tactic": "credential-access",
"score": 1,
"color": "",
"comment": "",
"enabled": true,
"metadata": [{
"name": "DET_ID",
"value": "REL0001366"
}],
"showSubtechniques": true
},
{
"techniqueID": "T1200",
"tactic": "initial-access",
"score": 1,
"color": "",
"comment": "",
"enabled": true,
"metadata": [{
"name": "DET_ID",
"value": "REL0001366"
}],
"showSubtechniques": true
}
]
I'm assuming I have some splitting to do on mitre_test with something like .mitre_test |= split(","), and some joins, but my attempts so far cause data loss or mix up the data. You'll notice the static data in the output as well; it is easy enough to add, so it isn't as much of an issue.
Edit: reduced some of the match IDs so that it is easier to look at while analyzing the in1 and in2 files. Also simplified the two inputs to have a similar structure so that the answer is easier to understand later.
The requirements are somewhat opaque but it's fairly clear that if the task can be done by computer, it can be done using jq.
From the description, it would appear that one of the unusual aspects of the problem is that the "dictionary" defined by in1.json must be derived by splitting the key names that are CSV (comma-separated values). Here therefore is a jq def that will do that:
# Input: a JSON dictionary for which some keys are CSV,
# Output: a JSON dictionary with the CSV keys split on the commas
def refine:
  . as $in
  | reduce keys_unsorted[] as $k ({};
      if ($k | index(","))
      then ($k / ",") as $keys
           | . + ($keys | map({(.): $in[$k]}) | add)
      else .[$k] = $in[$k]
      end);
You can see how this works by running:
INDEX($mitre[]; .mitre_test) | refine
using an invocation of jq such as:
jq --argfile mitre in1.json -f program.jq in2.json
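For instance, given a small dictionary assembled by hand from the sample values (not an actual intermediate of the program), refine splits the CSV key and duplicates its value:
{"14b,f22": {"number": "REL0001366"}, "90b": {"number": "REL0001343"}}
becomes
{
"14b": {"number": "REL0001366"},
"f22": {"number": "REL0001366"},
"90b": {"number": "REL0001343"}
}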
For the joining part of the problem, there are many relevant Q&As on SO, e.g.
How to join JSON objects on particular fields using jq?
There is probably a much more elegant way to do this, but I ended up manually walking around things and piping to new output.
Explanation:
Read in both files, pull the fields I need.
Break out the mitre_test values that were previously just a comma-separated set of values, using map and try.
Store the non-changing fields as a variable and then manipulate mitre_test to become an appropriately split array, removing nulls.
Group by mitre_test values, since they are the common thing that the output is based on.
Cleanup more nulls.
Sort output to look like I want it.
jq . in1.json in2.json |
jq '.[] | {number: .number, test: .test, mitreid: .mitreid, mitre_test: .mitre_test}' |
jq -s '[. | map(try(.mitre_test |= split(",")) // .)
| .[] | [.number,.test,.mitreid] as $h | .mitre_test[] | $h + [.]
| {DET_ID: .[0], tactic: .[1], techniqueID: .[2], mitre_test: .[3]}]
| del(.[][] | nulls)' |
jq '[group_by(.mitre_test)[] | {mitre_test: .[0].mitre_test, techniqueID: [.[].techniqueID], tactic: [.[].tactic], DET_ID: [.[].DET_ID]}]
| del(.[].techniqueID[] | nulls) | del(.[].tactic[] | nulls) | del(.[].DET_ID[] | nulls)' |
jq '.[] | [{techniqueID: .techniqueID[0], tactic: .tactic[0], metadata: [{name: "DET_ID", value: .DET_ID[]}]}] | .[]
| select((.metadata | length) > 0)'
It was a long line, so I split it among some of the basic ideas.

select specific values from json & convert into bash variables

I'm new to jq. I have the following JSON and I need to extract four values, i.e. SAUCE_KEY, sauceKey, SAUCE_VALUE, and sauceValue, and I need to convert these into bash variables, i.e.
SAUCE_KEY=sauceKey
SAUCE_VALUE=sauceValue
If I echo it, it should print its value, i.e. echo $SAUCE_KEY
I have used the code as:
jq -r '.env_vars[] | with_entries(select([.key] | inside([".env_vars[].name", ".env_vars[].value"])))' | jq -r "to_entries|map(\"\(.value)=\(.value|tostring)\")|.[]"
By doing so, I was able to get values such as
name=SAUCE_KEY, value=sauceKey and so on.
{
"#type": "env_vars",
"env_vars": [
{
"#type": "env_var",
"#href": "/repo/xxxxx/env_var/xxxxx",
"#representation": "standard",
"#permissions": {
"read": true,
"write": true
},
"name": "SAUCE_KEY",
"value": "sauceKey"
},
{
"#type": "env_var",
"#href": "/repo/xxxxx/env_var/xxxxx",
"#representation": "standard",
"#permissions": {
"read": true,
"write": true
},
"name": "SAUCE_VALUE",
"value": "sauceValue"
}
]
}
If, instead of trying to extract variable names from the JSON, you populate an associative array with the names as keys, it's pretty straightforward.
#!/usr/bin/env bash
declare -A sauces
while IFS=$'\001' read -r name value; do
  sauces[$name]=$value
done < <(jq -r '.env_vars[] | "\(.name)\u0001\(.value)"' saucy.json)
for name in "${!sauces[@]}"; do
  echo "$name is ${sauces[$name]}"
done
prints out
SAUCE_VALUE is sauceValue
SAUCE_KEY is sauceKey
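If you really do need them as plain shell variables named after the JSON entries (SAUCE_KEY, SAUCE_VALUE, ...), here is a minimal sketch using jq's @sh filter to quote the values (saucy.json as above; only safe if you trust the names to be valid identifiers):
eval "$(jq -r '.env_vars[] | "\(.name)=\(.value | @sh)"' saucy.json)"
echo "$SAUCE_KEY"   # prints sauceKey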

jq find the max in quoted values

Here is my JSON file test.json:
[
{
"name": "nodejs",
"version": "0.1.21",
"apiVersion": "v1"
},
{
"name": "nodejs",
"version": "0.1.20",
"apiVersion": "v1"
},
{
"name": "nodejs",
"version": "0.1.11",
"apiVersion": "v1"
},
{
"name": "nodejs",
"version": "0.1.9",
"apiVersion": "v1"
},
{
"name": "nodejs",
"version": "0.1.8",
"apiVersion": "v1"
}
]
When I use max_by, jq returns 0.1.9 instead of 0.1.21, probably because the versions are compared as strings:
cat test.json | jq 'max_by(.version)'
{
"name": "nodejs",
"version": "0.1.9",
"apiVersion": "v1"
}
How can I get the element with version=0.1.21 ?
Semantic version comparison is not supported out of the box in jq. You need to split the string on "." and work with the resulting fields.
jq 'sort_by(.version | split(".") | map(tonumber))[-1]'
The split(".") takes the string from .version and creates an array of fields i.e. 0.1.21 becomes an array of [ "0", "1", "21"] and map(tonumber) takes an input array and transforms the string elements to an array of digits.
The sort_by() function does a index wise comparison for each of the elements in the array generated from last step and sorts in the ascending order with the object containing the version 0.1.21 at the last. The notation [-1] is to get the last object from this sorted array.
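For example, run against test.json above, the full invocation yields the expected object:
jq 'sort_by(.version | split(".") | map(tonumber))[-1]' test.json
{
  "name": "nodejs",
  "version": "0.1.21",
  "apiVersion": "v1"
}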
Here's an adaptation of the more general answer using jq at
How to sort Artifactory package search result by version number with JFrog CLI?
def parse:
[splits("[-.]")]
| map(tonumber? // .) ;
max_by(.version|parse)
As a less robust one-liner:
max_by(.version | [splits("[.]")] | map(tonumber))

Transform json to ini files using jq + bash

I'm trying to transform json (array of objects) to ini files:
[
{
"connection": {
"id": "br0",
"uuid": "ab1dd903-4786-4c7e-a4b4-3339b144d6c7",
"stable-id": "",
"type": "bridge",
"interface-name": "br0",
"autoconnect": "no",
"autoconnect-priority": "0",
"autoconnect-retries": "-1",
"auth-retries": "-1",
"timestamp": "44444",
"read-only": "no",
"permissions": "",
"zone": "WAN",
"master": "",
"slave-type": "",
"autoconnect-slaves": "1",
"secondaries": "",
"gateway-ping-timeout": "0",
"metered": "unknown",
"lldp": "default"
},
"ipv4": {
"method": "manual",
"dns": "192.168.1.1,192.168.2.1",
"dns-search": "",
"dns-options": " ",
"dns-priority": "0",
"addresses": "192.168.1.3/24",
"gateway": "",
"routes": "192.168.10.0/24 192.168.1.1",
"route-metric": "-1",
"route-table": "0",
"ignore-auto-routes": "no",
"ignore-auto-dns": "no",
"dhcp-client-id": "",
"dhcp-timeout": "0",
"dhcp-send-hostname": "yes",
"dhcp-hostname": "",
"dhcp-fqdn": "",
"never-default": "no",
"may-fail": "yes",
"dad-timeout": "-1"
}
},
{
"connection": {
...
},
}
]
OR
{
"connection": {
...
},
}
What I tried to do is:
1. Transform the JSON to strings
data=$(jq -r 'def keyValue: to_entries[] | "[\(.key)]\\n\(.value | to_entries|map("\(.key)=\(.value)" ) | join("\\n") )\\n"; if type == "array" then keys[] as $k | "\(.[$k] | .connection.id)=\(.[$k] | keyValue)" elif type == "object" then "\(.connection.id)=\(. | keyValue)" else keys end' /tmp/json)
Provides:
br1=[connection]\nid=br1\nuuid=ab1dd903-4786-4c7e-a4b4-3339b144d6c7\nstable-id=\ntype=fff\ninterface-name=br0\nautoconnect=no\nautoconnect-priority=0\nautoconnect-retries=-1\nauth-retries=-1\ntimestamp=1525494904\nread-only=no\npermissions=\nzone=WAN\nmaster=\nslave-type=\nautoconnect-slaves=1\nsecondaries=\ngateway-ping-timeout=0\nmetered=unknown\nlldp=default\n
br1=[802-3-ethernet]\nport=\nspeed=0\nduplex=\nauto-negotiate=no\nmac-address=\ncloned-mac-address=\ngenerate-mac-address-mask=\nmac-address-blacklist=\nmtu=1500\ns390-subchannels=\ns390-nettype=\ns390-options=\nwake-on-lan=default\nwake-on-lan-password=\n....
2. Walk over strings in bash
while IFS="=" read -r key value; do [ "$oldkey" = "$key" ] && echo -en "$value" >> "/tmp/ini/$key" || echo -en "$value" > "/tmp/ini/$key" ; oldkey="$key"; done <<< "$data"
Gives:
[connection]
id=br1
uuid=ab1dd903-4786-4c7e-a4b4-3339b144d6c7
stable-id=
type=fff
interface-name=br0
autoconnect=no
autoconnect-priority=0
autoconnect-retries=-1
auth-retries=-1
timestamp=1525494904
read-only=no
permissions=
zone=WAN
master=
slave-type=
autoconnect-slaves=1
secondaries=
gateway-ping-timeout=0
metered=unknown
lldp=default
[ipv4]
method=manual
dns=192.168.1.1,192.168.2.1
dns-search=
dns-options=
dns-priority=0
addresses=192.168.1.3/24
gateway=
routes=192.168.10.0/24 192.168.1.1
route-metric=-1
route-table=0
ignore-auto-routes=no
ignore-auto-dns=no
dhcp-client-id=
dhcp-timeout=0
dhcp-send-hostname=yes
dhcp-hostname=
dhcp-fqdn=
never-default=no
may-fail=yes
dad-timeout=-1
I'm almost there! But is it possible to do this more elegantly and with better performance, avoiding pipes, external calls, etc.?
Note: it should mostly be done with jq + bash, because other processing tools like sed and awk are slower than what I already have, but I don't reject them completely =)
P.S. The main purpose of this transformation is fast bulk writing of ini files.
Converting an array as shown to the ".ini" format can be accomplished by simply using this jq program:
def kv: to_entries[] | "\(.key)=\(.value)";
.[]
| to_entries[]
| "[\(.key)]", (.value|kv)
Putting this in a file, say program.jq, then assuming the JSON shown in the question (minus the "..." part) is in input.json, the following invocation:
jq -rf program.jq input.json
yields the corresponding ".ini" file.
If you want to ensure that the program will also handle the case when there is no enclosing array, you could modify the first line in the main program above to test whether the input is an array, so you'd have:
if type == "array" then .[] else . end
| to_entries[]
| "[\(.key)]", (.value|kv)
If the ultimate goal is to produce several .ini files, then we can reuse def kv as defined in the other answer:
def kv: to_entries[] | "\(.key)=\(.value)";
The driver program would however now be:
.[]
| [to_entries[] | "[\(.key)]", (.value|kv)]
| join("\n")
Running jq with this program then yields one JSON string for each item in the array. Using the example, the first such string would be:
"[connection]\nid=br0\nuuid=ab1dd903-4786-4c7e-a4b4-3339b144d6c7\nstable-id=\ntype=bridge\ninterface-name=br0\nautoconnect=no\nautoconnect-priority=0\nautoconnect-retries=-1\nauth-retries=-1\ntimestamp=44444\nread-only=no\npermissions=\nzone=WAN\nmaster=\nslave-type=\nautoconnect-slaves=1\nsecondaries=\ngateway-ping-timeout=0\nmetered=unknown\nlldp=default\n[ipv4]\nmethod=manual\ndns=192.168.1.1,192.168.2.1\ndns-search=\ndns-options= \ndns-priority=0\naddresses=192.168.1.3/24\ngateway=\nroutes=192.168.10.0/24 192.168.1.1\nroute-metric=-1\nroute-table=0\nignore-auto-routes=no\nignore-auto-dns=no\ndhcp-client-id=\ndhcp-timeout=0\ndhcp-send-hostname=yes\ndhcp-hostname=\ndhcp-fqdn=\nnever-default=no\nmay-fail=yes\ndad-timeout=-1"
You can then iterate over these newline-delimited strings.
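To go from there to one .ini file per connection, here is a minimal sketch; it assumes the input is the array form, that each object has a connection.id to use as the file name, and that /tmp/ini already exists (@tsv escapes the embedded newlines so each record fits on one line, and printf '%b' expands them again):
jq -r '.[] as $item
  | [$item.connection.id,
     ([$item | to_entries[] | "[\(.key)]", (.value | to_entries[] | "\(.key)=\(.value)")] | join("\n"))]
  | @tsv' input.json |
while IFS=$'\t' read -r id body; do
  printf '%b\n' "$body" > "/tmp/ini/$id"
done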