I have 4 environment variables on my laptop, ami_id, instance_type, key_name and security_group_ids. I am trying to create a launch template version using these variables but I do not know how to pass them into the JSON array properly
aws ec2 create-launch-template-version --launch-template-id lt-xxx --launch-template-data '{"ImageId":"$ami_id", "InstanceType": "$instance_type", "KeyName": "$key_name", "SecurityGroupIds": ["$security_group_ids"]}'
An error occurred (InvalidAMIID.Malformed) when calling the CreateLaunchTemplateVersion operation: The image ID '$ami_id' is not valid. The expected format is ami-xxxxxxxx or ami-xxxxxxxxxxxxxxxxx.
Using a here-document allows you to feed some readable text into a variable while expanding the shell variables, like this:
#!/bin/sh
ami_id=ami1234
instance_type=t3.nano
key_name=key1
security_group_ids=sg123,sg456
template_data=$(cat <<EOF
{
"ImageId":"$ami_id",
"InstanceType": "$instance_type",
"KeyName": "$key_name",
"SecurityGroupIds": ["$security_group_ids"]
}
EOF
)
echo "$template_data"
You can then test the JSON syntax with jq:
./template.sh | jq -c .
{"ImageId":"ami1234","InstanceType":"t3.nano","KeyName":"key1","SecurityGroupIds":["sg123,sg456"]}
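Note that the heredoc leaves sg123,sg456 as a single array element, whereas EC2 expects each security group id as its own element. As an alternative sketch (same variable names as above), you can let jq build the JSON itself and split the list:

```shell
# Sketch: let jq construct the template data, splitting the comma-separated
# security group list into separate array elements.
template_data=$(jq -n \
  --arg ami "$ami_id" \
  --arg itype "$instance_type" \
  --arg key "$key_name" \
  --arg sgs "$security_group_ids" \
  '{ImageId: $ami, InstanceType: $itype, KeyName: $key,
    SecurityGroupIds: ($sgs | split(","))}')
```

This also sidesteps quoting problems, since jq escapes the values for you.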
I am fetching an AWS Parameter Store JSON response using the AWS cli:
echo $(aws ssm get-parameters-by-path --with-decryption --region eu-central-1 \
--path "${PARAMETER_STORE_PREFIX}") >/home/ubuntu/.env.json
This outputs something like:
{
"Parameters": [
{
"Name": "/EXAMPLE/CORS_ORIGIN",
"Value": "https://example.com"
},
{
"Name": "/EXAMPLE/DATABASE_URL",
"Value": "db://user:pass@host:3306/example"
}
]
}
Instead of writing to a JSON file, I'd like to write to a .env file instead with the following format:
CORS_ORIGIN="https://example.com"
DATABASE_URL="db://user:pass@host:3306/example"
How could I achieve this? I found similar questions like Exporting JSON to environment variables, but they do not deal with nested JSON / arrays.
Easy with jq, using string interpolation and the @sh format to properly quote the Value strings for the shell:
$ jq -r '.Parameters[] | "\(.Name | .[rindex("/")+1:])=\(.Value | @sh)"' input.json
CORS_ORIGIN='https://example.com'
DATABASE_URL='db://user:pass@host:3306/example'
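Putting that together with the command from the question, you can skip the intermediate JSON file and write the .env file directly (a sketch; the path and prefix variable are taken from the question):

```shell
# Sketch: fetch the parameters and write the .env file in one pipeline.
aws ssm get-parameters-by-path --with-decryption --region eu-central-1 \
    --path "${PARAMETER_STORE_PREFIX}" \
  | jq -r '.Parameters[] | "\(.Name | .[rindex("/")+1:])=\(.Value | @sh)"' \
  > /home/ubuntu/.env
```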
Now that Powershell is Open Source and cross platform (with Powershell Core), I thought I'd give it a try again.
I had used it in the past and at some point had figured out how pipelines work, but it's not super intuitive. It should be, but it isn't, so I'm somewhat stuck...
Task at hand: parse and print several fields from the JSON output of a command. I could probably do it the old fashioned way, with external commands and string processing, à la bash, but I want to learn how to do it the Powershell way™.
And to clarify, I want to do it interactively, in a pipeline. I don't want to write a script, I want to learn how to do this kind of processing in a one-liner (or 2, at most, but basically with Powershell working as a REPL, not as a script tool).
Bash command:
aws cloudformation describe-stack-events --stack-name some-stack-here | jq ".StackEvents[] | [.Timestamp, .ResourceStatus, .ResourceType, .ResourceStatusReason] | join(\" \")"
What this does is take the input JSON and print just the few fields I'm interested in.
Input JSON:
{
"StackEvents": [
{
"StackId": "arn:aws:cloudformation:some-region-here:some-number-here:stack/some-stack-here/some-id-here",
"EventId": "some-event-id-here",
"ResourceStatus": "UPDATE_COMPLETE",
"ResourceType": "AWS::CloudFormation::Stack",
"Timestamp": "some-date-here",
"StackName": "some-stack-here",
"PhysicalResourceId": "arn:aws:cloudformation:some-region-here:some-number-here:stack/some-stack-here/some-id-here",
"LogicalResourceId": "some-stack-here"
},
{
"StackId": "arn:aws:cloudformation:some-region-here:some-number-here:stack/some-stack-here/some-id-here",
"EventId": "some-event-id-here",
"ResourceStatus": "UPDATE_COMPLETE_CLEANUP_IN_PROGRESS",
"ResourceType": "AWS::CloudFormation::Stack",
"Timestamp": "some-date-here",
"StackName": "some-stack-here",
"PhysicalResourceId": "arn:aws:cloudformation:some-region-here:some-number-here:stack/some-stack-here/some-id-here",
"LogicalResourceId": "some-stack-here"
}
]
}
Command output:
"some-date-here UPDATE_COMPLETE AWS::CloudFormation::Stack "
"some-date-here UPDATE_COMPLETE_CLEANUP_IN_PROGRESS AWS::CloudFormation::Stack "
(I think this should be on Stackoverflow because the topic involves a pretty solid understanding of Powershell concepts, including .NET objects, much closer to programming than to sysadmining, i.e. SuperUser or so.)
You can do:
$j = aws cloudformation describe-stack-events --stack-name some-stack-here | ConvertFrom-Json
$j.StackEvents | % { "{0} {1} {2}" -f $_.Timestamp, $_.ResourceStatus, $_.ResourceType }
Use the ConvertFrom-Json cmdlet to store the command result into a powershell object, then you can select the StackEvents array and loop it to select the required values.
If you wanted to do it in one line:
(aws cloudformation describe-stack-events --stack-name some-stack-here | ConvertFrom-Json).StackEvents | % { "{0} {1} {2}" -f $_.Timestamp, $_.ResourceStatus, $_.ResourceType }
This is seemingly a very basic question. I am new to JSON, so I am prepared for the incoming facepalm.
I have the following JSON file (test-app-1.json):
{
  "application.name": "test-app-1",
  "environments": {
    "development": [
      "server1"
    ],
    "stage": [
      "server2",
      "server3"
    ],
    "production": [
      "server4",
      "server5"
    ]
  }
}
The intent is to use this as a configuration file and reference for input validation.
I am using bash 3.2 and will be using jq 1.4 (not the latest) to read the JSON.
The problem:
I need to return all values in the specified JSON array based on an argument.
Example: (how the documentation and other resources show that it should work)
APPLICATION_ENVIRONMENT="developement"
jq --arg appenv "$APPLICATION_ENVIRONMENT" '.environments."$env[]"' test-app-1.json
If executed, this returns null. This should return server1.
Obviously, if I specify text explicitly matching the JSON arrays under environment, it works just fine:
jq '.environments.development[]' test-app-1.json returns: server1.
Limitations: I am stuck to jq 1.4 for this project. I have tried the same actions in 1.5 on a different machine with the same null results.
What am I doing wrong here?
jq documentation: https://stedolan.github.io/jq/manual/
You have three issues - two typos and one jq filter usage issue:
set APPLICATION_ENVIRONMENT to development instead of developement
use variable name consistently: if you define appenv, use $appenv, not $env
address with .environments[$appenv]
When fixed, looks like this:
$ APPLICATION_ENVIRONMENT="development"
$ jq --arg appenv "$APPLICATION_ENVIRONMENT" '.environments[$appenv][]' test-app-1.json
"server1"
I'd like to import a json file containing an array of jsons into a mysql table.
Using LOAD DATA LOCAL INFILE '/var/lib/mysql-files/big_json.json' INTO TABLE test(json); gives the error message: Invalid JSON text: "Invalid value." at position 1 in value for column.
My json looks like the following:
[
{
"id": "6defd952",
"title": "foo"
},
{
"id": "98ee3d8b",
"title": "bar"
}
]
How can I parameterize the load command to iterate through the array?
MySQL's LOAD DATA reads its input line by line by default, so it errors out when you try to load pretty-printed JSON, both in 5.7 and in 8.0+; you'll need the JSON to be in compact, single-line form, i.e.:
[{"id":"6defd952","title":"foo"},{"id":"98ee3d8b","title":"bar"}]
This can be accomplished by reading the json object to jq and using the -c flag to output in compact form to re-format the text:
Windows:
type input.json | jq -c . > infile.json
Mac/Linux:
cat input.json | jq -c . > infile.json
If you're reading from the stdout of another command instead of a file, just replace the cat/type step and pipe to jq in the same manner. The resultant "infile.json" should then work for the LOAD DATA INFILE statement.
If you wanted to then iterate through the loaded array, use JSON_EXTRACT(JSON_FIELD, CONCAT('$[', i, ']')) with ordinal variable i to traverse the root document (array) in a single query.
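Alternatively, if each array element should become its own row, jq can emit one compact object per line (NDJSON), which lines up with LOAD DATA's default newline record terminator. A sketch, using the file names from the question:

```shell
# Sketch: emit one compact JSON object per line, so LOAD DATA can treat
# each array element as a separate row.
jq -c '.[]' input.json > infile.json
```

Each line then loads as its own row in the json column, avoiding the JSON_EXTRACT iteration entirely.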
I am trying to access the field "my-tag" from the following json using jq from a shell script:
json file:
{
"tag": {
"value": "hello"
},
"my-tag": {
"value": "hello-my-tag"
}
}
Shell script:
#!/bin/bash
main()
{
search="my-tag"
file="myjson.json"
value=($(jq ".$search.value" "$file"))
echo $value
}
main "$@"
On executing this script, I get the following error:
error: tag is not defined
.-tag.value
^^^
1 compile error
How can I extract the field correctly in the shell script?
As a general rule of thumb, I would strongly suggest not placing shell variables within your filters, and use arguments to pass them in.
jq --arg search "$search" '.[$search].value' "$file"
Your filter string is an equivalent to a script. You wouldn't want to modify your script every time you wanted to change a value, you would parameterize it.
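Applied to the script from the question, that looks like this (keeping the original structure; the -r flag is my addition, to strip the quotes from the output):

```shell
#!/bin/bash
main()
{
    search="my-tag"
    file="myjson.json"
    # Pass the key in with --arg instead of interpolating it into the filter.
    value=$(jq -r --arg search "$search" '.[$search].value' "$file")
    echo "$value"
}
main "$@"
```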
You'll have to use the associative array notation for arbitrary key strings:
jq ".[\"$search\"].value" "$file"