Bash causes invalid JSON for EC2 CLI request

I am using a bash script to dynamically create an EC2 CLI request. When the bash script is executed, the AWS CLI returns Error parsing parameter '--launch-specification': Invalid JSON:, but if I copy the CLI string and submit it directly via the CLI, the command works with no problems.
Could the bash script be generating some control characters that cause the CLI request to fail, characters that are not present when I copy/paste into the terminal?
BASH SCRIPT CODE
CMD01=("aws --profile ${myProf} --region ${myRegion} ec2 request-spot-instances --spot-price ${PRICE} --instance-count ${6} --type \"one-time\" --launch-specification \"{\\\"ImageId\\\":\\\"${1}\\\",\\\"KeyName\\\":\\\"${2}\\\",\\\"InstanceType\\\":\\\"${!5}\\\",\\\"IamInstanceProfile\\\":{\\\"Arn\\\":\\\"${16}\\\"},\\\"Placement\\\":{\\\"AvailabilityZone\\\":\\\"${18}\\\",\\\"GroupName\\\":\\\"${11}\\\"},\\\"NetworkInterfaces\\\":[{\\\"DeviceIndex\\\":0,\\\"SubnetId\\\":\\\"${4}\\\",\\\"AssociatePublicIpAddress\\\":${17}}],\\\"UserData\\\":\\\"string\\\"}\" --dry-run")
echoed via
echo "$CMD01"
aws --profile myProfile --region eu-west-1 ec2 request-spot-instances --spot-price 0.004 --instance-count 1 --type "one-time" --launch-specification "{\"ImageId\":\"ami-9c7ad8eb\",\"KeyName\":\"myKey\",\"InstanceType\":\"t1.micro\",\"IamInstanceProfile\":{\"Arn\":\"arn:aws:iam::000000000000:instance-profile/myprofile\"},\"Placement\":{\"AvailabilityZone\":\"eu-west-1c\",\"GroupName\":\"myGroup\"},\"NetworkInterfaces\":[{\"DeviceIndex\":0,\"SubnetId\":\"subnet-xxxyyy\",\"AssociatePublicIpAddress\":true}],\"UserData\":\"string\"}" --dry-run
executed via ${CMD01[#]} > $logFile
generates error
Error parsing parameter '--launch-specification': Invalid JSON:
"{\"ImageId\":\"ami-9c7ad8eb\",\"KeyName\":\"myKey\",\"InstanceType\":\"t1.micro\",\"IamInstanceProfile\":{\"Arn\":\"arn:aws:iam::000000000000:instance-profile/myprofile\"},\"Placement\":{\"AvailabilityZone\":\"eu-west-1c\",\"GroupName\":\"myGroup\"},\"NetworkInterfaces\":[{\"DeviceIndex\":0,\"SubnetId\":\"subnet-xxxyyy\",\"AssociatePublicIpAddress\":true}],\"UserData\":\"string\"}"
Now if I take the earlier echo "$CMD01" output from the terminal and do a simple copy/paste, the CLI output is:
A client error (DryRunOperation) occurred when calling the RequestSpotInstances operation: Request would have succeeded, but DryRun flag is set.
So it seems the JSON is valid, but when executed from the bash script it is invalid. What am I doing wrong?

I think you have a bash error.
Try with:
CMD01=$(aws commands commands...)
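If the goal is to keep the request reusable, another way to sidestep the quoting problem (a sketch, not a drop-in replacement, reusing the same variables and positional parameters as the original script, with most JSON fields elided) is to build the launch specification separately and pass it to the CLI directly, so bash never has to re-parse the escaped quotes:
# Sketch: build the launch specification once, then pass it as a single quoted argument.
# $1, $2, $4, $6 are the same positional parameters used in the original script.
launch_spec="{\"ImageId\":\"${1}\",\"KeyName\":\"${2}\",\"NetworkInterfaces\":[{\"DeviceIndex\":0,\"SubnetId\":\"${4}\"}]}"
aws --profile "$myProf" --region "$myRegion" ec2 request-spot-instances \
    --spot-price "$PRICE" --instance-count "${6}" --type "one-time" \
    --launch-specification "$launch_spec" --dry-run > "$logFile"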

Related

Amazon AWS CLI not allowing valid JSON in payload parameter

I am getting an error when I try and invoke a lambda function from the AWS CLI. I am using version 2 of the CLI.
I understand that I should pass the --payload argument as a string containing a JSON object.
aws lambda invoke --function-name testsms --invocation-type Event --payload '{"key": "test"}' response.json
I get the following error:
Invalid base64: "{"key": "test"}"
I have tried all sorts of variants of JSON escaping characters etc. I have also tried the file://test.json option, but I receive the same error.
As @MCI said, AWS CLI v2 defaults to base64 input. For your case to work, simply add a --cli-binary-format raw-in-base64-out parameter to your command, so it'd be
aws lambda invoke --function-name testsms \
--invocation-type Event \
--cli-binary-format raw-in-base64-out \
--payload '{"key": "test"}' response.json
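If you don't want to pass the flag on every call, the same behaviour can also be set once in the AWS CLI config file (a sketch; cli_binary_format is an AWS CLI v2 setting, shown here for the default profile):
# ~/.aws/config
[default]
cli_binary_format = raw-in-base64-out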
Looks like awscli v2 requires some parameters be base64-encoded.
By default, the AWS CLI version 2 now passes all binary input and binary output parameters as base64-encoded strings. A parameter that requires binary input has its type specified as blob (binary large object) in the documentation.
The payload parameter to lambda invoke is one of these blob types that must be base64-encoded.
--payload (blob)
The JSON that you want to provide to your Lambda function as input.
One solution is to use openssl base64 to encode your payload.
echo '{"key": "test"}' > clear_payload
openssl base64 -out encoded_payload -in clear_payload
aws lambda invoke --function-name testsms --invocation-type Event --payload file://~/encoded_payload response.json
Firstly, a string is valid JSON.
In my case I had this problem
$ aws --profile diegosasw lambda invoke --function-name lambda-dotnet-function --payload "Just Checking If Everything is OK" out
An error occurred (InvalidRequestContentException) when calling the Invoke operation: Could not parse request body into json: Could not parse payload into json: Unrecognized token 'Just': was expecting ('true', 'false' or 'null')
at [Source: (byte[])"Just Checking If Everything is OK"; line: 1, column: 6]
and it turns out the problem was due to the AWS CLI trying to convert it to JSON. Escaping the double quotes did the trick
$ aws --profile diegosasw lambda invoke --function-name lambda-dotnet-function --payload "\"Just Checking If Everything is OK\"" out
{
"StatusCode": 200,
"ExecutedVersion": "$LATEST"
}
On Windows, I tried the following, which worked for me:
aws lambda invoke --function-name testsms --invocation-type Event --cli-binary-format raw-in-base64-out --payload {\"key\": \"test\"} response.json
Note that I added --cli-binary-format raw-in-base64-out to the command and escaped " as \" in the payload.
This solution worked for me and I find it simpler than having to remember/check the man page for the correct flags each time.
aws lambda invoke --function-name my_func --payload $(echo "{\"foo\":\"bar\"}" | base64) out
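One caveat worth noting (my addition, not part of the original answer): GNU base64 wraps its output at 76 characters, so for longer payloads you may need to disable wrapping, roughly like this:
aws lambda invoke --function-name my_func --payload "$(echo '{"foo":"bar"}' | base64 -w 0)" out
(-w 0 disables line wrapping in GNU coreutils base64; on macOS/BSD the option differs.)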
On my Windows PowerShell running LocalStack I had to use:
--payload '{\"key\": \"test\"}' response.json

Filtering with Azure CLI and JMESPath for network vnet peering list

I'm using the Azure CLI to get a list of vnet peerings: az network vnet peering list. This returns JSON in the following structure:
[
  {
    ...
    "name": "prefix1-name",
    ...
  },
  {
    ...
    "name": "prefix2-name",
    ...
  }
]
I am trying to filter the results by names starting with some prefix. I have tried various combinations of the following:
az network vnet peering list --resource-group my-rg --vnet-name my-vnet --query "[?starts_with(name,'prefix1-')].{name}"
However this always fails with a message like ].{name} was unexpected at this time.
What am I missing?
Try to use az network vnet peering list --resource-group my-rg --vnet-name my-vnet --query "[?starts_with(name,'prefix1-')].name". You do not need to include the {} around name. This works on my side.
Edit
For the error message, I can reproduce it with these Azure CLI commands in PowerShell locally on a Windows 10 machine, but it does not appear in my local Linux bash shell (as in the screenshots above). The error also does not appear in PowerShell or bash in the Azure Cloud Shell.
From my testing, it looks like a scenario specific to local PowerShell. When filtering with JMESPath, the starts_with or contains function seems to need a space somewhere in the expression; it works like these:
--query "[?starts_with(name, 'vnet')].name"
--query "[?starts_with (name,'vnet')].name"
--query "[?starts_with(name,'vnet') ].name"
--query "[?starts_with(name,'vnet')] .name"
but the same expression with no space at all does not work, as below:
--query "[?starts_with(name,'vnet')].name"
This is a Windows PowerShell issue: https://github.com/Azure/azure-cli/blob/dev/doc/use_cli_effectively.md#argument-parsing-issue-in-powershell. To work around it, insert --% after az so that PowerShell treats the remaining characters on the line as a literal.
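For example (a sketch based on the linked document; everything after --% is passed through to az unchanged):
az --% network vnet peering list --resource-group my-rg --vnet-name my-vnet --query "[?starts_with(name,'prefix1-')].name"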
The answer was apparently putting a space between the ) and ], so az network vnet peering list --resource-group my-rg --vnet-name my-vnet --query "[?starts_with(name,'prefix1-') ].name" worked for me. No idea why, and I have not found any documentation that mentions anything like this. If I remove that space I get the ].name was unexpected at this time message.

Zabbix Discovery with External check JSON

Zabbix 3.2.5 in Docker on the Alpine image (official build).
I have a problem with an external script and the JSON it returns.
The script json_data.sh is:
#!/bin/bash
# Generate JSON data for zabbix
declare -i i
fields=$1
data=($2)
json=""
i=0
while [ $i -lt ${#data[*]} ]; do
    row=""
    for f in $fields; do
        row+="\"{#$f}\":\"${data[$i]}\","
        i+=1
    done
    json+="{${row%,}},"
done
echo "{\"data\":[${json%,}]}"
The key string is:
json_data.sh["IP", "127.0.0.1 127.0.0.2 127.0.0.3"]
I tested it with a text item and got this result:
2539:20170515:095829.375 zbx_popen(): executing script
{"data":[{"{#IP}":"127.0.0.1"},{"{#IP}":"127.0.0.2"},{"{#IP}":"127.0.0.3"}]}
So the script returns valid JSON, but I still get the error Value should be a JSON object in discovery.
What is wrong with that JSON?
Template settings: in the screenshot, {$IPLIST} is just a macro = "127.0.0.1 127.0.0.2 127.0.0.3".
Error
This is a bug. When DebugLevel is greater than 3, Zabbix mixes part of the debug output, something like zbx_popen(): executing script, in with the value data.
The solution is to reduce DebugLevel to 3 or lower and wait until ZBX-12195 is fixed.
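For reference, a minimal sketch of that change in zabbix_server.conf (in the official Docker images this is usually controlled through an environment variable such as ZBX_DEBUGLEVEL; check the image documentation):
# zabbix_server.conf
DebugLevel=3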

How to pass an array to a jenkins parameterized job via remote access api?

I am trying to call a Jenkins parameterized job using a curl command, following the Jenkins remote access API documentation.
I have the Active Choices plugin. One of the parameters of the job is an Active Choices Reactive parameter.
Here is the screenshot of the job:
I am using the following curl command to trigger it with parameter:
curl -X POST http://localhost:8080/job/active-choice-test/buildWithParameters -u abhishek:token --data-urlencode json='{"parameter": [{"name":"state", "value":"Maharashtra"},{"name":"cities", "value":["Mumbai", "Pune"]}]}'
But I am not able to pass the cities parameter, which should be a JSON array. The above command gives an error.
I am printing the state & cities variables like this:
The job is getting executed and showing error for cities:
Started by user abhishek
Building in workspace /var/lib/jenkins/workspace/active-choice-test
[active-choice-test] $ /bin/sh -xe /tmp/hudson499503098295318443.sh
+ echo Maharashtra
Maharashtra
+ echo error
error
Finished: SUCCESS
How can I pass an array parameter to a Jenkins parameterized job using the remote access API?
You may change the value to a string rather than an array:
curl -X POST http://localhost:8080/job/active-choice-test/buildWithParameters -u abhishek:token --data-urlencode json='{"parameter": [{"name":"state", "value":"Maharashtra"},{"name":"cities", "value":"Mumbai,Pune"}]}'
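If the job then needs the individual cities, the shell build step can split the comma-separated string back apart; a minimal sketch, assuming the parameter names from the question:
# Split the comma-separated "cities" parameter back into individual values
for city in ${cities//,/ }; do
    echo "$city"
done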

Pass json into AWS CLI Invalid JSON

I have a script to add some custom metric data, and it works great if I write the metric data to a file and then read that in, like:
aws cloudwatch put-metric-data --namespace "ec2" --metric-data file://metric2.json
But if I have the script just print the data and call it like this:
aws cloudwatch put-metric-data --namespace "ec2" --metric-data $(python aws-extra-metrics.py)
I get the following error:
Error parsing parameter '--metric-data': Invalid JSON:
Is there any way around this? I would prefer not to have to write it to a file every time, as this will be run from a cron job.
We are running Ubuntu.
Is the Python script generating the JSON file? The difference is between passing a file name and passing the file contents.
You could try:
python aws-extra-metrics.py > metric2.json && aws cloudwatch put-metric-data --namespace "ec2" --metric-data file://metric2.json
or
aws cloudwatch put-metric-data --namespace "ec2" --metric-data $(python aws-extra-metrics.py)
You may need quotes around the invocation of the Python script.
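Concretely, that second variant with quotes would look like this (a sketch; the quotes keep the generated JSON together as a single argument instead of letting the shell word-split it):
aws cloudwatch put-metric-data --namespace "ec2" --metric-data "$(python aws-extra-metrics.py)"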