My idea is to put a JSON string into a variable JSON. I have a command that creates a console login for an AWS IAM user:
aws iam create-login-profile --cli-input-json file://create-login-profile.json
The file create-login-profile.json has the following content:
{
  "UserName": "roberto.viquezzz",
  "Password": "aaaaaaaaaaa",
  "PasswordResetRequired": true
}
I tried to write a bash script that contains a JSON variable "JSON", like the following code:
JSON="{\"UserName\": \"roberto.viquezzz\",\"Password\": \"aaaaaaaaaaa\",\"PasswordResetRequired\" : true}"
aws iam create-login-profile --cli-input-json $JSON
Typing ./file.sh in the console should execute the script and create a console user. But when I try to execute this code I get an error:
Unknown options: "aaaaaaaaaaa","PasswordResetRequired", :, true}, "roberto.viquezzz","Password":
But if I execute the same command directly from the command line, like:
aws iam create-login-profile --cli-input-json "{\"UserName\": \"roberto.viquezzz\",\"Password\": \"aaaaaaaaaaa\",\"PasswordResetRequired\": true}"
everything is OK. Maybe somebody knows what's wrong? Please suggest!
Put quotes around $JSON:
aws iam create-login-profile --cli-input-json "$JSON"
The quotes that are there during the assignment get consumed by the shell. You can verify this by issuing echo $JSON.
By adding the quotes you will make sure that the entire string is passed to the command "aws" as a single argument.
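Putting it together, a minimal version of the script (a sketch using the placeholder values from the question) looks like this; single-quoting the assignment also avoids the backslash escapes entirely:
#!/bin/bash
# Single quotes keep the inner double quotes literal, so no escaping is needed.
JSON='{"UserName": "roberto.viquezzz", "Password": "aaaaaaaaaaa", "PasswordResetRequired": true}'
# Double quotes around $JSON pass the whole string to aws as one argument.
aws iam create-login-profile --cli-input-json "$JSON"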
I have 4 environment variables on my laptop: ami_id, instance_type, key_name, and security_group_ids. I am trying to create a launch template version using these variables, but I do not know how to pass them into the JSON properly:
aws ec2 create-launch-template-version --launch-template-id lt-xxx --launch-template-data '{"ImageId":"$ami_id", "InstanceType": "$instance_type", "KeyName": "$key_name", "SecurityGroupIds": ["$security_group_ids"]}'
An error occurred (InvalidAMIID.Malformed) when calling the CreateLaunchTemplateVersion operation: The image ID '$ami_id' is not valid. The expected format is ami-xxxxxxxx or ami-xxxxxxxxxxxxxxxxx.
Using a here-document allows you to feed some readable text into a variable while expanding the shell variables, like this:
#!/bin/sh
ami_id=ami1234
instance_type=t3.nano
key_name=key1
security_group_ids=sg123,sg456
template_data=$(cat <<EOF
{
"ImageId":"$ami_id",
"InstanceType": "$instance_type",
"KeyName": "$key_name",
"SecurityGroupIds": ["$security_group_ids"]
}
EOF
)
echo "$template_data"
You can then test the JSON syntax with jq:
./template.sh | jq -c
{"ImageId":"ami1234","InstanceType":"t3.nano","KeyName":"key1","SecurityGroupIds":["sg123,sg456"]}
In Terraform, I am trying to make a PUT call via a curl command using null_resource, executing the command in a local-exec provisioner. The body of the request is a JSON array.
The command expects the data in string format. I have the data as a tuple in locals. I am using jsonencode to serialize the tuple to a string and pass it to curl. But after jsonencode, the formed string has an additional \ prefixed before each " in the JSON string.
E.g. the expected JSON string is:
"[{"key-1" : "value-1", "key-2": "value-2"}]"
But after jsonencode, the formatted string is as follows:
"[{\"key-1\" : \"value-1\", \"key-2\": \"value-2\"}]"
An additional backslash is added along with the quotes.
Because of the malformed request body, the API returns a Bad Request response.
How can I correctly serialize the tuple/list to a correct JSON string in Terraform and pass it to the curl command?
This is my null_resource:
resource "null_resource" "call_API" {
for_each = { for x in local.array: x.name => {
name = x.name
vars = jsonencode(x.vars)
} }
provisioner "local-exec" {
command = <<EOF
curl -X PUT ${var.url} -d ${each.value.vars} -H 'Content-Type:application/json' -H 'Authorization:${local.bearer_token}'
EOF
}
I think I understand that you want to pass this variable:
"[{"key-1" : "value-1", "key-2": "value-2"}]"
Have you tried using EOF? You can do something like this:
variable "body" {
  type = string
}
body = <<EOF
"[{"key-1" : "value-1", "key-2": "value-2"}]"
EOF
Or with EOT; honestly I don't remember which one works.
EDIT:
If the JSON is not static and you receive it from another module or elsewhere, you can expand the variable inside the EOF markers:
body = <<EOF
${var.json}
EOF
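Applied to the provisioner from the question, the same idea means interpolating the jsonencode() result inside the heredoc and wrapping it in single quotes, so the shell hands curl the whole body as one argument (a sketch reusing the question's names, assuming the body contains no single quotes; the backslashes Terraform prints are only its display escaping, not part of the actual string):
provisioner "local-exec" {
  command = <<EOF
curl -X PUT ${var.url} -d '${each.value.vars}' -H 'Content-Type: application/json' -H 'Authorization: ${local.bearer_token}'
EOF
}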
I am trying to submit a Spark job where I set a date argument in a conf property, and I am running it through a script in NiFi. However, when I run the script I am facing an error.
Spark submit command in the script:
aws emr add-steps --cluster-id "$1" --steps '[{"Args":["spark-submit","--deploy-mode","cluster","--jars","s3://tvsc-lumiq-edl/jars/ojdbc7.jar","--executor-memory","10g","--driver-memory","10g","--conf","spark.hadoop.yarn.timeline-service.enabled=false","--conf","currDate='\"$5\"'","--class",'\"$2\"','\"$3\"','\"$4\"'],"Type":"CUSTOM_JAR","ActionOnFailure":"CONTINUE","Jar":"command-runner.jar","Properties":"","Name":"Spark application"}]' --region "$6"
and after I run it, I get the below error:
ExecuteStreamCommand[id=5b08df5a-1f24-3958-30ca-2e27a6c4becf] Transferring flow file StandardFlowFileRecord[uuid=00f844ee-dbea-42a3-aba3-0edcabfc50a2,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1607082757752-507103, container=default, section=223], offset=29, length=-1],offset=0,name=6414901712887990,size=0] to nonzero status. Executable command /bin/bash ended in an error:
Error parsing parameter '--steps': Invalid JSON:
[{"Args":["spark-submit","--deploy-mode","cluster","--jars","s3://tvsc-lumiq-edl/jars/ojdbc7.jar","--executor-memory","10g","--driver-memory","10g","--conf","spark.hadoop.yarn.timeline-service.enabled=false","--conf","currDate="Fri
Where am I going wrong?
You can use JSONLint to validate your JSON, which makes it easier to see why it's wrong.
In your case, you are wrapping the final 3 values in single quotes (') rather than double quotes (").
Your steps JSON should look like:
[{
"Args": [
"spark-submit",
"--deploy-mode",
"cluster",
"--jars",
"s3://tvsc-lumiq-edl/jars/ojdbc7.jar",
"--executor-memory",
"10g",
"--driver-memory",
"10g",
"--conf",
"spark.hadoop.yarn.timeline-service.enabled=false",
"--conf",
"currDate='\"$5\"'",
"--class",
"\"$2\"",
"\"$3\"",
"\"$4\""
],
"Type": "CUSTOM_JAR",
"ActionOnFailure": "CONTINUE",
"Jar": "command-runner.jar",
"Properties": "",
"Name": "Spark application"
}]
Specifically, these 3 lines:
"\"$2\"",
"\"$3\"",
"\"$4\""
Instead of the original:
'\"$2\"',
'\"$3\"',
'\"$4\"'
I have created an AWS Lambda function in .NET Core and deployed it.
I have tried executing the function in the AWS console with a test case and it works, but I am not able to achieve the same with the CLI command:
aws lambda invoke --function-name "mylambda" --log-type Tail --payload file://D:/Files/lamdainputfile.json file://D:/Files/response.txt
I get the following error with the CLI command:
An error occurred (InvalidRequestContentException) when calling the Invoke operation: Could not parse request body into json: Unexpected character ((CTRL-CHAR, code 138)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: (byte[])"�zn�]t�zn�m�"; line: 1, column: 2]
I tried passing inline JSON:
aws lambda invoke --function-name "mylambda" --log-type Tail --payload "{'input1':'100', 'input2':'200'}" file://D:/Files/response.txt
but it's not working.
The Lambda function executes in the AWS console with a test case and gives the correct result. I added the same input to a local JSON file and tried it with the CLI command.
JSON input:
{
"input1": "100",
"input2": "200"
}
EDIT:
After correcting the inline JSON, I am getting an error for the output file:
Unknown options: file://D:/Files/response.txt
Is there any way to print the output in the CLI only?
The documentation has not been updated since CLI version 1. For AWS CLI version 2 we need to base64 encode the payload.
Mac:
payload=`echo '{"input1": 100, "input2": 200 }' | openssl base64`
aws lambda invoke --function-name myfunction --payload "$payload" SomeOutFile &
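On Linux, the same approach works with the coreutils base64 tool instead of openssl (a sketch, reusing the same hypothetical function name):
payload=$(echo '{"input1": 100, "input2": 200 }' | base64)
aws lambda invoke --function-name myfunction --payload "$payload" SomeOutFile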
Adding the option --cli-binary-format raw-in-base64-out will allow you to pass raw JSON in the invoke command.
aws lambda invoke \
--cli-binary-format raw-in-base64-out \
--function-name "mylambda" \
--payload '{"input1": "100", "input2": "200"}' \
D:/Files/response.txt
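To answer the EDIT: the output file argument is a plain path, and on Linux/macOS you can pass /dev/stdout to print the response directly in the terminal (a sketch; on Windows, write to a file and print it afterwards):
aws lambda invoke --cli-binary-format raw-in-base64-out --function-name "mylambda" --payload '{"input1": "100", "input2": "200"}' /dev/stdout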
Based on the AWS CLI invoke command options, --payload only accepts inline blob arguments (i.e. JSON). In other words, the --payload parameter cannot be used to read input from a file, so --payload file://D:/Files/lamdainputfile.json will not work.
In the example provided what probably happens is --payload is ignored, file://D:/Files/lamdainputfile.json is treated as <outfile>, and an error is raised for file://D:/Files/response.txt as it is an unexpected positional argument.
What is required is reading the contents of D:/Files/lamdainputfile.json with a separate command. How this can be done is different based on the type of shell used. Bash example:
aws lambda invoke --payload "$(cat /path/to/input.json)" ...
Original answer:
I don't know about the first case (--payload file:///...), however the second case is not valid JSON, as JSON requires strings to be double quoted. Try the following JSON:
{
"input": "100",
"input2": "200"
}
I am trying to post a JSON string using curl in Scala. My curl command works fine when executed from a Linux box, but it always throws an error ("message": "Must provide query string.") from Scala.
My working curl command in Linux:
curl http://laptpad1811:5000/graphql -H "Content-Type: application/json"
-X POST -d '{"query":"mutation
CreateFileReceivedEvent($createFileReceivedEventInput:
CreateFleReceivedEventInput!) { createFileReceivedEvent(input:
$createFileReceivedEventInput) { clientMutationId }}","variables":
{"createFileReceivedEventInput":
{"clientMutationId":"Test","fileReceivedEvent":{"file":
{"fileTrackingId":"83a86c44-66a5-4de0-9b7f-
c6995877279d","name":"textfile_2017-08-21T15:58:45Z","fileType":
{"code":"textfile"}},"eventTimestamp":"2017-08-
21T15:59:30Z"}}},"operationName":"CreateFileReceivedEvent"}'
My Scala code:
Step 1: copy the entire JSON string (payload) to a text file:
'{"query":"mutation CreateFileReceivedEvent($createFileReceivedEventInput:
CreateFleReceivedEventInput!) { createFileReceivedEvent(input:
$createFileReceivedEventInput) { clientMutationId }}","variables":
{"createFileReceivedEventInput":
{"clientMutationId":"Test","fileReceivedEvent":{"file":
{"fileTrackingId":"83a86c44-66a5-4de0-9b7f-
c6995877279d","name":"textfile_2017-08-21T15:58:45Z","fileType":
{"code":"textfile"}},"eventTimestamp":"2017-08-
21T15:59:30Z"}}},"operationName":"CreateFileReceivedEvent"}'
Step 2:
val data=fromFile("/usr/test/data.txt").getLines.mkString
Step 3:
val cmd = Seq("curl", "http://laptpad1811:5000/graphql", "-H",
"'Content-Type:application/json'" ,"-X", "POST", "-d" , data)
Step 4:
cmd.!!
I get the below error
String =
"{
"errors": [
{
"message": "Must provide query string.",
"stack": "BadRequestError: Must provide query string.\n
I have tried changing " to ' and multiple combinations of the JSON string, but I always get the same error.
I suspect that your issue is that sys.process doesn't pass commands through the shell (e.g. bash), so quotes that are necessary in the shell become unnecessary in Scala (and get passed through to the command, which in the case of Unix-style utilities will probably result in unexpected behavior).
So try:
val cmd = Seq("curl", "http://laptpad1811:5000/graphql", "-H", "Content-Type: application/json", "-X", "POST", "-d", data)
Likewise remove the single quote wrapping from your text file.
I would, however, counsel against spawning curl from Scala and advise using one of the existing HTTP client libraries (I personally like Gigahorse).