AWS SNS how to add line breaks in message - json

I'm trying to send SNS messages via the CLI in JSON format:
aws sns publish --cli-input-json "{\"TopicArn\":\"xxx\",\"Message\":\"first line\n second line\",\"Subject\":\"Empty subject\"}"
But the \n doesn't work, and neither does "\r\n" or "\n". I think the string is escaped by SNS, so the \n doesn't come through. Does anyone know how to send a message of two lines? (Sending two messages is not an option.) I'd appreciate your advice!

I think \\n is actually what you are looking for. I've just tested it by sending push notifications to my device through AWS SNS.
So your message should look like this:
aws sns publish --cli-input-json "{\"TopicArn\":\"xxx\",\"Message\":\"first line\\nsecond line\",\"Subject\":\"Empty subject\"}"
Note that you should not leave a space after the line-break symbol; otherwise, your new line will start with that space.
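If it helps, one way to sanity-check the escaping is to let Python's json module build the payload: a literal newline in the Python string comes out as \n inside the JSON, which is what --cli-input-json needs to receive. A small sketch (the topic ARN is just a placeholder):
import json

payload = {
    "TopicArn": "arn:aws:sns:us-east-1:123456789012:my-topic",  # placeholder ARN
    "Message": "first line\nsecond line",
    "Subject": "Empty subject",
}

# json.dumps escapes the real newline as \n inside the JSON string
print(json.dumps(payload))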

aws sns publish --topic-arn "arn:aws:sns:us-west-2:0123456789012:my-topic" --message file://message.txt
message.txt is a text file containing the message to publish:
Hello World
Second Line
Putting the message in a text file allows you to include line breaks.

This worked out for me:
"first line
second line"

I am publishing messages to the email protocol using the Node.js aws-sdk. For exceptions to appear correctly, I needed to replace both \n and \\n, and to appease both Windows and Mac clients, I used \r\n.
message.replace(/\n|\\n/g, '\r\n')
For anyone who needs the full code, this is how I am handling errors in TypeScript:
public prepareMessage(header: string, error: any) {
  const data = (error instanceof Error)
    ? JSON.stringify(error, Object.getOwnPropertyNames(error), 2)
    : JSON.stringify(error, null, 2);
  const replaceNewlines = (str: string) => str?.replace(/\n|\\n/g, '\r\n') || '';
  return `${replaceNewlines(header)}\r\n${replaceNewlines(data)}`;
}
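If you are doing the same kind of cleanup from Python rather than Node, a rough equivalent of that replacement (just a sketch; the function name is mine) would be:
import re

def normalize_newlines(message):
    # Collapse both real newlines and escaped "\n" sequences into CRLF,
    # mirroring the /\n|\\n/g replacement above.
    return re.sub(r'\n|\\n', '\r\n', message or '')

print(normalize_newlines('first line\nsecond line \\n third line'))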

Four backslashes work for me, using AWS SNS with Firebase.
Example: \\\\n
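Presumably the reason four backslashes end up being needed with Firebase is double encoding: when publishing to a platform endpoint, the SNS Message is itself a JSON document whose GCM value is a JSON-encoded string, so the newline is escaped twice before the CLI quoting even comes into play. A rough Python sketch of that double encoding (the payload keys are only illustrative):
import json

# The FCM/GCM payload is itself JSON...
fcm_payload = json.dumps({"notification": {"body": "first line\nsecond line"}})

# ...and SNS wraps it again under a protocol key, so \n becomes \\n here
sns_message = json.dumps({"default": "fallback text", "GCM": fcm_payload})

print(sns_message)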

After testing all the suggested answers, here's what worked in my case (running from a Python Lambda function, publishing from the boto3 SNS client):
This created 2 new lines: message.replace('\n', '\r\n')
This created 1 new line: message.replace('\n', '\r')
Example:
message = message.replace('\n', '\r').replace('\t', ' ')
# Sending the notification...
snsclient.publish(
    TargetArn=SNS_EMAIL_ALERTS_ARN,
    Subject=f'{filter_name} Alert: ({lambda_func_name[3]})',
    Message=message
)

Related

How to properly format JSON in PowerShell while using aws-cli?

I'm getting an error sending a JSON structure using the aws-cli in PowerShell, specifically on a call to put an item into an existing DynamoDB table.
The problem seems to be the lack of double quotes around keys and values in the JSON object I'm attempting to send. I've read that PowerShell is finicky about outputting double quotes, especially when leveraging external APIs.
Unfortunately, since my org uses Okta for authenticating AWS requests, I have to use PowerShell.
I've tried everything that I've seen here:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/convertto-json?view=powershell-6
...here:
Error parsing parameter '--expression-attribute-values': Invalid JSON: Expecting property name enclosed in double quotes: line 1 column 3 (char 2)
...here:
https://github.com/aws/aws-cli/issues/1326
...and here:
PowerShell: best way to escape double quotes in string passed to an external program? E.g., a JSON string
WHAT I'VE TRIED:
This is the basic first attempt:
okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item '{"deviceId": {"S":"amzn1.ask.device.AEH2LHYGV7GSPP5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC7CSQK634K4APEBRNVOKVZIDECOCBBIFB4"},"roomNumber": {"N":9110}}' --return-consumed-capacity TOTAL
Then I tried escaping with backslash:
okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item {\"deviceId\":{\"S\":\"amzn1.ask.device.AEH2LHYGV7GSPP5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC7CSQK634K4APEBRNVOKVZIDECOCBBIFB4\"},\"roomNumber\": {\"N\":9110}} --return-consumed-capacity TOTAL
Then escaping with backtick (which I've replaced here with an asterisk so SO would read it as code) and backslash:
{*"deviceId*": {*"S*":*"amzn1.ask.device.AEH2LHYGV7GSPP
5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC
7CSQK634K4APEBRNVOKVZIDECOCBBIFB4*"},*"roomNumber*": {*"N*":9110}}" --return- consumed-capacity TOTAL
I then tried a "here string" to no avail.
EXPECTATIONS and RESULTS:
I would expect a method of escaping that's in the Microsoft documentation to work.
Each of the above gave this error with a variation of the problematic "JSON received" based on the escape method, but it never had double quotes around keys and values:
Error parsing parameter '--item': Invalid JSON: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
JSON received: {deviceId: {S:amzn1.ask.device.AEH2LHYGV7GSPP5THMR5H56AI2OOMAQ7MF54CZ3E6WR433WGS6QAOCYCKJWRJ3TQY5IE76NWR2IKCANB6TJNKLDEZOO2YN6ACUVT33MKSS4CO6R7GJI6GDFLOBOPUA2IXX7RI732UXJ6PDST5KYC7CSQK634K4APEBRNVOKVZIDECOCBBIFB4},roomNumber: {N:9110}}
The only thing that seemed to work was using "file://file.json" as the input to --item, which I couldn't find documented anywhere... I think it was on the GitHub thread I linked. However, I'd rather not have to edit a file every time I want to send JSON with an AWS API call... Here it is:
okta-aws dh dynamodb put-item --table-name AlexaRoomLookup-dev --item file://file.json --return-consumed-capacity TOTAL
Can anyone provide info other than what's listed here as to why the above methods wouldn't work? Have I just implemented them incorrectly?
Thanks.
I was having the same issue while trying to send a message to Amazon SQS with JSON body using PowerShell. After trying different escape characters, the following worked for me.
aws sqs send-message --queue-url "<queue-url>" --message-body '{\""key1\"": \""value1\"",\""key2\"": \""value2\"",\""key3\"": \""value3\"" }'
OS: Windows 10 Pro (Version 1803)
AWS CLI version: 1.16.180
For further information, see the official documentation.
Using quotation marks with strings in the AWS CLI
You could try the PowerShell "stop-parsing symbol" ("--%") near the start of the command. This tells PowerShell to pass the rest of the line to the program verbatim.
PS> okta-aws --% dh dynamodb put-item --table-name AlexaRoomLookup-dev --item '{"deviceId": {"S":"amzn1.ask.device.AEH...etc...FB4"},"roomNumber": {"N":9110}}' --return-consumed-capacity TOTAL
See about_parsing for more details...
It won't help if your JSON is in a variable, but if it's hard-coded like your example above it might work.
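Another way to sidestep PowerShell's quoting entirely, if a little scripting is acceptable, is to generate the item file and pass it via the file:// form the question already found to work. A rough Python sketch (it assumes the okta-aws wrapper can be invoked from the PATH; the device id is truncated here):
import json
import subprocess
import tempfile

item = {
    "deviceId": {"S": "amzn1.ask.device.AEH...FB4"},  # truncated
    "roomNumber": {"N": "9110"},  # DynamoDB JSON represents numbers as strings
}

# Write the item to a temp file so no shell quoting is involved
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(item, f)
    path = f.name

subprocess.run(
    ["okta-aws", "dh", "dynamodb", "put-item",
     "--table-name", "AlexaRoomLookup-dev",
     "--item", "file://" + path,
     "--return-consumed-capacity", "TOTAL"],
    check=True,
)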

ARM.Template from bash-script. Unterminated string. Expected delimiter:

I am writing a bash script for uploading a certificate from a Linux server to Azure Key Vault using the "armclient".
I am following this guide on how to use the armclient:
https://blogs.msdn.microsoft.com/appserviceteam/2016/05/24/deploying-azure-web-app-certificate-through-key-vault/
The command I want to run is this:
ARMClient.exe PUT /subscriptions/<Subscription Id>/resourceGroups/<Server Farm Resource Group>/providers/Microsoft.Web/certificates/<User Friendly Resource Name>?api-version=2016-03-01 "{'Location':'<Web App Location>','Properties':{'KeyVaultId':'<Key Vault Resource Id>', 'KeyVaultSecretName':'<Secret Name>', 'serverFarmId':'<Server Farm (App Service Plan) resource Id>'}}"
I have created a string that populates all the fields required:
putparm=$resolved_armapi" \"{'Location':'$resolved_locationid','Properties':{'KeyVaultId':'$resolved_keyvaultid','KeyVaultSecretName':'$certname','serverFarmId':'$resolved_farmid'}}"\"
When I echo the variable putparm, the result looks as expected (names/IDs X-ed out):
/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.Web/certificates/XXXX-XXXXX-XXXXX?api-version=2016-03-01 "{'Location':'Central US','Properties':{'KeyVaultId':'/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.KeyVault/vaults/XXXXXXXX','KeyVaultSecretName':'XXXX-XXXXX-XXXXX','serverFarmId':'/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.Web/serverfarms/ServicePlan59154b1c-XXXX'}}"
When I run armclient put $putparm in the script, I get this error:
"error": {
"code": "InvalidRequestContent",
"message": "The request content was invalid and could not be deserialized: 'Unterminated string. Expected delimiter: \". Path '',
line 1, position 21.'." }
But when I take the output of the $putparm variable and run the command "manually" on the server, it works.
I guess it's something about the way Linux stores the variables and that the API is expecting JSON (or something...).
I'd be happy for any help.
The way you define your variable putparm is wrong.
Because the variable is expanded unquoted, the shell splits its value on whitespace (for example at the space in 'Central US'), so the API receives only a fragment of the JSON and reports an unterminated string. Note that a simple string like "hello" is valid JSON data, but it is probably not what your server is expecting.
You should quote your variable correctly:
putparm="{\"Location\":\"$resolved_locationid\",\"Properties\":{\"KeyVaultId\":\"$resolved_keyvaultid\",\"KeyVaultSecretName\":\"$certname\",\"serverFarmId\":\"$resolved_farmid\"}}"
and use it like this:
armclient put "$resolved_armapi" "$putparm"

How to import Google Maps API into PostgreSQL?

I am trying to transfer data from a JSON file produced by the Google Maps API into my PostgreSQL database. This is done through cURL, and I made sure that the permissions are set correctly.
The url:
https://maps.googleapis.com/maps/api/distancematrix/json?units=imperial&origins=London&destinations=Paris&key=AIza-[key-redacted]-3z6ho-o
The query:
copy bookings.import(info) from program 'C:/temp/mycurl/curl "https://maps.googleapis.com/maps/api/distancematrix/json?units=imperial&origins=London&destinations=Paris&key=AIzaSyBIhOMI68hTIFarH4jrb_eKUmvY3z6ho-o" --insecure'
However, when I try to do this on my table with column 'info' of type 'json', I get the following error:
ERROR: invalid input syntax for type json
DETAIL: The input string ended unexpectedly.
CONTEXT: JSON data, line 1: {
COPY import, line 1, column info: "{"
********** Error **********
ERROR: invalid input syntax for type json
SQL state: 22P02
Detail: The input string ended unexpectedly.
Context: JSON data, line 1: {
COPY import, line 1, column info: "{"
I am trying not to involve things such as PHP or any other tools at the moment, but if that is the only option I would certainly consider it.
What exactly do you guys think I am doing wrong? Is it the syntax, the format or am I missing something?
Thanks!
COPY assumes that each newline indicates a new record. Unfortunately, the Google Maps DistanceMatrix API pretty-prints its response, which means it comes through as 23 rows, none of which is valid JSON on its own.
You can get around this by piping the curl response through something like jq.
copy imports(info) from program 'curl "https://maps.googleapis.com/maps/api/distancematrix/json?units=imperial&origins=London&destinations=Paris&key=<my_key>" --insecure | /usr/local/bin/jq "." -c'
jq has lots of useful features if you want to massage the response a bit more before stashing it in the database.
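If you would rather not depend on jq, any JSON parser that re-serializes onto a single line will do the same compaction. A rough Python sketch of the idea (the key is a placeholder, and the curl | jq pipeline above is the simpler route):
import json
from urllib.request import urlopen

url = ("https://maps.googleapis.com/maps/api/distancematrix/json"
       "?units=imperial&origins=London&destinations=Paris&key=<my_key>")

with urlopen(url) as resp:
    data = json.load(resp)  # parse the pretty-printed response

print(json.dumps(data))  # re-serialized on one line, like `jq "." -c`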

Setting Jenkins build name from package.json version value

I want to include the value of the "version" parameter in package.json as part of the Jenkins build name.
I'm using the Jenkins Build Name Setter plugin - https://wiki.jenkins-ci.org/display/JENKINS/Build+Name+Setter+Plugin
So far I've tried to use PROPFILE syntax in the "Build name macro template" step:
${PROPFILE,file="./mainline/projectDirectory/package.json",property="\"version\""}
This successfully creates a build, but includes the quotes and comma surrounding the value of the version property in package.json, for example:
"0.0.1",
I want just the value inside returned, so it reads
0.0.1
How can I do this? Is there a different plugin that would work better for parsing package.json and getting it into the template, or should I resort to some sort of regex for removing the characters I don't want?
UPDATE:
I tried using token transforms based on reading the Token Macro Plugin documentation, but it's not working:
${PROPFILE%\"\,#\",file="./mainline/projectDirectory/package.json",property="\"version\""}
still just returns
However, using only one escaped character and only one of # or % works. No other combinations I tried work.
${PROPFILE%\,,file="./mainline/projectDirectory/package.json",property="\"version\""}
which returns "0.0.1" (comma removed)
${PROPFILE#\"%\"\,,file="./mainline/projectDirectory/package.json",property="\"version\""}
which returns "0.0.1", (no characters removed)
UPDATE:
Tried to use the new Jenkins Token Macro plugin's JSON macro with no luck.
Jenkins Build Name Setter set to update the build name with Macro:
${JSON,file="./mainline/pathToFiles/package.json",path="version"}-${P4_CHANGELIST}
Jenkins build logs for this job show:
10:57:55 Evaluated macro: 'Error processing tokens: Error while parsing action 'Text/ZeroOrMore/FirstOf/Token/DelimitedToken/DelimitedToken_Action3' at input position (line 1, pos 74):
10:57:55 ${JSON,file="./mainline/pathToFiles/package.json",path="version"}-334319
10:57:55 ^
10:57:55
10:57:55 java.io.IOException: Unable to serialize org.jenkinsci.plugins.tokenmacro.impl.JsonFileMacro$ReadJSON#2707de37'
I implemented a new macro JSON, which takes a file and a path (which is the key hierarchy in the JSON for the value you want) in token-macro-2.1. You can only use a single transform per macro usage.
Try the token transformations # and % (see the Token Macro Plugin):
${PROPFILE#"%",file="./mainline/projectDirectory/package.json",property="\"version\""}
(This will only help if you are using Pipelines. But for what it's worth...)
What works for me is a combination of readJSON from the Pipeline Utility Steps plugin and directly setting currentBuild.displayName, thusly:
script {
    // readJSON from "Pipeline Utility Steps"
    def packageJson = readJSON file: 'package.json'
    def version = packageJson.version
    echo "Setting build version: ${packageJson.version}"
    currentBuild.displayName = env.BUILD_NUMBER + " - " + packageJson.version
    // currentBuild.description = "other cool stuff"
}
Omitting error handling etc obvs.
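If neither macro cooperates, the extraction itself is easy to do outside the plugin (for example from a script step) and the result can then be handed to whatever sets the build name. A minimal sketch, with the path taken from the question:
import json

# Read the "version" value directly, without the surrounding quotes and comma
# that the PROPFILE macro picks up.
with open("./mainline/projectDirectory/package.json") as f:
    version = json.load(f)["version"]

print(version)  # e.g. 0.0.1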

Why do consecutive event JSONs fall on the same line in some githubarchive files?

In http://www.githubarchive.org/, which Ilya Grigorik has provided, I found that in many .gz files some consecutive events are logged on the same line.
For example, in 2011-03-15-21.json.gz. To get that file:
wget http://data.githubarchive.org/2011-03-15-21.json.gz
In this .gz, for example, if you search for id 1484832, you can find that two consecutive events (JSONs) are on the same line; see
http://codebeautify.org/jsonviewer/2cb891
The two JSONs on that line are a combination of
http://codebeautify.org/jsonviewer/c7e18e
and
http://codebeautify.org/jsonviewer/945d56
What is the impact?
When I was loading each line with Python's json.loads (why Python? because I feel Python is comfortable for dealing with JSON), it said the line was invalid, as it was a combination of two JSONs.
Questions:
1) How did you solve this kind of bug when you processed the GitHub Archive data?
2) I already have the data locally, so how can I overcome this problem? Should I write code specific to this case? The code I wrote was like:
jsonlist = line.split('}{')
json.loads(jsonlist[0] + '}', "ISO-8859-1") # load and navigate through this json
json.loads('{' + jsonlist[1], "ISO-8859-1") # load and navigate through this json
I found the solution.
1) How did you solve this kind of bug when you processed the GitHub Archive data?
https://github.com/vadasg/githubarchive-parser/blob/master/src/FixGitHubArchiveDelimiters.rb
This script fixes the problem of two or more events appearing on the same line, so after running it the JSONs appear on separate lines.
2) I already have the data locally, so how can I overcome this problem? Should I write code specific to this case?
This script removes the need to write the code I mentioned above.
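If you prefer to stay in Python instead of running the Ruby fixer, the standard library can also walk a line of concatenated objects without the fragile split('}{'); a small sketch:
import json

def iter_concatenated_json(line):
    # Yield each JSON object from a line that may contain several
    # concatenated objects, e.g. '{...}{...}'.
    decoder = json.JSONDecoder()
    idx = 0
    while idx < len(line):
        obj, end = decoder.raw_decode(line, idx)
        yield obj
        idx = end
        while idx < len(line) and line[idx].isspace():
            idx += 1  # skip any whitespace between objects

for event in iter_concatenated_json('{"id": 1}{"id": 2}'):
    print(event["id"])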
Note:
Related issues found on the GitHub Archive project on GitHub:
https://github.com/igrigorik/githubarchive.org/issues/53
https://github.com/igrigorik/githubarchive.org/issues/17
WARNING:
When I was running this script I got an error related to the encoding used, because by default the Yajl::Parser.parse(jsonInputFile) line checks whether the characters it parses adhere to UTF-8 encoding and throws an error if they don't.
As GitHub data also contains non-UTF-8 characters, this error will be thrown in our case too. To bypass that problem (or maybe as a fix) I changed it to:
Yajl::Parser.parse(jsonInputFile, :check_utf8 => false)
For details, refer to the docs: http://rdoc.info/github/brianmario/yajl-ruby/Yajl/Parser.parse