I'm trying to build a Step Function that receives JSON as input and then uses only some of that JSON in a message sent via SNS from the Step Function. I've tried using some of the intrinsic JSON manipulation functions available, but with no luck.
Is there a way to extract specific JSON fields for an SNS message without using a Lambda?
For the message field, I would like it to be:
message.$: $.name $.questions etc...
But this doesn't work
Here is my code:
stepFunctions:
  stateMachines:
    hellostepfunc1:
      name: test
      definition:
        Comment: "test"
        StartAt: SNSState
        States:
          SNSState:
            Type: Task
            InputPath: $
            Resource: arn:aws:states:::sns:publish
            Parameters:
              TopicArn:
                Fn::GetAtt: [ MyTopic, TopicArn ]
              Message.$: $  # here I would like to send multiple fields, e.g. $.name, $.questions
            End: true
The best way is to use an AWS Lambda function. That is, develop a custom AWS Lambda function that reads and manipulates the JSON to meet your business requirements using a JSON library, then hook that Lambda function into your Amazon States Language document.
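For example, a minimal sketch of such a function (Node.js; the name and questions fields come from the question, everything else is illustrative):

// Hypothetical Lambda: pick only the fields the notification needs out of
// the state input and return them as a single message string.
exports.handler = async (event) => {
    const message = `Name: ${event.name}, Questions: ${event.questions}`;
    return { message };
};

A Task state would invoke this function before the SNS state, and the publish step could then reference the result with Message.$: $.message.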
Is there a known approach for getting JSON data logged via CloudWatch imported into an Elasticsearch instance as well-structured JSON?
That is, I'm logging JSON data during the execution of an AWS Lambda function.
This data is available via Amazon's CloudWatch service.
I've been able to import this data into an Elasticsearch instance using Functionbeat, but the data comes in as an unstructured message.
"_source" : {
"#timestamp" : "xxx",
"owner" : "xxx",
"message_type" : "DATA_MESSAGE",
"cloud" : {
"provider" : "aws"
},
"message" : ""xxx xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx INFO {
foo: true,
duration_us: 19418,
bar: 'BAZ',
duration_ms: 19
}
""",
What I'm trying to do is get a document indexed into Elasticsearch that has a foo field, a duration_us field, a bar field, etc., instead of one that has a plain-text message field.
It seems like there are a few different ways to do this, but I'm wondering if there's a well-trodden path for this sort of thing using Elastic's default tooling, or if I'm doomed to one more one-off hack.
Functionbeat is a good starting point and will allow you to keep it as "serverless" as possible.
To process the JSON, you can use the decode_json_fields processor.
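In a Functionbeat configuration that could look something like this minimal sketch (assuming, for the moment, that the field held clean JSON):

processors:
  - decode_json_fields:
      fields: ["message"]
      target: ""   # merge the decoded keys into the root of the event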
The problem is that your message isn't really JSON though. Possible solutions I could think of:
A dissect processor that extracts the JSON message to pass it on to the decode_json_fields — both in the Functionbeat. I'm wondering if trim_chars couldn't be abused for that — trim any possible characters except for curly braces.
If that is not enough, you could do all the processing in an Elasticsearch ingest pipeline, where you would probably stitch this together with a Grok processor and then the JSON processor (see the sketch after this list).
Only log a JSON message, if you can, to make your life simpler; potentially move the log level into the JSON structure.
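A rough sketch of the ingest-pipeline option, assuming the log line always looks like <timestamp> <request id> INFO <payload>; the pipeline and field names are made up, the payload must be strict JSON (quoted keys) for the json processor to parse it, and a payload spanning multiple lines may need the grok pattern or flags tweaked:

PUT _ingest/pipeline/lambda-log-json
{
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{DATA:log_timestamp} %{UUID:request_id} %{LOGLEVEL:log_level} %{GREEDYDATA:json_payload}"]
      }
    },
    {
      "json": {
        "field": "json_payload",
        "add_to_root": true
      }
    },
    {
      "remove": {
        "field": "json_payload"
      }
    }
  ]
}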
I have a CloudFormation template that consists of a Lambda function that reads messages from an SQS queue.
The Lambda function will read the message from the queue and transform it using a JSON template (which I want to be injected externally).
I will deploy different stacks for different products and for each product I will provide different JSON templates to be used for transformation.
I have different options but couldn't decide which one is better:
I can write all JSON files under the project, pack them together, and pass the related JSON name as a parameter to the Lambda.
I can store the JSON files on S3 and pass the S3 URL to the Lambda so I can read them at runtime.
I can store the JSON files in DynamoDB and read them from there, using the same approach as option 2.
The first one seems like a better approach as I don't need to read from an external file on every Lambda execution, but I will need to pack all templates together.
The last two are a cleaner approach but require an external call to read the JSON on every invocation.
Another approach could be (I'm not sure if it is possible) to inject a JSON file into the Lambda on deploy from an S3 bucket or similar, and have the Lambda function read it like an environment variable.
As you can see from the CloudFormation documentation, Lambda environment variables can only be a Map of Strings, so the actual value you pass to the function as an environment variable must be a String. You could pass your JSON as a string, but the problem is that the maximum size for all environment variables combined is 4 KB.
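Within that limit, the Lambda could read the template like this minimal sketch (the variable name TEMPLATE_JSON is made up):

// Parse a JSON template passed in as a plain string environment variable
const template = JSON.parse(process.env.TEMPLATE_JSON || '{}');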
If your templates are bigger and you don't want to call S3 or DynamoDB at runtime, you could do a workaround like writing a simple shell script that copies the correct template file into the Lambda folder before building and deploying the stack. This way the Lambda gets deployed in a package with the code and only the desired JSON template.
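A rough sketch of that idea; every path, name, and the final deploy step here is a placeholder for whatever the project actually uses:

#!/bin/sh
# Copy the product-specific template into the Lambda source folder
# before packaging, so it ships inside the deployment package.
PRODUCT="$1"
cp "templates/${PRODUCT}.json" lambda/template.json
# ...then run the stack's usual build and deploy commands...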
I decided to go with the S3 setup and also improved efficiency by storing the JSON in a global variable (after reading it the first time), so I read it once and use it for the lifetime of the Lambda container.
I'm not sure this is the best solution but works well enough for my scenario.
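A sketch of that caching pattern (Node.js with the AWS SDK v3; the bucket and key environment variables are placeholders):

const { S3 } = require('@aws-sdk/client-s3');

const s3 = new S3();
let template; // cached across invocations for the lifetime of the container

exports.handler = async (event) => {
    if (!template) {
        // First invocation in this container: fetch and parse the template once
        const obj = await s3.getObject({
            Bucket: process.env.TEMPLATE_BUCKET,
            Key: process.env.TEMPLATE_KEY,
        });
        template = JSON.parse(await obj.Body.transformToString());
    }
    // ...use `template` to transform the incoming SQS message...
    return template;
};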
We are trying to pass JSON data from TFS to an AWS Lambda function (myLambdaFunc). For this we are using the 'Invoke Lambda function' utility in the release definition. There is an option to pass a payload in the form of JSON from that utility in TFS.
The Lambda function (myLambdaFunc) is capable of reading the JSON passed to it and creating an entry in DynamoDB, and it works fine if we run the function from the AWS console.
But if we run the TFS release job, it says that the Lambda function executed successfully, yet no DynamoDB entry is created. It seems like the payload/JSON is not being passed properly to the Lambda function.
We are using online TFS:
https://xxxxxxxxxx.visualstudio.com/_projects
Please help!!
I had to parse the event in order to access the TFS payload information:
// The TFS task delivers the payload as a JSON string rather than an object,
// so parse it before reading any fields
var data = JSON.parse(event);
console.log('Message:', data['KeyX']);
Trying to figure out how I can access elements of a POST request body (JSON) and store them as variables. One of my tests creates a user using ${__UUID}@gmail.com, and I'd like to then check that my response includes this same information.
I'm guessing I could probably create the UUID before the request and store it as a variable, and then check against that, but wondering if there is anything similar to JSON Path Extractor for request elements.
There is a JSR223 PostProcessor you can use to fulfil your requirement.
Assuming you have a JSON payload like:
{
"user": "${__UUID}#gmail.com"
}
Add JSR223 PostProcessor and put the following code into "Script" area:
def user = com.jayway.jsonpath.JsonPath.read(sampler.getArguments().getArgument(0).getValue(), '$..user').get(0).toString()
log.info('Random user email:' + user)
vars.put('user', user)
The above code will:
Extract from the request everything which matches the $..user JSON Path expression
Print it to the jmeter.log file
Store the value in a JMeter Variable so you will be able to refer to it as ${user} where required.
More information:
Apache Groovy - Why and How You Should Use It
Groovy - Parsing and Producing JSON
I have an inbound payload in JSON format. I'm converting it using the "JSON to Object" converter, and then passing the data on to a component (as a JsonData object). My component then returns the same JsonData object with modifications. I'm trying to use the Amazon S3 component as the next step in my flow, and trying to tie the bucket name and other values to elements accessible in the JsonData object.
Here is the expression for the bucket name for instance:
#[json: TopKey/BucketName]
From experience this has worked with JSON.
However when I run this, here is what I get:
Message : Failed to invoke getObjectContent. Message payload is of type: JsonData
Code : MULE_ERROR-29999
Failed to invoke getObjectContent. Message payload is of type: JsonData (org.mule.api.MessagingException)
org.mule.module.s3.processors.GetObjectContentMessageProcessor:177 (http://www.mulesoft.org/docs/site/current3/apidocs/org/mule/api/MessagingException.html)
Is there a way I can use my JsonData object and pull information from it, or do I have to convert it back to something else before passing it on to the Amazon S3 component?
Thanks,
After playing with my expression a little more, I figured out I can just access elements the way I already do in my Java component:
#[payload.get("TopKey").get("BucketName").getTextValue()]
and I have my BucketName!
Remove the empty space from your expression: #[json:TopKey/BucketName]
You can set the "Return Class" to java.util.Map in the "JSON to Object" processor; you can then access the value via #[payload.TopKey.BucketName]