I have the following resource in my CloudFormation template, taken from the AWS documentation, to create a rule that runs a Lambda function:
"ScheduledRule": {
"Type": "AWS::Events::Rule",
"Properties": {
"Description": "ScheduledRule",
"ScheduleExpression": "rate(5 minutes)",
"State": "ENABLED",
"Targets": [{
"Arn": { "Fn::GetAtt": ["myLambda", "Arn"] },
"Id": "TargetFunctionV1"
}]
}
}
I would like to specify the Input. The documentation shows the target definition as:
{
  "Arn" : String,
  "Id" : String,
  "Input" : String,
  "InputPath" : String
}
and says that Input is a JSON-formatted text string that is passed to the target; this value overrides the matched event.
I would like my JSON formatted text to be:
{
  "mykey1": "Some Value"
}
I do not know how to specify it in Input. When I put:
"ScheduledRule": {
"Type": "AWS::Events::Rule",
"Properties": {
"Description": "ScheduledRule",
"ScheduleExpression": "rate(5 minutes)",
"State": "ENABLED",
"Targets": [{
"Arn": { "Fn::GetAtt": ["myLambda", "Arn"] },
"Id": "TargetFunctionV1",
"Input": { "mykey1": "Some Value" }
}]
}
}
I get the error:
Value of property Input must be of type String
How should I specify it correctly?
I would use YAML, as it is easier and more readable:
Input: !Sub |
  {
    "mykey1": "${myKey}"
  }
Found out the answer myself:
"Input": "{ \"test\" : \"value11\", \"test2\" : \"value22\"}"
Hope it helps someone else.
Update:
You basically use the result of JSON.stringify() to get the string for the "Input" field. You can use an online JSON stringify tool such as https://onlinetexttools.com/json-stringify-text
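Applied to the rule from the question, the whole target would then look something like this (the Input value is just the stringified form of the example payload above):
"ScheduledRule": {
  "Type": "AWS::Events::Rule",
  "Properties": {
    "Description": "ScheduledRule",
    "ScheduleExpression": "rate(5 minutes)",
    "State": "ENABLED",
    "Targets": [{
      "Arn": { "Fn::GetAtt": ["myLambda", "Arn"] },
      "Id": "TargetFunctionV1",
      "Input": "{ \"mykey1\": \"Some Value\" }"
    }]
  }
}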
I wanted to expand on @Pau's answer. Basically, the | says that the whole block below it is to be treated as a raw string.
If you need to substitute any variables into the JSON you can use Sub, but if you don't have any variables then you don't need it. An example would be:
Input: |
  {
    "jsonVar": "jsonVal",
    "jsonVar2": "jsonVal2"
  }
You can later do JSON.parse(<input-variable>) to get the JSON object.
NOTE: Don't put a comma after the last key/value pair in the JSON. Example:
Input: |
  {
    "jsonVar": "jsonVal",
    "jsonVar2": "jsonVal2",
  }
This will cause JSON parsing errors.
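Putting it together, the whole rule from the original question could look something like this in YAML (no Sub needed here because there are no variables to substitute):
ScheduledRule:
  Type: AWS::Events::Rule
  Properties:
    Description: ScheduledRule
    ScheduleExpression: rate(5 minutes)
    State: ENABLED
    Targets:
      - Arn: !GetAtt myLambda.Arn
        Id: TargetFunctionV1
        Input: |
          {
            "mykey1": "Some Value"
          }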
If you are writing your CloudFormation templates in YAML and finding it difficult to use a JSON string (such as a policy document), the easiest way is to convert the JSON into YAML using an online converter.
ApiGatewayRestApi:
  Type: AWS::ApiGateway::RestApi
  Properties:
    Description: API Gateway for some API
    EndpointConfiguration:
      Types:
        - PRIVATE
    Name: MyAPIGateway
    Policy: <Policy Doc>
Let's say the policy document is as follows.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "execute-api:Invoke",
      "Resource": "arn:aws:execute-api:ap-southeast-2:something/*",
      "Condition": {
        "ForAnyValue:StringEquals": {
          "aws:sourceVpce": "vpce-abcd"
        }
      }
    }
  ]
}
Using the converter, you can turn the JSON into equivalent YAML and use it as follows.
ApiGatewayRestApi:
  Type: AWS::ApiGateway::RestApi
  Properties:
    Description: API Gateway for some API
    EndpointConfiguration:
      Types:
        - PRIVATE
    Name: MyAPIGateway
    Policy:
      Version: '2012-10-17'
      Statement:
        - Effect: Allow
          Principal: "*"
          Action: execute-api:Invoke
          Resource: arn:aws:execute-api:ap-southeast-2:something/*
          Condition:
            ForAnyValue:StringEquals:
              aws:sourceVpce: vpce-abcd
Here's my similar YAML code for the "Input" line. It took me all day to figure out the syntax, so I'm hoping this might help someone in the future.
Input: "{\"action\":[\"configure\"],\"mode\":[\"ec2\"],\"optionalConfigurationSource\":[\"ssm\"],\"optionalConfigurationLocation\":[\"AmazonCloudWatch-Baseline-Windows\"],\"optionalRestart\":[\"yes\"]}"
Related
I am able to get a single JSON object in Kibana:
By having this in the filebeat.yml file:
output.elasticsearch:
  hosts: ["localhost:9200"]
How can I get at the individual elements in the JSON string? Say I wanted to compare all the "pseudorange" fields of all my JSON objects. How would I:
Select the "pseudorange" field from all my JSON messages so I can compare them.
Compare them visually in Kibana. At the moment I can't even find the message, let alone the individual fields, in the visualisation tab...
I have heard of people using Logstash to parse the string somehow, but is there no way of doing this simply with Filebeat? If there isn't, what do I do with Logstash to filter out the individual fields in the JSON, instead of having my message be one big JSON string that I cannot interact with?
I get the following output from output.console (note that I am putting some information in <> to hide it):
"#timestamp": "2021-03-23T09:37:21.941Z",
"#metadata": {
"beat": "filebeat",
"type": "doc",
"version": "6.8.14",
"truncated": false
},
"message": "{\n\t\"Signal_data\" : \n\t{\n\t\t\"antenna type:\" : \"GPS\",\n\t\t\"frequency type:\" : \"GPS\",\n\t\t\"position x:\" : 0.0,\n\t\t\"position y:\" : 0.0,\n\t\t\"position z:\" : 0.0,\n\t\t\"pseudorange:\" : 20280317.359730639,\n\t\t\"pseudorange_error:\" : 0.0,\n\t\t\"pseudorange_rate:\" : -152.02620448094211,\n\t\t\"svid\" : 18\n\t}\n}\u0000",
"source": <ip address>,
"log": {
"source": {
"address": <ip address>
}
},
"input": {
"type": "udp"
},
"prospector": {
"type": "udp"
},
"beat": {
"name": <ip address>,
"hostname": "ip-<ip address>",
"version": "6.8.14"
},
"host": {
"name": "ip-<ip address>",
"os": {
<ubuntu info>
},
"id": <id>,
"containerized": false,
"architecture": "x86_64"
},
"meta": {
"cloud": {
<cloud info>
}
}
}
In Filebeat, you can leverage the decode_json_fields processor in order to decode a JSON string and add the decoded fields into the root object:
processors:
  - decode_json_fields:
      fields: ["message"]
      process_array: false
      max_depth: 2
      target: ""
      overwrite_keys: true
      add_error_key: false
Credit to Val for this. His answer worked; however, as he suggested, my JSON string had a trailing null character (\u0000) at the end, which stops it being valid JSON and prevented the decode_json_fields processor from working as it should...
Upgrading to version 7.12 of Filebeat (also ensure version 7.12 of Elasticsearch and Kibana because mismatched versions between them can cause issues) allows us to use the script processor: https://www.elastic.co/guide/en/beats/filebeat/current/processor-script.html.
Credit to Val here again, this script removed the null terminator:
- script:
    lang: javascript
    id: trim
    source: >
      function process(event) {
        event.Put("message", event.Get("message").trim());
      }
After the null terminator was removed, the decode_json_fields processor did its job as Val suggested, and I was able to extract the individual elements of the JSON field, which allowed Kibana visualisations to look at the elements I wanted!
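For reference, the combined processors section in filebeat.yml would look something like this (the trim script has to run before the JSON decoding, since decoding fails while the trailing null character is still present):
processors:
  - script:
      lang: javascript
      id: trim
      source: >
        function process(event) {
          event.Put("message", event.Get("message").trim());
        }
  - decode_json_fields:
      fields: ["message"]
      process_array: false
      max_depth: 2
      target: ""
      overwrite_keys: true
      add_error_key: false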
Suppose you have a Map<String, Object> called "something" in YAML:
something:
and the corresponding JSON should look like this:
"something": {
"else": "then",
"array": [
"element in array"
]
}
So the YAML spec for this might be:
something:
  else: then
  array:
  - element in array
But since something is a Map, it does not let me do
array:
- element in array
or this
array: ['element in array']
So the question is: what should the YAML be to get the above-mentioned JSON, given that something is a Map<String, Object>? Is it possible?
This is regarding defining the ServiceCatalogDefinition for an implementation of the Open Service Broker API.
OSB Catalog using Yaml
OSB Catalog json looks like this
I am trying to make the "properties" mentioned in the schemas in the above link required.
For that, I need it to return JSON like this:
"properties" : {
"someProperty" : {
"description": "description",
"type": "string"
},
"required": [
"someProperty"
]
}
And the YAML in my application.yml fails validation, throwing the error mentioned in the comments.
There are two things you need to do:
Make the JSON valid, e.g. by inserting a comma (as @flyx suggests) and adding curly braces around the root-level object:
{
  "something": {
    "else": "then",
    "array": [
      "element in array"
    ]
  }
}
Change the plain scalar (i.e. without quotes) mapping key something to a double-quoted scalar:
{
  "something": {
    "else": "then",
    "array": [
      "element in array"
    ]
  }
}
Since YAML has, for all practical purposes, been a superset of JSON since YAML 1.2 (2009), you don't need to do anything else. And of course you can read the above with both a YAML loader and a JSON parser.
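To illustrate, here is a minimal sketch (assuming Node.js with the js-yaml package installed) showing that the quoted, braced form above loads to exactly the same structure as the plain YAML spec:
// Both documents below are valid YAML 1.2; the first is also valid JSON.
const yaml = require('js-yaml');

const asJson = `{
  "something": {
    "else": "then",
    "array": ["element in array"]
  }
}`;

const asYaml = `
something:
  else: then
  array:
  - element in array
`;

// Prints true: both documents load to the same map-like structure.
console.log(JSON.stringify(yaml.load(asJson)) === JSON.stringify(yaml.load(asYaml)));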
Using the site json2yaml, you get this YAML:
---
something:
  else: then
  array:
  - element in array
from this JSON:
{
  "something": {
    "else": "then",
    "array": [
      "element in array"
    ]
  }
}
Compared to yours, I think the issue is that your "-" must be at the same indentation level as "array".
We are sending messages to a Service Bus using a Logic App. These messages will later be consumed by another service, which expects the message content to be a string: essentially a stringified JSON object, with escape characters.
We are not able to find a method to stringify a JSON object in Logic Apps. Even if we explicitly provide an escaped string, the Logic App detects that it is stringified JSON, unescapes it, and then sends it as a JSON object. We don't want that; we simply want it to send the string as-is. We have already tried changing the content type to text/plain; it does not work. The Logic App always sends the unescaped string as JSON.
This post on MSDN, https://social.msdn.microsoft.com/Forums/office/en-US/e5dee958-09a7-4784-b1bf-facdd6b8a568/post-json-from-logic-app-how-to-escape-data?forum=azurelogicapps, is of no help, because doing this would violate the request contract of the message-consuming service.
Do you need the stringified message to include opening and closing double quotes?
I've tried this and it worked for me.
I have my JSON object as the output of a Compose action.
Then, I initialised a variable with the Base64-encoded value of the escaped, stringified JSON (you need to add ALL the proper escaping required; mine was just a PoC).
Then, you send the variable, already in Base64, to Service Bus. (You need to remove the encoding on that action.)
"actions": {
"Compose_JSON_Object": {
"inputs": {
"message": "I want this as a string"
},
"runAfter": {},
"type": "Compose"
},
"Initialise_Variable_with_Stringified_JSON_Base64_Encoded": {
"inputs": {
"variables": [
{
"name": "jsonAsStringBase64",
"type": "String",
"value": "#base64(concat('\"', replace(string(outputs('Compose_JSON_Object')), '\"', '\\\"'), '\"'))"
}
]
},
"runAfter": {
"Compose_JSON_Object": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Send_message": {
"inputs": {
"body": {
"ContentData": "#variables('jsonAsStringBase64')",
"ContentType": "text/plain"
},
"host": {
"connection": {
"name": "#parameters('$connections')['servicebus']['connectionId']"
}
},
"method": "post",
"path": "/#{encodeURIComponent(encodeURIComponent('temp'))}/messages",
"queries": {
"systemProperties": "None"
}
},
"runAfter": {
"Initialise_Variable_with_Stringified_JSON_Base64_Encoded": [
"Succeeded"
]
},
"type": "ApiConnection"
}
},
This way, I got the message stringified.
HTH
I am building a conversation in Watson Conversation, and at one point, together with my "response" JSON, I would also like to set a new intent for the user. I tried to add this to the JSON, but with no result.
Is there a way to do this?
As you can see in the official documentation, you can use context variables to save values.
A context variable is a variable that you define in a node, and optionally specify a default value for. Other nodes or application logic can subsequently set or change the value of the context variable.
So, in this case, you'll create something like this in your advanced JSON (as in your example):
{
  "context": {
    "intent": "fgts",
    "confidence": 1
  },
  "output": {
    "text": {
      "values": [
        "Your text here"
      ],
      "selection_policy": "sequential"
    }
  }
}
And in your back-end application, you can access the value in the response JSON object from your POST /message call with something like response.context.intent and response.context.confidence.
Note: by default, the Watson Conversation service will return the name of the intent that Watson recognizes and the confidence level.
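For illustration, a small hypothetical sketch of where those values end up in your back-end code (response here stands for the parsed JSON body returned by POST /message, with its shape taken from the output shown further below):
// Hypothetical parsed response body from POST /message.
const response = {
  intents: [{ intent: 'helpBot', confidence: 0.59 }],
  context: { intent: 'fgts', confidence: 1, conversation_id: '...' },
  output: { text: ['Your text here'] }
};

// Context variables set in the dialog node's JSON come back in response.context,
// next to the intents that Watson itself recognized.
console.log(response.context.intent, response.context.confidence); // fgts 1
console.log(response.intents[0].intent);                           // helpBot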
If, after these instructions, you really want to use your own method, you can see my example below:
{
  "output": {
    "text": {
      "values": [
        "text here"
      ],
      "selection_policy": "sequential"
    },
    "intents": "test"
  }
}
And your app returns:
{ intents: [ { intent: 'helpBot', confidence: 0.5930036529133407 } ],
  entities: [],
  input: { text: 'ajuda' },
  output:
   { text: [ 'text here' ],
     nodes_visited: [ 'node_16_1511443279233' ],
     intents: 'test',
     log_messages: [] },
  context:
   { conversation_id: '83d88b05-7c76-457d-bd5f-7820be455a3e',
     system:
      { dialog_stack: [Object],
        dialog_turn_counter: 2,
        dialog_request_counter: 2,
        _node_output_map: [Object],
        branch_exited: true,
        branch_exited_reason: 'fallback' } } }
See more about accessing values using Conversation Service.
I am confused about why the ExtractVariables rule I am using is returning the data it does. See below.
The JSON to parse is:
{
  "callNotificationSubscriptionList": {
    "playAndCollectInteractionSubscription": [],
    "recognitionInteractionSubscription": [],
    "playAndRecordInteractionSubscription": [],
    "callDirectionSubscription": [],
    "callEventSubscription": [
      {
        "clientCorrelator": "112345",
        "resourceURL": "http:someurl",
        "callbackReference": {
          "notifyURL": "someotherurlt",
          "notificationFormat": "XML"
        },
        "filter": {
          "data1": "data abc",
          "data2": "data def",
          "data3": "data xyz"
        }
      }
    ]
  }
}
The rule:
<JSONPayload>
  <Variable name="callNotSubL">
    <!-- <JSONPath>$.callNotificationSubscriptionList</JSONPath> -->
    <JSONPath>$.*</JSONPath>
  </Variable>
</JSONPayload>
When I use the value that is commented out, I get no response variable data. If I set the "ignoreUnresolvedVariables" parameter to "false", I am returned a failure, so it has no data. Thus, I tried "$.*". With this, I am returned:
[
  {
    "callbackReference": {
      "notifyURL": "someotherurlt",
      "notificationFormat": "XML"
    },
    "filter": {
      "data1": "data abc",
      "data2": "data def",
      "data3": "data xyz"
    }
  }
]
Could this be because the entry names are so long? I admit they are long, but they are well under the default values in the JSON Threat Protection policy.
I did pump this JSON through a web-based JSONPath evaluator, and $.callNotificationSubscriptionList worked fine, as did $.callNotificationSubscriptionList.callEventSubscription[0], which is what I am really after. But if I can't get the top level right, I can't get the sub-levels at all.
I solved this issue using @Santanu's comment:
It seems that the JSON getting through to the policy is not the same as what you are expecting. Can you try to assign the entire payload to a variable using the AssignVariable policy before the JSON path extraction policy, and check the value of that variable in the debug view? That would help you understand what payload value is actually passing through when you try to apply the JSON path extraction policy.
The <Source> tag was "request", and this was a response extraction. I removed the <Source> tag and all is well.
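For completeness, a sketch of what the working policy might look like with <Source> removed (the policy name and the second variable here are illustrative, not from the original post); with no <Source> element, the policy defaults to the current message, which is the response when the policy is attached to the response flow:
<ExtractVariables name="Extract-Call-Subscriptions">
  <!-- No <Source> element: defaults to the current message (the response in a response flow). -->
  <JSONPayload>
    <Variable name="callNotSubL">
      <JSONPath>$.callNotificationSubscriptionList</JSONPath>
    </Variable>
    <Variable name="callEventSub0">
      <JSONPath>$.callNotificationSubscriptionList.callEventSubscription[0]</JSONPath>
    </Variable>
  </JSONPayload>
  <IgnoreUnresolvedVariables>false</IgnoreUnresolvedVariables>
</ExtractVariables>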