I am trying to log the request body of requests to my API, and nginx is turning all quotes (and some other characters like spaces and tabs) into hexadecimal escape sequences.
Here is my log format
log_format postdata '{"ts": "$time_iso8601", "status": $status, "req": "$uri", "meth": "$request_method", "body": "$request_body"}';
Here is what gets logged
{"ts": "2015-05-20T15:31:11-07:00", "status": 400, "req": "/v2/track", "meth": "POST", "body": {\x22id\x22:\x22user id\x22}}
How can I prevent this so that the resulting log line is
{"ts": "2015-05-20T15:31:11-07:00", "status": 400, "req": "/v2/track", "meth": "POST", "body": {"id":"user id"}}
Since nginx 1.13.10 there is an "escape=none" parameter that turns off data escaping.
http://nginx.org/en/docs/http/ngx_http_log_module.html#log_format
log_format api_request_log escape=none '[$time_local] $request \n$request_body';
You can't stop nginx from escaping it, so you will have to post-process the log.
Python 2 example:
# use a raw string so the \x22 sequences stay literal, as they appear in the log
line = r'{\x22id\x22:\x22user id\x22}'
line.decode('unicode_escape')
>> u'{"id":"user id"}'
Python 3 example:
# again a raw string, then round-trip through bytes to apply unicode_escape
line = r'{\x22id\x22:\x22user id\x22}'
bytes(line, 'utf-8').decode('unicode_escape')
>> '{"id":"user id"}'
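If you need to post-process a whole log file this way, here is a minimal Python 3 sketch along the same lines (the log path is a placeholder; point it at your own postdata log):
# Decode nginx's \xNN escapes for every line of an access log.
with open("/var/log/nginx/postdata.log", encoding="utf-8") as log:
    for raw_line in log:
        # same round-trip as above; the line already ends with a newline
        print(bytes(raw_line, "utf-8").decode("unicode_escape"), end="")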
Ruby example (from https://stackoverflow.com/a/18752208/2398354):
require 'yaml'
line = '{\x22id\x22:\x22user id\x22}'
YAML.load(%Q(---\n"#{line}"\n))
=> "{\"id\":\"user id\"}"
Note: This last example is useful if post-processing a file with Logstash.
Hope this will be helpful for someone.
To log the entire JSON request body without the \xXX hex escapes, add this configuration to the http block (escape=json escapes quotes and control characters according to JSON rules instead):
http {
    log_format postdata escape=json '$request_body';
    access_log /var/log/nginx/access.log postdata;
    .....
}
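If you keep the full JSON log_format from the question and add escape=json to it, each log line becomes valid JSON and the body can be decoded a second time. A minimal parsing sketch under that assumption (the log path is a placeholder):
import json

# Each line is a JSON object; the "body" field is itself a JSON string.
with open("/var/log/nginx/postdata.log", encoding="utf-8") as log:
    for line in log:
        entry = json.loads(line)
        body = json.loads(entry["body"]) if entry["body"] else None
        print(entry["req"], body)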
As others have said, there is no way to fix this within the nginx configuration.
But it is not difficult to post-process it. If the request body is JSON-formatted, you'll probably run into a lot of \x0A (newline) and \x22 (double quote) escapes.
Just strip those out with sed before looking at the logfile.
Here is the command for you: LANG='' sed -E 's/(\\x0A|\\x22)//g' access.log
I am trying to submit a Spark job in which I set a date argument in a --conf property, and I run it through a script in NiFi. However, when I run the script I get an error.
Spark Submit Code in the script:
aws emr add-steps --cluster-id "$1" --steps '[{"Args":["spark-submit","--deploy-mode","cluster","--jars","s3://tvsc-lumiq-edl/jars/ojdbc7.jar","--executor-memory","10g","--driver-memory","10g","--conf","spark.hadoop.yarn.timeline-service.enabled=false","--conf","currDate='\"$5\"'","--class",'\"$2\"','\"$3\"','\"$4\"'],"Type":"CUSTOM_JAR","ActionOnFailure":"CONTINUE","Jar":"command-runner.jar","Properties":"","Name":"Spark application"}]' --region "$6"
and after I run it, I get the below error:
ExecuteStreamCommand[id=5b08df5a-1f24-3958-30ca-2e27a6c4becf] Transferring flow file StandardFlowFileRecord[uuid=00f844ee-dbea-42a3-aba3-0edcabfc50a2,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1607082757752-507103, container=default, section=223], offset=29, length=-1],offset=0,name=6414901712887990,size=0] to nonzero status. Executable command /bin/bash ended in an error:
Error parsing parameter '--steps': Invalid JSON:
[{"Args":["spark-submit","--deploy-mode","cluster","--jars","s3://tvsc-lumiq-edl/jars/ojdbc7.jar","--executor-memory","10g","--driver-memory","10g","--conf","spark.hadoop.yarn.timeline-service.enabled=false","--conf","currDate="Fri
Where am I going wrong?
You can use JSONLint to validate your JSON, which makes it easier to see why it's wrong.
In your case, you are wrapping the final three values in single quotes ' rather than double quotes "
Your steps JSON should look like:
[{
  "Args": [
    "spark-submit",
    "--deploy-mode",
    "cluster",
    "--jars",
    "s3://tvsc-lumiq-edl/jars/ojdbc7.jar",
    "--executor-memory",
    "10g",
    "--driver-memory",
    "10g",
    "--conf",
    "spark.hadoop.yarn.timeline-service.enabled=false",
    "--conf",
    "currDate='\"$5\"'",
    "--class",
    "\"$2\"",
    "\"$3\"",
    "\"$4\""
  ],
  "Type": "CUSTOM_JAR",
  "ActionOnFailure": "CONTINUE",
  "Jar": "command-runner.jar",
  "Properties": "",
  "Name": "Spark application"
}]
Specifically, these 3 lines:
"\"$2\"",
"\"$3\"",
"\"$4\""
Instead of the original:
'\"$2\"',
'\"$3\"',
'\"$4\"'
I am trying to post a JSON string using curl from Scala. My curl command works fine when executed from a Linux box, but it always throws an error ("message": "Must provide query string.") from Scala.
My working curl command in Linux:
curl http://laptpad1811:5000/graphql -H "Content-Type: application/json"
-X POST -d '{"query":"mutation
CreateFileReceivedEvent($createFileReceivedEventInput:
CreateFleReceivedEventInput!) { createFileReceivedEvent(input:
$createFileReceivedEventInput) { clientMutationId }}","variables":
{"createFileReceivedEventInput":
{"clientMutationId":"Test","fileReceivedEvent":{"file":
{"fileTrackingId":"83a86c44-66a5-4de0-9b7f-
c6995877279d","name":"textfile_2017-08-21T15:58:45Z","fileType":
{"code":"textfile"}},"eventTimestamp":"2017-08-
21T15:59:30Z"}}},"operationName":"CreateFileReceivedEvent"}'
My Scala code:
Step 1: copy the entire JSON string (payload) to a text file:
'{"query":"mutation CreateFileReceivedEvent($createFileReceivedEventInput:
CreateFleReceivedEventInput!) { createFileReceivedEvent(input:
$createFileReceivedEventInput) { clientMutationId }}","variables":
{"createFileReceivedEventInput":
{"clientMutationId":"Test","fileReceivedEvent":{"file":
{"fileTrackingId":"83a86c44-66a5-4de0-9b7f-
c6995877279d","name":"textfile_2017-08-21T15:58:45Z","fileType":
{"code":"textfile"}},"eventTimestamp":"2017-08-
21T15:59:30Z"}}},"operationName":"CreateFileReceivedEvent"}'
Step 2:
val data=fromFile("/usr/test/data.txt").getLines.mkString
Step 3:
val cmd = Seq("curl", "http://laptpad1811:5000/graphql", "-H",
"'Content-Type:application/json'" ,"-X", "POST", "-d" , data)
Step 4:
cmd.!!
I get the below error
String =
"{
"errors": [
{
"message": "Must provide query string.",
"stack": "BadRequestError: Must provide query string.\n
I have tried changing " to ' and multiple other combinations of the JSON string, but I always get the same error.
I suspect that your issue is that sys.process doesn't pass commands through a shell (e.g. bash), so quotes that are necessary in the shell become unnecessary in Scala (and get passed through to the command, which in the case of Unix-style utilities will probably result in unexpected behavior).
So try:
val cmd = Seq("curl", "http://laptpad1811:5000/graphql", "-H", "Content-Type: application/json", "-X", "POST", "-d", data)
Likewise, remove the single-quote wrapping from your text file.
I would, however, counsel against spawning curl from Scala and advise using one of the existing HTTP client libraries (I personally like Gigahorse).
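The same principle applies in any language that spawns a process from an argument list rather than through a shell. For comparison, a hedged Python sketch with subprocess, reusing the URL and data file from the question:
import subprocess

# The payload file should contain the bare JSON, with no surrounding single quotes.
with open("/usr/test/data.txt", encoding="utf-8") as f:
    data = f.read()

# No shell is involved, so each argument reaches curl exactly as written:
# no extra quoting around the header value or the data.
result = subprocess.run(
    ["curl", "http://laptpad1811:5000/graphql",
     "-H", "Content-Type: application/json",
     "-X", "POST", "-d", data],
    capture_output=True, text=True,
)
print(result.stdout)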
I am trying to use the RabbitMQ HTTP REST API to publish messages into a queue. I am using the following URL and request:
http://xxxx/api/exchanges/xxxx/exc.notif/publish
{
  "routing_key": "routing.key",
  "payload": {},
  "payload_encoding": "string",
  "properties": {
    "headers": {
      "notif_d": "TEST",
      "notif_k": ["example1", "example2"],
      "userModTime": "timestamp"
    }
  }
}
And I get back the following response from RabbitMQ:
{"error":"bad_request","reason":"payload_not_string"}
I have just one header set:
Content-Type:application/json
I tried setting
"payload_encoding":"base64",
but it didn't help. I am new to RabbitMQ; any response is welcome.
Try with:
{
  "properties": {
    "content-type": "application/json"
  },
  "routing_key": "testKey",
  "payload": "1234",
  "payload_encoding": "string"
}
Here is a working example; you simply need to escape the double quotes inside the payload string.
It is important that the colon separating key and value stays outside of the quotes, as getting this wrong causes inexplicable errors.
{
  "properties": {},
  "routing_key": "q_testing",
  "payload": "{ \"message\": \"message from terminal\" }",
  "payload_encoding": "string"
}
I managed to send the content type by using an underscore "_" instead of a dash (content_type rather than content-type).
See here for a list of valid properties.
See RabbitMQ Management HTTP API for some examples.
To publish a JSON message to a RabbitMQ exchange using curl:
curl -i -u guest:guest -XPOST \
  --data '{"properties":{"content_type":"application/json"},"routing_key":"","payload":"{\"foo\":\"bar\"}","payload_encoding":"string"}' \
  "http://localhost:15672/api/exchanges/%2f/exchange_name/publish"
content_type is written with an underscore, and routing_key is empty to send the message to the exchange rather than to a particular queue.
Alternatively, to avoid escaping the inner quotes, you can base64-encode the JSON payload and use the "payload_encoding": "base64" attribute.
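If you are scripting this rather than using curl, here is a hedged Python sketch using the requests library; the host, credentials, vhost (%2f is "/") and exchange name are placeholders:
import json
import requests

url = "http://localhost:15672/api/exchanges/%2f/exchange_name/publish"
body = {"foo": "bar"}

message = {
    "properties": {"content_type": "application/json"},
    "routing_key": "",
    # The payload must be a string, so serialize the JSON body first.
    "payload": json.dumps(body),
    "payload_encoding": "string",
}

resp = requests.post(url, auth=("guest", "guest"), json=message)
print(resp.status_code, resp.text)  # expect {"routed":true} on success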
I am using a Groovy script to send a POST to the Slack API. At present I am getting invalid_payload returned, and I think this is most likely due to the formatting of my JSON. I know the Slack API expects JSON with double quotes, but I can't seem to pass a variable into the JSON object:
SUCCESS_MESSAGE = '{"attachments": [{"color": "#2A9B3A", "author_name": ${DEV_NAME}, "title": "Build Status", "title_link": ${BUILD_URL}, "text": "Successful Build" }]}'
def response = ["curl", "-X", "POST", "-H", "Content-Type: application/json", "-d", "${SUCCESS_MESSAGE}", "https://hooks.slack.com/services/${SLACK_WEBHOOK}"].execute().text
How should I correctly format my SUCCESS_MESSAGE var so I don't get the error?
You need to quote your DEV_NAME and BUILD_URL variable expansions so the JSON string is valid.
Your whole string needs to be enclosed in " instead of ' so the variables are actually expanded.
And you need to escape the " inside your string so they appear in your JSON string.
SUCCESS_MESSAGE = "{\"attachments\": [{\"color\": \"#2A9B3A\", \"author_name\": \"${DEV_NAME}\", \"title\": \"Build Status\", \"title_link\": \"${BUILD_URL}\", \"text\": \"Successful Build\" }]}"`
Alternatively, you can generate the JSON in a much nicer programmatic way, which is helpful if your notifications get a bit more complicated:
import groovy.json.JsonOutput

def notification = [
    attachments: [
        [
            color: "#2A9B3A",
            author_name: DEV_NAME,
            title: "Build Status",
            title_link: BUILD_URL,
            text: "Successful Build"
        ]
    ]
]
def response = ["curl", "-X", "POST", "-H", "Content-Type: application/json", "-d", JsonOutput.toJson(notification), "https://hooks.slack.com/services/${SLACK_WEBHOOK}"].execute().text
I'm using curl to send JSON to an API endpoint. However, somewhere in the bash chain it is getting messed up.
Is there something special to know about encoding with curl?
If I construct the payload like this:
PAYLOAD='payload={"channel": "github", "username": "webhookbot", "icon_emoji": ":ghost:", "text": "'
PAYLOAD+=$1
PAYLOAD+=' " }'
echo $PAYLOAD
curl -X POST --data-urlencode "$PAYLOAD" $SLACKPOSTURL
echo "sent"
I'll get back an error
Payload was not valid JSONsent
However, if I just hardwire the variable assignment with the full output:
PAYLOAD='payload={"channel": "github", "username": "webhookbot", "icon_emoji": ":ghost:", "text": "LAST_COMMIT Merge pull request #558 from dcsan/boteditor Boteditor " }'
then it will go through fine.
Is there something that a simple assignment is doing differently vs. concatenating strings? In the console the output looks identical.
FWIW some messages go through but content like this:
LAST_COMMIT Merge pull request #558 from dcsan/boteditor Boteditor
will only go through if hardcoded in. So it's not the other end as far as I can see; it's something to do with the way the messages are built.
I guess you want to concatenate values onto your variable. But += only works in bash 3.1+ and not in plain sh, so depending on which shell runs your script it may not append at all.
A portable way to concatenate strings in a variable is:
PAYLOAD="$PAYLOAD $1"
All together it would be something like the following. Note the need to use double quotes so that the variable $PAYLOAD is expanded, and the use of \" to store a literal double quote:
PAYLOAD='payload={"channel": "github", "username": "webhookbot", "icon_emoji": ":ghost:", "text": "'
PAYLOAD="$PAYLOAD $1 \" }"
echo "$PAYLOAD"
curl -X POST --data-urlencode "$PAYLOAD" $SLACKPOSTURL
echo "sent"
This is what worked for me from a bash script:
curl -X POST --data-urlencode "payload={\"text\": \"$2\"}" https://hooks.slack.com/services/$KEY
Notice the inner quotes are escaped, but the outer quotes are not.
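If the shell quoting keeps fighting you, another option is to build the payload outside the shell entirely. A hedged Python sketch that mirrors the payload= form used above (the webhook URL is a placeholder and the message text comes from the command-line arguments):
import json
import sys
import urllib.parse
import urllib.request

url = "https://hooks.slack.com/services/XXX/YYY/ZZZ"
payload = {
    "channel": "github",
    "username": "webhookbot",
    "icon_emoji": ":ghost:",
    "text": " ".join(sys.argv[1:]),
}

# Same shape as the curl call: a form field named "payload" holding the JSON.
data = urllib.parse.urlencode({"payload": json.dumps(payload)}).encode("utf-8")
with urllib.request.urlopen(urllib.request.Request(url, data=data)) as resp:
    print(resp.status, resp.read().decode())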
FYI, adding:
set -x
at the beginning of a bash script will show you the actual commands being executed, and save a lot of guesswork.