We have a setup where a program logs to a .json file, in a format that follows the GELF specification.
Currently this is sent to a Graylog2 server over HTTP. This works, but due to the nature of HTTP there is significant latency, which becomes an issue with a large number of log messages.
I want to change the HTTP delivery method to UDP, in order to just 'fire and forget'.
The logs are written to files like this:
{ "short_message": "<message>", "host": "<host>", "full_message": "<message>", "_extraField1": "<value>", "_extraField2": "<value>", "_extraField3": "<value>" }
Current configuration is this:
<Extension json>
Module xm_json
</Extension>
<Input jsonLogs>
Module im_file
File '<File Location>'
PollInterval 5
SavePos True
ReadFromLast True
Recursive False
RenameCheck False
CloseWhenIdle True
</Input>
<Output udp>
Module om_udp
Host <IP>
Port <Port>
OutputType GELF_UDP
</Output>
With this setup, only part of the JSON log message is added to the "message" field of a GELF message and sent to the server.
I've tried adding the line `Exec parse_json()`, but this simply results in all fields other than short_message and full_message being excluded.
I'm unsure how to configure this correctly. Even just having the complete log message added to a field is preferable, since I can add an extractor on the server side.
You'd need Exec parse_json() in order for GELF_UDP to generate proper output, but it is unclear what the exact issue is with message and full_message/short_message.
Another option you could try is to simply ship the log via om_tcp. In that case you will not need OutputType GELF_TCP, since the data is already formatted that way.
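For reference, here is a sketch of how the pieces could fit together (untested; keep your own <File Location>, <IP> and <Port> placeholders, and note that parse_json() comes from the xm_json extension you already load):

```
<Extension json>
    Module  xm_json
</Extension>

<Input jsonLogs>
    Module        im_file
    File          '<File Location>'
    PollInterval  5
    SavePos       True
    ReadFromLast  True
    # Parse each JSON line into NXLog fields; GELF_UDP should then emit
    # the _-prefixed fields as GELF additional fields.
    Exec          parse_json();
</Input>

<Output udp>
    Module      om_udp
    Host        <IP>
    Port        <Port>
    OutputType  GELF_UDP
</Output>
```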
I am writing a bash script for uploading a certificate from a Linux server to Azure Key Vault using the "armclient".
I follow this guide on how to use the armclient:
https://blogs.msdn.microsoft.com/appserviceteam/2016/05/24/deploying-azure-web-app-certificate-through-key-vault/
The command I want to perform is this:
ARMClient.exe PUT /subscriptions/<Subscription Id>/resourceGroups/<Server Farm Resource Group>/providers/Microsoft.Web/certificates/<User Friendly Resource Name>?api-version=2016-03-01 "{'Location':'<Web App Location>','Properties':{'KeyVaultId':'<Key Vault Resource Id>', 'KeyVaultSecretName':'<Secret Name>', 'serverFarmId':'<Server Farm (App Service Plan) resource Id>'}}"
I have created a string that populates all the required fields:
putparm=$resolved_armapi" \"{'Location':'$resolved_locationid','Properties':{'KeyVaultId':'$resolved_keyvaultid','KeyVaultSecretName':'$certname','serverFarmId':'$resolved_farmid'}}"\"
When I echo the output of the variable putparm, the result looks as expected (names/IDs X-ed out):
/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.Web/certificates/XXXX-XXXXX-XXXXX?api-version=2016-03-01 "{'Location':'Central US','Properties':{'KeyVaultId':'/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.KeyVault/vaults/XXXXXXXX','KeyVaultSecretName':'XXXX-XXXXX-XXXXX','serverFarmId':'/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.Web/serverfarms/ServicePlan59154b1c-XXXX'}}"
When I run armclient put $putparm in the script, I get this error:
"error": {
"code": "InvalidRequestContent",
"message": "The request content was invalid and could not be deserialized: 'Unterminated string. Expected delimiter: \". Path '',
line 1, position 21.'." }
But when I take the output of the $putparm variable and run the command "manually" on the server, it works.
I guess it is something to do with the way Linux stores the variables, and that the API is requesting JSON (or something...).
Happy for any help.
The way you define your variable putparm is wrong.
It is likely interpreted as a literal string and not as an object. Note that a simple string, like "hello", is valid JSON data, but it is probably not what your server is expecting.
You should quote your variable correctly:
putparm="{\"Location\":\"$resolved_locationid\",\"Properties\":{\"KeyVaultId\":\"$resolved_keyvaultid\",\"KeyVaultSecretName\":\"$certname\",\"serverFarmId\":\"$resolved_farmid\"}}"
and use it like this:
armclient put "$resolved_armapi" "$putparm"
Here is the situation: JMeter results were recorded in a .csv quite a long time ago (around 6 months). The JMeter version has not changed (3.0), but the config files have. Now I've been trying to generate a report from the old CSV as usual, using
jmeter.bat -g my.csv -o reportFolder
Also, to work around the incompatible configurations, I created a file named local-saveservice.properties and passed it via the -q command-line option. By adjusting the settings in this file, I managed to get past several errors like "column number mismatch" or "No column xxx found in sample metadata", but I still couldn't generate the report successfully, and here is the trouble:
File 'D:\ .....\load_NSI_stepping3_2017-03-24-new.csv' does not contain the field names header, ensure the jmeter.save.saveservice.* properties are the same as when the CSV file was created or the file may be read incorrectly
An error occurred: Error while processing samples:Consumer failed with message :Consumer failed with message :Consumer failed with message :Consumer failed with message :Error in sample at line:1 converting field:Latency at column:11 to:long, fieldValue:'UTF-8'
However, in my .csv, column number 11 has the header "Latency" and contains numeric values, while 'UTF-8' is the content of the next column, "Encoding".
Here are first lines of my .csv
timeStamp,elapsed,label,responseCode,responseMessage,success,bytes,grpThreads,allThreads,URL,Latency,Encoding,SampleCount,ErrorCount,Connect,threadName
1490364040950,665,searchItemsInCatalogRequest,200,OK,true,25457,1,1,http://*.*.*.*:9080/em/.....Service,654,UTF-8,1,0,9,NSI - search item in catalog
1490364041620,507,searchItemsInCatalogRequest,200,OK,true,25318,1,1,http://*.*.*.*:9080/em/.....Service,499,UTF-8,1,0,0,NSI - search item in catalog
1490364042134,495,searchItemsInCatalogRequest,200,OK,true,24266,2,2,http://*.*.*.*:9080/em/.....Service,487,UTF-8,1,0,0,NSI - search item in catalog
1490364043595,563,searchItemsInCatalogRequest,200,OK,true,24266,2,2,http://*.*.*.*:9080/em/.....Service,556,UTF-8,1,0,6,NSI - search item in catalog
PS: I had to add threadName manually, because it was not saved during the initial data recording (my knowledge of JMeter was even less than it is now :) )
First, you should update to JMeter 3.3, as bugs in report generation have been fixed in the three versions released after 3.0.
Second, add to your command line:
jmeter.bat -p <path to jmeter.properties> -q <path to your custom.properties used when you generated file> -g my.csv -o reportFolder
Ensure that in "your custom.properties" you set to false all jmeter.save.saveservice.* fields that did not yet exist at the time you generated the file.
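As a sketch (double-check each flag against the header line of your CSV; property names here are from JMeter's saveservice settings), "your custom.properties" might contain something like:

```
# Columns present in the old CSV:
jmeter.save.saveservice.connect_time=true
jmeter.save.saveservice.thread_name=true
# Columns the old CSV lacks (e.g. added in later JMeter versions):
jmeter.save.saveservice.sent_bytes=false
jmeter.save.saveservice.idle_time=false
```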
I'm using rsyslog to watch over my syslogs and send them over to Logstash+Kibana.
My syslogs messages are logged as JSON. They can look something like this:
{"foo":"bar", "timegenerated": 43843274834}
My rsyslog configuration is as follows:
module(load="omelasticsearch")
#define a template to print all fields of the message
template(name="messageToES" type="list" option.json="on") {
property(name="msg")
}
*.* action(type="omelasticsearch"
server="localserverhere"
serverport="80"
template="messageToES")
Kibana is fine, since if I run a curl command against it, it receives the record. The command is below:
curl -XPOST myserver/test/bar -d '{"test": "baz", "timegenerated":1447145221519}'
When I run rsyslog and point it at a dummy server, I can see the incoming requests with valid JSON. However, when I point it back at my logstash server, nothing shows up in Logstash or Kibana.
Does anyone know how to send syslogs as json into Kibana/logstash?
I've never used it, but it looks like you are missing things from your config file. The docs have a pretty thorough example:
module(load="omelasticsearch")
template(name="testTemplate"
type="list"
option.json="on") {
constant(value="{")
constant(value="\"timestamp\":\"") property(name="timereported" dateFormat="rfc3339")
constant(value="\",\"message\":\"") property(name="msg")
constant(value="\",\"host\":\"") property(name="hostname")
constant(value="\",\"severity\":\"") property(name="syslogseverity-text")
constant(value="\",\"facility\":\"") property(name="syslogfacility-text")
constant(value="\",\"syslogtag\":\"") property(name="syslogtag")
constant(value="\"}")
}
action(type="omelasticsearch"
server="myserver.local"
serverport="9200"
template="testTemplate"
searchIndex="test-index"
searchType="test-type"
bulkmode="on"
queue.type="linkedlist"
queue.size="5000"
queue.dequeuebatchsize="300"
action.resumeretrycount="-1")
Based on what you are trying to do, it looks like you need to plug in localserverhere where it shows myserver.local. It also looks like you have ES accepting stuff on port 80, so you'd put in 80 instead of 9200.
I'm trying to generate a JSON log from nginx.
I'm aware of solutions like this one but some of the fields I want to log include user generated input (like HTTP headers) which need to be escaped properly.
I'm aware of the nginx changelog entries from Oct 2011 and May 2008 that say:
*) Change: now the 0x7F-0x1F characters are escaped as \xXX in an
access_log.
*) Change: now the 0x00-0x1F, '"' and '\' characters are escaped as \xXX
in an access_log.
but this still doesn't help since \xXX is invalid in a JSON string.
I've also looked at the HttpSetMiscModule module which has a set_quote_json_str directive, but this just seems to add \x22 around the strings which doesn't help.
Any idea for other solutions to log in JSON format from nginx?
Finally, it looks like we have a good way to do this with vanilla nginx, without any modules. Just define:
log_format json_combined escape=json
'{'
'"time_local":"$time_local",'
'"remote_addr":"$remote_addr",'
'"remote_user":"$remote_user",'
'"request":"$request",'
'"status": "$status",'
'"body_bytes_sent":"$body_bytes_sent",'
'"request_time":"$request_time",'
'"http_referrer":"$http_referer",'
'"http_user_agent":"$http_user_agent"'
'}';
Note that escape=json was added in nginx 1.11.8.
http://nginx.org/en/docs/http/ngx_http_log_module.html#log_format
You can try https://github.com/jiaz/nginx-http-json-log - an additional module for nginx.
You can try to use:
- an additional module for nginx: nginx-http-json-log
- any language, as done in nginx-json-logformat, with the example /etc/nginx/conf.d/json_log.conf
- a version of the nginx HTTP stub status module that outputs in JSON format
PS:
The if parameter (1.7.0) enables conditional logging. A request will not be logged if the condition evaluates to “0” or an empty string. For example, to skip logging requests based on the referer, map it to a new variable (you cannot redefine a built-in variable like $http_referer in a map):
map $http_referer $loggable {
    ~example\.com 0;
    default 1;
}
access_log /path/to/access.log combined if=$loggable;
It’s a good idea to use a tool such as https://github.com/zaach/jsonlint to check your JSON data. You can test the output of your new logging format and make sure it’s real-and-proper JSON.
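As a quick sanity check without an external tool, you can feed one rendered line through a JSON parser. A minimal sketch (the sample line below is hypothetical, shaped like the escape=json format above would render it):

```python
import json

# A hypothetical access-log line as the escape=json format would render it;
# note the escaped double quotes inside the user agent value.
line = ('{"time_local":"10/Oct/2016:13:55:36 +0000",'
        '"remote_addr":"127.0.0.1",'
        '"request":"GET /index.html HTTP/1.1",'
        '"status": "200",'
        '"http_user_agent":"Mozilla/5.0 \\"quoted\\" agent"}')

entry = json.loads(line)  # raises ValueError if the line is not valid JSON
print(entry["status"])            # -> 200
print(entry["http_user_agent"])   # embedded quotes survive the round trip
```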
I have to take inputs from a CSV file which has a single comma-separated line of a million 6-character strings. I need to read these one by one and run a JMeter SOAP POST query with each value attached.
What I need to know is how to configure this CSV file so the data is read one value at a time and passed to the SOAP query. I tried adding a CSV Data Set Config with a While Controller in JMeter, but it doesn't seem to work. I also need to know the condition to use in the While Controller.
Consider using following Test Plan configuration:
Test Plan
Thread Group
While Controller
Counter
SOAP Post Query
CSV Data Set Config
Relevant configuration:
While Controller
Condition: ${__javaScript(${N}<1000000)}
Counter
Start: 1
Increment: 1
Maximum: 1000000
Reference Name: N
SOAP/XML-RPC Request
everything as per your current use case
CSV Data Set Config
Recycle on EOF: true
Stop thread on EOF: false
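One caveat: CSV Data Set Config advances one line per iteration, so a file holding all million values on a single line will not step through them. A small preprocessing sketch (file names are placeholders for illustration) that rewrites the file to one value per line:

```python
# Rewrite a one-line, comma-separated file into one value per line so
# JMeter's CSV Data Set Config can advance through it row by row.
# File names below are placeholders for illustration.
def explode_csv(src_path, dst_path):
    with open(src_path) as src:
        values = src.read().strip().split(",")
    with open(dst_path, "w") as dst:
        dst.write("\n".join(values))
    return len(values)

# Demo with a tiny stand-in for the real million-value file.
with open("input.csv", "w") as f:
    f.write("abc123,def456,ghi789")
print(explode_csv("input.csv", "values.csv"))  # -> 3
```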
For my scenario, no While Controller was required.
I've done the following Test Plan configuration with JMeter 2.9 & JDK 1.6:
Test Plan
Thread Group
SOAP/XML-RPC Request
CSV Data Set Config
Thread Group:
Number of Threads: <as per your requirement>
Ramp-Up Period (in seconds): <as per your requirement>
Loop Count: 1
SOAP/XML-RPC Request:
URL: http://<The WSDL of your web service>?wsdl
Send SOAPAction: <name of action to be performed>
Soap/XML-RPC Data:
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:fac="<your URL>">
<soapenv:Header/>
<soapenv:Body>
<fac:abc>
<arg0>
<var1>${var1}</var1>
<var2>${var2}</var2>
</arg0>
</fac:abc>
</soapenv:Body>
</soapenv:Envelope>
CSV Data Set Config:
Filename: xyz.csv
Variable Names (comma-delimited): var1,var2
Delimiter: ,
Stop Thread on EOF: True
Sharing mode: All threads
Other parameters as per your requirement.
The CSV file should be in the same folder where you saved the Test Plan. Otherwise, you need to specify a relative path.
xyz.csv:
123,456
789,111
222,333