Send JSON from rsyslog to Kibana

I'm using rsyslog to watch over my syslogs and send them over to Logstash+Kibana.
My syslog messages are logged as JSON. They can look something like this:
{"foo":"bar", "timegenerated": 43843274834}
My rsyslog configuration is as follows:
module(load="omelasticsearch")
# define a template to print all fields of the message
template(name="messageToES" type="list" option.json="on") {
    property(name="msg")
}
*.* action(type="omelasticsearch"
           server="localserverhere"
           serverport="80"
           template="messageToES")
The Kibana side is fine: if I run a curl command against it, it receives the record. The command is below:
curl -XPOST myserver/test/bar -d '{"test": "baz", "timegenerated":1447145221519}'
When I run rsyslog and point it at a dummy server, I can see the incoming requests with valid JSON. However, when I point it back at my Logstash server, nothing shows up in Logstash or Kibana.
Does anyone know how to send syslogs as JSON into Kibana/Logstash?

I've never used it, but it looks like you are missing things from your config file. The docs have a pretty thorough example:
module(load="omelasticsearch")

template(name="testTemplate"
         type="list"
         option.json="on") {
    constant(value="{")
    constant(value="\"timestamp\":\"")    property(name="timereported" dateFormat="rfc3339")
    constant(value="\",\"message\":\"")   property(name="msg")
    constant(value="\",\"host\":\"")      property(name="hostname")
    constant(value="\",\"severity\":\"")  property(name="syslogseverity-text")
    constant(value="\",\"facility\":\"")  property(name="syslogfacility-text")
    constant(value="\",\"syslogtag\":\"") property(name="syslogtag")
    constant(value="\"}")
}

action(type="omelasticsearch"
       server="myserver.local"
       serverport="9200"
       template="testTemplate"
       searchIndex="test-index"
       searchType="test-type"
       bulkmode="on"
       queue.type="linkedlist"
       queue.size="5000"
       queue.dequeuebatchsize="300"
       action.resumeretrycount="-1")
Based on what you are trying to do, it looks like you need to plug in localserverhere where it shows myserver.local. It also looks like you have ES accepting stuff on port 80, so you'd put in 80 instead of 9200.
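Putting that together, a minimal sketch of the action block adapted to your setup (the searchIndex and searchType values here are placeholders I made up, not values from your environment):
action(type="omelasticsearch"
       server="localserverhere"
       serverport="80"
       template="testTemplate"
       searchIndex="syslog-index"
       searchType="events"
       bulkmode="on"
       action.resumeretrycount="-1")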

Related

Adding Variable to JSON CURL

I am trying to add a text string to a curl request with JSON; see below for the $test variable. When I submit this, the application sees the literal string $test.
--data "{"fields": {"project": {"key": "Ticket"},"summary":"Account - Missing Tags","description":"The following AWS assets do not have the minimally required tags................ $test ","issuetype": {"name": "Service"}}}}"
I have tried various methods such as "'$Test'" and that hasn't worked either. Can you help explain how to accomplish this?
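One commonly suggested approach (a sketch, assuming bash, a hypothetical $url for the endpoint, and a $test value containing no JSON-special characters) is to single-quote the JSON body and splice the variable in through a brief double-quoted section:
test="extra tag details"
curl -X POST "$url" \
     -H "Content-Type: application/json" \
     --data '{"fields": {"project": {"key": "Ticket"}, "summary": "Account - Missing Tags", "description": "The following AWS assets do not have the minimally required tags................ '"$test"'", "issuetype": {"name": "Service"}}}'
The single quotes keep the inner double quotes literal, while the '"$test"' sequence leaves the single-quoted string just long enough for the shell to expand the variable.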

ARM.Template from bash-script. Unterminated string. Expected delimiter:

I am writing a bash script for uploading a certificate from a Linux server to Azure Key Vault using the "armclient".
I follow this guide on how to use the armclient:
https://blogs.msdn.microsoft.com/appserviceteam/2016/05/24/deploying-azure-web-app-certificate-through-key-vault/
The command I want to perform is this:
ARMClient.exe PUT /subscriptions/<Subscription Id>/resourceGroups/<Server Farm Resource Group>/providers/Microsoft.Web/certificates/<User Friendly Resource Name>?api-version=2016-03-01 "{'Location':'<Web App Location>','Properties':{'KeyVaultId':'<Key Vault Resource Id>', 'KeyVaultSecretName':'<Secret Name>', 'serverFarmId':'<Server Farm (App Service Plan) resource Id>'}}"
I have created a string that populates all the fields required:
putparm=$resolved_armapi" \"{'Location':'$resolved_locationid','Properties':{'KeyVaultId':'$resolved_keyvaultid','KeyVaultSecretName':'$certname','serverFarmId':'$resolved_farmid'}}"\"
When I echo the output of the variable putparm, the result looks as expected (X-ed-out names/ids):
/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.Web/certificates/XXXX-XXXXX-XXXXX?api-version=2016-03-01 "{'Location':'Central US','Properties':{'KeyVaultId':'/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.KeyVault/vaults/XXXXXXXX','KeyVaultSecretName':'XXXX-XXXXX-XXXXX','serverFarmId':'/subscriptions/f073334f-240f-4261-9db5-XXXXXXXXXXXXX/resourceGroups/XXXXXXXX/providers/Microsoft.Web/serverfarms/ServicePlan59154b1c-XXXX'}}"
When I run armclient put $putparm in the script, I get this error:
"error": {
"code": "InvalidRequestContent",
"message": "The request content was invalid and could not be deserialized: 'Unterminated string. Expected delimiter: \". Path '',
line 1, position 21.'." }
But when I take the output of the $putparm variable and run the command "manually" on the server, it works.
I guess it's something to do with the way Linux stores the variables, and that the API is requesting JSON (or something...).
Happy for any help.
The way you define your variable putparm is wrong.
It is likely interpreted as a literal string and not as an object. Note that a simple string like "hello" is valid JSON data, but it is probably not what your server is expecting.
You should quote your variable correctly:
putparm="{\"Location\":\"$resolved_locationid\",\"Properties\":{\"KeyVaultId\":\"$resolved_keyvaultid\",\"KeyVaultSecretName\":\"$certname\",\"serverFarmId\":\"$resolved_farmid\"}}"
and use it like this:
armclient put "$resolved_armapi" "$putparm"
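For what it's worth, a minimal demonstration of why the unquoted expansion breaks (the value here is an assumed example containing a space):
putparm='{"Location":"Central US"}'
printf '%s\n' $putparm      # unquoted: split at the space into two arguments
printf '%s\n' "$putparm"    # quoted: passed as one intact JSON argument
Without the double quotes, the shell word-splits the JSON at the space in "Central US", so armclient receives truncated, unterminated JSON, which matches the error you see.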

NXLog: Json input to GELF UDP Output

We have a setup where a program logs to a .json file, in a format that follows the GELF specification.
Currently this is sent to a Graylog2 server using HTTP. This works, but due to the nature of HTTP there's a significant latency, which is an issue if there is a large amount of log messages.
I want to change the HTTP delivery method to UDP, in order to just 'fire and forget'.
The logs are written to files like this:
{ "short_message": "<message>", "host": "<host>", "full_message": "<message>", "_extraField1": "<value>", "_extraField2": "<value>", "_extraField3": "<value>" }
Current configuration is this:
<Extension json>
    Module  xm_json
</Extension>

<Input jsonLogs>
    Module         im_file
    File           '<File Location>'
    PollInterval   5
    SavePos        True
    ReadFromLast   True
    Recursive      False
    RenameCheck    False
    CloseWhenIdle  True
</Input>

<Output udp>
    Module      om_udp
    Host        <IP>
    Port        <Port>
    OutputType  GELF_UDP
</Output>
With this setup, part of the JSON log message is added to the "message" field of a GELF message and sent to the server.
I've tried adding the line Exec parse_json(), but this simply results in all fields other than short_message and full_message being excluded.
I'm unsure how to configure this correctly. Even just having the complete log message added to a field is preferable, since I can add an extractor on the server side.
You'd need Exec parse_json() in order for GELF_UDP to generate proper output, but it is unclear from the question what the exact issue is with message and full_message/short_message.
Another option you could try is to simply ship the logs via om_tcp. In that case you won't need OutputType GELF_TCP, since the input is already formatted that way.
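For reference, a minimal sketch of the input block with parsing enabled (assuming the file holds one JSON object per line, as in your example):
<Input jsonLogs>
    Module        im_file
    File          '<File Location>'
    PollInterval  5
    SavePos       True
    Exec          parse_json();
</Input>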

Prolog Json handle

server() :- http_server(http_dispatch, [port(45000)]).

serverTest(Request) :-
    http_read_json(Request, JSONIn),
    json_to_prolog(JSONIn, PrologIn),
    format(PrologIn).
I have this Prolog program but I can't handle the PrologIn variable very well. I get this error:
Type error: `text' expected, found `json([id=3])'
I know that means I can't use format with PrologIn but how do I handle the information inside? Meaning, how do I extract the "id = 3" info ?
Edit:
This is the full program
(If I load more modules than necessary, it's because I'm doing other things with the program and didn't filter them down to this specific case.)
:- use_module(library(http/thread_httpd)).
:- use_module(library(http/http_dispatch)).
:- use_module(library(http/http_parameters)).
:- use_module(library(http/http_ssl_plugin)).
:- use_module(library(http/http_open)).
:- use_module(library(http/http_client)).
:- use_module(library(http/http_json)).
:- use_module(library(http/json)).
:- http_handler('/test', serverTest, []).
The rest is the two predicates shown before the edit.
I test this by first going to the Prolog console and typing "server()." to start the server. Then I use Postman in the following way: select POST, set a header with key Content-Type and value application/json, then, in the Body, select raw (JSON (application/json)) and write this in the text area:
{
"id" : 3
}
This is how I test it; I want to be able to handle the id=3 information in the Prolog predicate (serverTest).
You really need to show the full program, how you start the server, and how you query it. Otherwise, one can only guess.
Anyway, there are a few problems with this: format(PrologIn).
First, as the error tells you, this is a term, and format does formatted output of text. At the very least, you would have to write:
format("~w", [PrologIn])
See the documentation on format/2. Basically, if your term looks like this: json([id=3]), you should get json([id=3]) printed.
Now the next question: where would this print to? When you start a server with the HTTP package libraries, input and output are redirected so that you can read requests and write responses. There are many examples in the library documentation.
Then the next thing: how to get the 3 out of there. If you additionally load the http_json plugin module:
:- use_module(library(http/http_json)).
then you can directly use, as shown in the code example,
http_read_json_dict(Request, DictIn)
Now DictIn is a "dict" probably looking like this: _{id:3}. See the documentation on dicts.
You don't have to use dicts; you can just inspect the json term using normal pattern matching and list processing. Dicts are just easier (as in, less typing) for some use cases.
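For example, a minimal sketch of pulling the id out of the classic term representation (get_id/2 is just a name I made up for illustration):
% PrologIn looks like json([id=3]): a json/1 term wrapping a list of Key=Value pairs
get_id(json(Fields), Id) :-
    memberchk(id = Id, Fields).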
Edit
Here is a minimal example that works for me. This is the server code:
:- use_module(library(http/thread_httpd)).
:- use_module(library(http/http_dispatch)).
:- use_module(library(http/http_json)).
:- http_handler('/test', test, []).
server :-
    http_server(http_dispatch, [port(45000)]).

test(Request) :-
    http_read_json_dict(Request, JSON),
    format(user_error, "~w~n", [JSON.id]).
From the top level, after consulting the file, I run:
?- server.
% Started server at http://localhost:45000/
true.
At that point, I use curl from another command line like this:
$ curl -H "Content-Type: application/json" -d '{"id":3}' http://localhost:45000/test
curl: (52) Empty reply from server
and I get the 3 printed out on the Prolog toplevel where the server is running:
?- 3
This is of course not ideal, so I replace the last line in the server code with the following:
reply_json(_{foobar:JSON.id}).
and then on the Prolog toplevel, where the server is running, I use make/0:
?- make.
% Updating index for library .../lib/swipl-7.3.35/library/
% ... compiled 0.00 sec, 0 clauses
true.
Now, when I again use curl:
$ curl -H "Content-Type: application/json" -d '{"id":3}' http://localhost:45000/test
{"foobar":3}
This is all you need!
I don't know your Prolog web library, but I'm guessing http_read_json already binds its second argument to a Prolog term, so the call to json_to_prolog is unnecessary and incorrect.
Try
serverTest(Request) :- http_read_json(Request, JSONIn), format(JSONIn).
If you want to isolate the id number from what you receive, this could be as simple as
serverTest(Request) :-
    http_read_json(Request, json([id=X])),
    % ... do something with the X value here ...

couchDB restore error : Missing JSON list of 'docs'

When copying a database from one host to another, I get the following error: Missing JSON list of 'docs'
Here is what I do :
source> curl -X GET http://127.0.0.1:5984/cozy/_all_docs?include_docs=true > cozy.dump
destination> curl -X PUT http://127.0.0.1:5984/cozy
{"ok":true}
destination> curl -d @cozy.dump -H "Content-Type: application/json" -X POST http://localhost:5984/cozy/_bulk_docs
{"error":"bad_request","reason":"Missing JSON list of 'docs'"}
Any idea?
Thanks!
This is, indeed, a problem with versions.
Fortunately it is fairly easy to fix: just change the first line in the dump, e.g.
{"total_rows": 8244, "offset": 0, "rows": [
to
{"docs": [
The dumps can now be used in the later versions.
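For example, a quick way to do that rewrite from the shell (a sketch, assuming the header sits alone on the first line of the dump, which is how CouchDB writes it):
sed -i '1s/.*/{"docs": [/' cozy.dump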
I know this is an old question, but I am still posting an answer in case someone else is looking for the solution. The _bulk_docs API accepts the request in a certain form:
{"docs": [{}, {}, {}]}
The docs key must contain an array of documents to be bulk inserted. What the OP did with
curl -X GET http://127.0.0.1:5984/cozy/_all_docs?include_docs=true > cozy.dump
was simply store the CouchDB response of the form
{
    "total_rows": 4,
    "offset": 0,
    "rows": [....]
}
into the cozy.dump file. As we have seen above, this file is not in a form that can be consumed by the _bulk_docs API. Hence the error:
{"error":"bad_request","reason":"Missing JSON list of 'docs'"}
CouchDB needs a JSON list of docs to perform the bulk insert.
Another point to note here is that if you supply _id and _rev parameters, CouchDB performs a bulk update rather than a bulk insert. If you just want to copy one database to another, use replication: http://wiki.apache.org/couchdb/Replication
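For example, a minimal sketch of a one-shot replication through the _replicate endpoint (source-host is a placeholder for the source machine):
curl -X POST http://127.0.0.1:5984/_replicate \
     -H "Content-Type: application/json" \
     -d '{"source": "http://source-host:5984/cozy", "target": "cozy", "create_target": true}'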