I am working on a project to log connection tracking events with ulogd2. I want to know if there is any way to send messages to a remote host in JSON format. So far I have been able to save the messages to a JSON file on the local server, but I don't want to keep them on the local machine, I just want to send them. Alternatively, is there a way to send the file and then delete it?
I would really appreciate your help.
Usually syslog logs are sent to remote servers via one of the syslog network protocols; rsyslog is often used for this purpose. On the receiving end you can also use rsyslog to collect them into the remote server's syslog.
On the receiving side you can also use tools like fluentd to collect the syslog messages and write them to a file as JSON, or do a number of other things with them. An alternative tool for this is Filebeat.
You can also install tools other than rsyslog on the server creating the logs and configure them to ship the logs via some method other than the syslog protocol, if you so desire. Filebeat can do this, and so can Fluent Bit (a lightweight version of fluentd).
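As a sketch of the rsyslog approach: assuming ulogd2 writes its JSON to /var/log/ulogd.json (the file path, target host, and port here are all assumptions, not values from your setup), rsyslog's imfile module can tail that file and forward each line to a remote collector over TCP:

```conf
# /etc/rsyslog.d/ulogd-forward.conf  (hypothetical drop-in file)
module(load="imfile")

# Tail the JSON file that ulogd2 writes locally
input(type="imfile"
      File="/var/log/ulogd.json"
      Tag="ulogd-json"
      Severity="info")

# Forward every tailed line to the remote collector, then stop local processing
if $syslogtag startswith 'ulogd-json' then {
    action(type="omfwd" target="logs.example.com" port="514" protocol="tcp")
    stop
}
```

With the `stop` statement the forwarded lines never reach the local log files, which covers the "send it without saving it" requirement on the rsyslog side.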
I am trying to configure a user defined parameter on a Windows host. All my hosts are configured with PSK encryption and Zabbix server is able to get data without any issues.
However, I cannot figure out how to use zabbix_get manually with PSK encryption enabled.
zabbix_get -s x.x.x.x -p 10050 -k "internet.connection.check" --tls-connect=psk --tls-psk-identity="name" --tls-psk-file=cannot find any psk file on zabbix server
The problem is I cannot locate any PSK file on the zabbix server. Can I pass the PSK somehow?
The serverside PSK is configured in the GUI and stored in the database.
The Zabbix agent stores the PSK in a file.
I see 3 options:
1. Manually create a PSK file. Remember that a change of the key must then be made in the GUI, on the agent, and in your extra file.
2. Make a script that reads the key from the database. Remember that direct access to an application's database is usually forbidden and can cause compatibility issues after the application is updated; read-only access should be possible.
3. Use the same key for all your agents. When you install a Zabbix agent on the Zabbix server itself (allowing you to monitor the server), you do get a PSK file in the normal place.
I wouldn't try to use an API or some smart script during discovery; that makes the solution hard to maintain. I withdraw that remark, though, if you have thousands of servers to monitor and a team working with Zabbix.
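A minimal sketch of option 1: copy the PSK shown in the GUI into a local file and point zabbix_get at it. The hex string below is a placeholder, not a real key; the file path is an assumption.

```shell
# Put the PSK from the Zabbix GUI into a file (placeholder value shown --
# a Zabbix PSK is a hex string of at least 32 characters)
printf '1f87b595725ac58dd977beef254b3e47\n' > /tmp/agent.psk

# Keep the key file private
chmod 600 /tmp/agent.psk

# The manual query from the question then becomes:
# zabbix_get -s x.x.x.x -p 10050 -k "internet.connection.check" \
#     --tls-connect=psk --tls-psk-identity="name" --tls-psk-file=/tmp/agent.psk
```

The identity passed with --tls-psk-identity must match the PSK identity configured for that host in the GUI, or the TLS handshake will fail.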
I would like to use Dokku for deploying my Rails apps, but I cannot find any method that allows me to send the logs to Zabbix. Does anyone have ideas? Thanks!
You can't send logs directly to Zabbix, because it is not a log collector.
You need a Zabbix agent installed on your app machine to analyze logs and trigger events, or, if you are using a PaaS, you should implement web scenarios on your Zabbix server to check specific URLs.
If you want to collect logs instead, you could implement an ELK stack.
Elasticsearch has its own alerting module, but it's paid, and IMHO Zabbix alerting is far better.
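To sketch the agent approach: Zabbix agents can tail a file with the built-in log[] item key (it must be configured as an active check). A hypothetical item key for a Rails production log, matching only lines that contain ERROR, might look like:

```conf
log[/var/www/myapp/log/production.log,ERROR]
```

The file path here is an assumption; each matching line becomes an item value you can then write triggers against.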
I'm writing an application which delivers data from remote devices over an HTTP API. These devices are on a mobile data connection and have limited resources.
I wish to receive custom monitoring data over the HTTP API, relying on the security model designed in the application, and push that data to Zabbix directly (or indirectly) from node.js. I do not wish to use Zabbix Agent on the remote devices.
I see that I can use zabbix_sender to send data to a Zabbix server containing a pre-configured host. This works great. I intend to deliver monitoring data over my custom API, and when received give this data to zabbix_sender inside the server network.
The problem is there are many devices in the field and more are being added all the time.
TL;DR:
When zabbix_sender provides a custom hostname which doesn't exist in Zabbix already, it fails.
I would like to auto-add discovered hosts, based upon new hostnames from zabbix_sender. How would I do this?
Also, extra respect if anyone can give examples of how to avoid zabbix_sender and send data directly from node.js to the Zabbix server. I mean: suggest an NPM package that you have experience using. (Update: Found working node.js package here: https://www.npmjs.com/package/node-zabbix-sender)
Zabbix configuration: I'm learning with Zabbix 2.4 installed in Docker, with no custom configuration, from this Docker Hub image: https://hub.docker.com/r/zabbix/zabbix-2.4/
Probably the best approach would be to use the Zabbix API to create hosts directly.
Alternatively, you could set up an action and emulate an active agent connection, which would make Zabbix create the host via active agent auto-registration.
You could also use low level discovery (LLD) to send in JSON, which would result in hosts/items being created, based on prototypes.
In all of these cases you have to wait for one minute (by default) for the hosts to appear in the Zabbix cache, then you can send the data.
Also note that Zabbix 2.4 is not supported anymore, it will receive no fixes - it is not a "long-term support" release.
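A sketch of the first option, creating the host through the Zabbix JSON-RPC API with curl before pushing data with zabbix_sender. The URL, auth token, host name, and group ID below are all placeholders you would substitute with your own values (the token comes from a prior user.login API call):

```shell
ZABBIX_URL="http://zabbix.example.com/api_jsonrpc.php"   # placeholder
AUTH_TOKEN="0424bd59b807674191e7d77572075f33"            # placeholder token
NEW_HOST="device-0042"                                   # hostname from your API

# Build the host.create request; interface and group IDs are assumptions
PAYLOAD=$(cat <<EOF
{
  "jsonrpc": "2.0",
  "method": "host.create",
  "params": {
    "host": "$NEW_HOST",
    "interfaces": [{"type": 1, "main": 1, "useip": 1,
                    "ip": "127.0.0.1", "dns": "", "port": "10050"}],
    "groups": [{"groupid": "2"}]
  },
  "auth": "$AUTH_TOKEN",
  "id": 1
}
EOF
)
echo "$PAYLOAD"

# Send it (commented out so this sketch has no network side effects):
# curl -s -H 'Content-Type: application/json-rpc' -d "$PAYLOAD" "$ZABBIX_URL"
```

Once the host (and an item of type "Zabbix trapper") exists and has propagated to the server cache, zabbix_sender calls for that hostname will be accepted.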
In my application, I have used Apache HttpClient. Now I would like to monitor the requests and responses by capturing the data transmitted by the HttpClient. Is it possible to know which ports are opened by the HttpClient? I am using Linux, so can I use netstat to see which ports are opened by my Java application's process?
You can get detailed info about connection management by turning on context logging, as described here.
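For the netstat part of the question, a sketch of inspecting the sockets a Java process holds on Linux ("MyApp" is a hypothetical placeholder for your application's main class or jar name):

```shell
# Find the PID of the Java process (pattern is a placeholder)
PID=$(pgrep -f 'java.*MyApp' | head -n1)

if [ -n "$PID" ]; then
    # ss (the modern netstat replacement) lists TCP sockets with owning process
    ss -tnp | grep "pid=$PID," || echo "no open TCP sockets for $PID"
    # classic netstat equivalent:
    # netstat -tnp 2>/dev/null | grep "$PID/java"
else
    echo "no matching Java process found"
fi
```

This shows which local ports the process has open, but it won't show you the request/response bodies; for that, the context logging above (or a capture tool like tcpdump) is the right layer.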
Does anyone know of a utility that will run on the Apple Mac, listen for changes to a set of tables in a local SQLite database, and, via some configuration mechanism, HTTP POST those ongoing changes as JSON requests to a remote HTTP or Apache Camel server?
Thanks, Martin.