Is it possible to dynamically create items and triggers? - zabbix

I have a file with a list of processes on each host. This list is different on each host. Can I dynamically create items and triggers in Zabbix that check each process on each host?

This is quite an open-ended question, but you can use:
Zabbix API
XML import
low-level discovery (LLD)
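For the per-host process file, low-level discovery is usually the closest fit: a UserParameter or external script reads the file and prints discovery JSON, and item/trigger prototypes are then created from it for every process. A minimal sketch of such a discovery script, assuming a hypothetical file /etc/zabbix/proc_list.txt with one process name per line and an LLD macro named {#PROCNAME}:

#!/usr/bin/env python3
# Hypothetical LLD discovery script: reads one process name per line and
# prints the JSON structure Zabbix low-level discovery expects.
import json

PROC_LIST = "/etc/zabbix/proc_list.txt"  # assumed location of the per-host file

def discover():
    with open(PROC_LIST) as f:
        names = [line.strip() for line in f if line.strip()]
    # Each entry becomes a set of LLD macros usable in item/trigger prototypes.
    return {"data": [{"{#PROCNAME}": name} for name in names]}

if __name__ == "__main__":
    print(json.dumps(discover()))

An item prototype such as proc.num[{#PROCNAME}] plus a trigger prototype on that key would then yield one item and one trigger per process in the file.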

Related

Send SQL data to newly created stack

I have a problem with my docker microservice architecture and different approaches to solving it. I hope you can help me with your input.
The question: what is the best way to extract the data from the controllerDB and send this to the stack at creation time?
Given this architecture:
controllerService:
Connected to a docker socket to create stacks from a given compose file. Implemented in Python 3.7 with pydocker.
controllerDB:
Data for the new stacks. MySQL database
The problem: every created stack needs a different subset of data from the database. The table which stores the data has columns that are not needed by the stack; these columns only indicate that the data is already used by a stack.
The approaches
create a mysql dump and send this to the mysql initdb folder.
create a docker config file by a python function and add it to the stack
send the data directly from controllerDB to stackDB
some other approach I don't see
Ideas
Is it possible to dump a database by a SQL query? I only know that I can select the rows to dump with the --where= option of mysqldump, but I see no way to exclude columns.
controllerService sends a query to controllerDB. The result set is written into a config file and sent to the stack. The service then has to insert it into the database on its own.
I don't know if this is possible.
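A minimal sketch of the config-file idea, assuming a hypothetical stack_data table in controllerDB of which the new stack only needs the name and value columns; host, credentials, table names and paths are all placeholders:

# Sketch: pull only the needed columns from controllerDB and write them as an
# init SQL file the new stack's MySQL container can load from
# /docker-entrypoint-initdb.d, or that gets attached as a docker config.
import mysql.connector  # assumes mysql-connector-python is installed

def export_subset(stack_id, out_path):
    conn = mysql.connector.connect(host="controllerdb", user="controller",
                                   password="secret", database="controller")
    cur = conn.cursor()
    # Only the columns the stack needs are selected; the bookkeeping columns
    # (e.g. a hypothetical used_by_stack flag) are simply left out of the query.
    cur.execute("SELECT name, value FROM stack_data WHERE stack_id = %s",
                (stack_id,))
    with open(out_path, "w") as f:
        for name, value in cur.fetchall():
            f.write("INSERT INTO stack_data (name, value) "
                    f"VALUES ('{name}', '{value}');\n")
    conn.close()

export_subset(42, "init_stack_42.sql")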
Regards
MarLei

Multiple Zabbix Dashboards Integration

Can we integrate multiple Zabbix dashboards into a single dashboard for monitoring?
For example, I have an instance in location X and another in location Y, and I need to show management a dashboard with all the available triggers.
Thanks in advance.
I suggest a single Grafana with multiple Zabbix datasources.
You can set up a single dashboard or multiple dashboards with graphs and triggers from multiple sources, quick host/item/server selection with templating, etc.
Below are the two solutions I found to integrate different Zabbix environments.
Install Grafana and pull the data from every Zabbix instance into it for visualization; in this method the data comes from the Zabbix API (a sketch of the raw API calls follows below).
For each Zabbix database, build a web socket that pushes the data into a Redux store behind a React UI, so that only changed data reaches the UI.
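For the first approach, the aggregation boils down to calling trigger.get on every Zabbix instance and merging the results (this is essentially what the Grafana Zabbix datasource does for you). A rough sketch of the raw API calls, with placeholder URLs and auth tokens:

# Sketch: pull active triggers from several Zabbix instances via the JSON-RPC
# API and merge them into one list for a single dashboard/UI.
import requests

INSTANCES = {  # placeholder URLs and API auth tokens
    "location-x": ("https://zabbix-x.example.com/api_jsonrpc.php", "TOKEN_X"),
    "location-y": ("https://zabbix-y.example.com/api_jsonrpc.php", "TOKEN_Y"),
}

def get_triggers(url, token):
    payload = {"jsonrpc": "2.0", "method": "trigger.get",
               "params": {"output": ["description", "priority"],
                          "filter": {"value": 1}},  # triggers in problem state
               "auth": token, "id": 1}
    return requests.post(url, json=payload, timeout=10).json().get("result", [])

all_triggers = []
for location, (url, token) in INSTANCES.items():
    for trigger in get_triggers(url, token):
        trigger["location"] = location
        all_triggers.append(trigger)
print(len(all_triggers), "active triggers across all instances")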

Restoring to a known state

The Couchbase CLI comes with the cbbackup and cbrestore commands, which I had hoped would let me back up a database in a known state and then restore it somewhere else where only a newly installed instance exists. Unfortunately, it appears that the target cluster must already have all the right buckets set up and (possibly) that the restore command requires each bucket name to be mentioned explicitly.
This wouldn't pose too much of a problem if I were hand-holding the process, but the goal is to start a new environment in a fully automated fashion, and I'm wondering if someone has a working method of achieving this.
If it were me, I'd use the CLI, REST API or one of the Couchbase SDKs to write something that automates the creation of the target bucket and then does the restore.
REST API:
http://docs.couchbase.com/couchbase-manual-2.5/cb-rest-api/#creating-and-editing-buckets
CLI:
http://docs.couchbase.com/couchbase-manual-2.5/cb-cli/#couchbase-cli-commands
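A rough sketch of that automation, using the 2.x-era REST endpoint from the link above to create the bucket and then shelling out to cbrestore; host, credentials, bucket name and backup path are all placeholders:

# Sketch: create the target bucket over the REST API, then run cbrestore.
import subprocess
import requests

HOST = "http://target-cluster:8091"
AUTH = ("Administrator", "password")

# Create the bucket the restore expects (see the REST API link above).
requests.post(HOST + "/pools/default/buckets", auth=AUTH,
              data={"name": "mybucket", "ramQuotaMB": 256,
                    "authType": "sasl", "saslPassword": ""},
              timeout=30).raise_for_status()

# Restore the backup into the freshly created bucket.
subprocess.run(["cbrestore", "/backups/known-state", HOST,
                "-u", AUTH[0], "-p", AUTH[1], "--bucket-source=mybucket"],
               check=True)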
Another option you might look into is using these same kinds of methods to automate the setup of unidirectional XDCR from the source to the target cluster.

Multiple Databases in neo4j using py2neo

Is it possible to create multiple databases or instances in neo4j, similar to the way one can create multiple databases in mysql? I found the commentary at the link below, but despite the promising title, it did not seem to answer my question. I am running the community version of neo4j, version 1.9.5, on a Mac with the py2neo REST interface.
For additional context, I might want to create one database (or graph instance) for mapping nodes and relationships in a work email/contact list, and a completely separate instance for a personal family tree. I tried adding a filename to the instantiation of the GraphDatabaseService method, like so:
graph_db = neo4j.GraphDatabaseService("http://localhost:7474/db/data/graph.db")
But that did not work. Obviously, I am new to graph databases and neo4j, but I have had some previous experience in the relational database area, primarily mysql. Once again, the Q&A in the link below did not seem to completely answer my question.
Thanks.
Any way to have multiple databases on a neo4j instance?
It is one database per port in Neo4j. You can spin up multiple processes listening on different ports if needed.
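Following that approach with the py2neo API from the question, each Neo4j process simply gets its own service root URL; the second port below is a placeholder for wherever the extra instance is configured to listen:

# Sketch: one GraphDatabaseService per Neo4j process/port (py2neo 1.x API).
from py2neo import neo4j

work_db = neo4j.GraphDatabaseService("http://localhost:7474/db/data/")
family_db = neo4j.GraphDatabaseService("http://localhost:7475/db/data/")

# Nodes created through one service root land only in that instance.
work_db.create({"name": "Alice", "role": "colleague"})
family_db.create({"name": "Bob", "relation": "cousin"})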

Trigger node.js when changes made in apache mysql

I'm building a simple commenting system using Node.js, and I need to integrate it into a PHP project running on an Apache server. I need to trigger Node.js when changes are made to a MySQL database table on that server. Is it possible to do this with an Apache server, and if so, how? Any ideas or suggestions are greatly welcome.
I guess there are a few options you could take, but I don't think you can get a triggered action from within MySQL or Apache themselves. IMHO, these are the approaches you can take:
you can expose an HTTP API from node, and every time you need to notify the node app, you simply insert the data into MySQL using PHP and then issue a GET request to trigger node (see the sketch after this list).
You could use some sort of queuing system (rabbitmq, redis, etc.) to manage the messages to and from the two applications, hence orchestrating the flow of data between the two apps (and later the db).
you could poll the database from node and check for new rows. This is fairly inefficient and quite tricky, but it sounds closest to what you want.
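A minimal sketch of the first option's data flow, shown in Python only to keep this page's examples in one language (in the asker's stack the insert and the GET would come from PHP, and /notify is a hypothetical endpoint the node app would expose):

# Sketch: write the comment to MySQL, then hit the node app's notification
# endpoint so it can react (re-query MySQL, push over websockets, etc.).
import mysql.connector
import requests

conn = mysql.connector.connect(host="localhost", user="app",
                               password="secret", database="comments_db")
cur = conn.cursor()
cur.execute("INSERT INTO comments (post_id, body) VALUES (%s, %s)",
            (123, "Nice article!"))
conn.commit()
conn.close()

requests.get("http://localhost:3000/notify",
             params={"table": "comments"}, timeout=5)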