How to get data from an external database in Zabbix? - zabbix

Can I get data from an external database through Zabbix?
I have an external database for my service, and I need Zabbix to read some data from it.
Is this possible?

Yes, it is possible to read a database using Zabbix. See Out-of-the-box database monitoring.
Another option is to use zbxdb, which I wrote and still maintain. To be successful you need some knowledge of Zabbix, of the database you want to query, and, when using ODBC, of how to configure ODBC. zbxdb uses native drivers.
Don't forget that Zabbix is a monitoring tool, so your queries should normally deliver key/value pairs and not just rows.
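To make the key/value point concrete, here is a minimal sketch of a query shaped for monitoring, with an in-memory sqlite3 database standing in for the external database and made-up table, host, and metric names; the output lines follow the `<host> <key> <value>` shape that zabbix_sender accepts.

```python
import sqlite3

# In-memory sqlite3 stands in for the external service database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT);
    INSERT INTO orders (status) VALUES ('open'), ('open'), ('failed');
""")

# A monitoring query should return key/value pairs -- one metric name
# per value -- not arbitrary result rows.
rows = conn.execute("""
    SELECT 'orders.' || status AS key, COUNT(*) AS value
    FROM orders GROUP BY status
""").fetchall()

# One "<host> <key> <value>" line per metric, as zabbix_sender expects.
lines = [f"myhost {key} {value}" for key, value in rows]
print("\n".join(lines))
```

The same query against the real database would feed each row into a Zabbix item keyed by the first column.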

Related

How to query different MySQL databases with Prometheus for business monitoring

Is it possible to monitor business data with Prometheus when that data is stored in different MySQL databases?
I want to create a dashboard with an alerting system in Grafana Cloud and get the data from Prometheus. To connect Prometheus to MySQL I need a sql_exporter.
But where would I define the queries for my business monitoring?
Does this approach make any sense? If not, what else could I do?
Thanks
Yes, you could use sql-agent, prometheus-sql, Prometheus, and Grafana.
With sql-agent you can take the queries you want to monitor, formatted as JSON, and execute them on the database backend. From there you can use prometheus-sql to query your SQL database and expose the results as metrics to Prometheus. In Prometheus you can define alerts for your queries; Grafana is for visualization.
If your databases are on different servers, install prometheus-sql on each of them and target them from Prometheus.
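Whichever exporter you pick, its job is ultimately to turn each SQL result row into Prometheus's text exposition format. A minimal sketch of that translation, with sqlite3 standing in for the business MySQL database and a made-up metric and table name:

```python
import sqlite3

# sqlite3 stands in for the business MySQL database.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE signups (id INTEGER PRIMARY KEY, plan TEXT);
    INSERT INTO signups (plan) VALUES ('free'), ('free'), ('pro');
""")

def collect() -> str:
    """Run a business query and render Prometheus text exposition format."""
    out = ["# HELP app_signups_total Signups by plan.",
           "# TYPE app_signups_total gauge"]
    for plan, count in db.execute(
            "SELECT plan, COUNT(*) FROM signups GROUP BY plan"):
        out.append(f'app_signups_total{{plan="{plan}"}} {count}')
    return "\n".join(out)

metrics = collect()
print(metrics)
```

In prometheus-sql or sql_exporter you declare the query in a config file instead of writing this loop yourself, but the metric output scraped by Prometheus has this shape.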

Postgres to Mysql - Transfer data from one database to another every day

I have a regular Rails application that uses a Postgres database, but I have the following requirement: every day I have to transfer data from all tables of this Postgres database to the customer's MySQL database.
There's no API available, so I have to connect to the customer's database and perform create/update queries for the new/updated rows. They will allow my IP for these operations.
What would be the best way to achieve that? I thought of some options:
1) Schedule a job on my Rails application to perform that operation (the con here: this logic is specific to one customer, so I don't like the idea of having it in the main codebase)
2) Deploy a lightweight application (maybe Node/Express) that reads from one database and writes to the other (the con here: I'll have to maintain another server to keep this running)
Are there any other options that I am not considering?
You could use a foreign data wrapper to connect to the MySQL database from your PostgreSQL database. That would allow you to read and write the customer's database with very little code to write or maintain.
It looks like there is a well-maintained wrapper for MySQL.
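If the foreign data wrapper isn't viable and you do end up with a scheduled script (option 1 or 2 in the question), the core of it is a read-then-upsert loop. A minimal sketch with two in-memory sqlite3 databases standing in for the Postgres source and MySQL target; the `users` table and its columns are hypothetical, and sqlite's `INSERT OR REPLACE` stands in for MySQL's `INSERT ... ON DUPLICATE KEY UPDATE`:

```python
import sqlite3

# Two in-memory databases stand in for the Postgres source and MySQL target.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT);
    INSERT INTO users VALUES (1, 'a@example.com'), (2, 'b@example.com');
""")
target.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
target.execute("INSERT INTO users VALUES (1, 'old@example.com')")

def sync_table(src, dst, table, columns):
    """Copy all rows from src to dst: insert new rows, update existing ones."""
    cols = ", ".join(columns)
    placeholders = ", ".join("?" for _ in columns)
    rows = src.execute(f"SELECT {cols} FROM {table}").fetchall()
    # Upsert keyed on the primary key; on MySQL this would be
    # INSERT ... ON DUPLICATE KEY UPDATE.
    dst.executemany(
        f"INSERT OR REPLACE INTO {table} ({cols}) VALUES ({placeholders})",
        rows)
    dst.commit()
    return len(rows)

synced = sync_table(source, target, "users", ["id", "email"])
print(f"synced {synced} rows")
```

Run nightly from cron (or a Rails scheduler), this is essentially option 1; the customer-specific part stays confined to one small module.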

What's the most efficient way to transfer data from one AWS RDS instance to another

I am working for a client who uses multiple RDS (MySQL) instances on AWS and wants me to consolidate data from there and other sources into a single instance and do reporting off that.
What would be the most efficient way to transfer selective data from other AWS RDS MySQL instances to mine?
I don't want to migrate the entire DB, rather just a few columns and rows based on which have relevant data and what was last created/updated.
One option would be to use a PHP script that reads from one DB and inserts into the other, but that would be very inefficient. Unlike SQL Server or Oracle, MySQL also does not have the ability to run queries across servers; otherwise I'd have just used that in a stored procedure.
I'd appreciate any inputs regarding this.
If your overall objective is reporting and analytics, the standard practice is to move your transactional data from RDS to Redshift, which then becomes your data warehouse. This blog article by AWS describes one approach.
For the consolidation itself, you can use AWS Database Migration Service (DMS), which lets you migrate data column-wise with the following options:
Migrate existing data
Migrate existing data & replicate ongoing changes
Replicate data changes only
For more details read this whitepaper.
Note: If you need to process the data while moving, use AWS Data Pipeline.
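If DMS is more than you need and you do write the selective copy yourself, the usual trick for "only what was last created/updated" is a watermark on an `updated_at` column: copy rows newer than the last run's high-water mark, then advance it. A sketch with sqlite3 standing in for the RDS MySQL instances (the `orders` table, its columns, and the timestamps are all made up):

```python
import sqlite3

# In-memory databases stand in for a source RDS instance and the
# consolidated reporting instance.
source = sqlite3.connect(":memory:")
reporting = sqlite3.connect(":memory:")

source.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL, updated_at TEXT);
    INSERT INTO orders VALUES
        (1, 10.0, '2024-01-01T00:00:00'),
        (2, 25.0, '2024-01-02T00:00:00'),
        (3, 40.0, '2024-01-03T00:00:00');
""")
reporting.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL, updated_at TEXT)")

def incremental_copy(src, dst, watermark):
    """Copy only the reported-on columns, for rows changed since `watermark`."""
    rows = src.execute(
        "SELECT id, total, updated_at FROM orders WHERE updated_at > ?",
        (watermark,)).fetchall()
    dst.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    dst.commit()
    # The newest updated_at we saw becomes the next run's watermark.
    return max((r[2] for r in rows), default=watermark)

next_watermark = incremental_copy(source, reporting, '2024-01-01T12:00:00')
print(next_watermark)
```

Persist the returned watermark between runs (a one-row state table is enough) so each run only touches rows changed since the previous one.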
Did you take a look at the RDS migration tool?

Join tables from multiple remote database

I have a MySQL database on server A, a Postgres database on server B, and another MySQL database on server C. I need a way to join tables from the three servers to get a combined result. Is there a way to do this in Ruby? If not Ruby, any other language will also suffice.
I need to join somewhere around a few thousand rows of data. The joined data needs to be pushed to Elasticsearch; I was going to use the _bulk API in Elasticsearch to push it.
If it's a one-off, just download the data from two of the DBs and upload it to the third. Do your work there.
If it's a regular thing, it might be worthwhile putting in the effort to link the databases properly. PostgreSQL offers Foreign Data Wrapper plugins that let it talk to a different database (PostgreSQL or others). One is mysql_fdw.
You define entries for each remote server, user mappings between local and remote users, and then describe each table to access. After that you can treat them as local tables (although performance will generally be much worse, of course).
In theory you could write your own fdw plugin to link to elasticsearch too but there doesn't seem to be one currently available.
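At a few thousand rows, a pragmatic alternative is to pull from each server, join in application code, and build the newline-delimited _bulk payload yourself. A sketch with two in-memory sqlite3 databases standing in for the remote servers; the index name, tables, and fields are made up:

```python
import json
import sqlite3

# In-memory databases stand in for the remote MySQL and Postgres servers.
db_a = sqlite3.connect(":memory:")
db_b = sqlite3.connect(":memory:")
db_a.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO users VALUES (1, 'alice'), (2, 'bob');
""")
db_b.executescript("""
    CREATE TABLE orders (user_id INTEGER, total REAL);
    INSERT INTO orders VALUES (1, 9.5), (2, 20.0);
""")

# Join in application code: index one side by key, then merge.
users = dict(db_a.execute("SELECT id, name FROM users"))
joined = [{"user_id": uid, "name": users[uid], "total": total}
          for uid, total in db_b.execute("SELECT user_id, total FROM orders")]

# Elasticsearch _bulk format: an action line then the document, one JSON
# object per line, with a trailing newline at the end of the payload.
lines = []
for doc in joined:
    lines.append(json.dumps({"index": {"_index": "user-orders"}}))
    lines.append(json.dumps(doc))
payload = "\n".join(lines) + "\n"
print(payload)
```

POST the payload to `/_bulk` with `Content-Type: application/x-ndjson`. The same approach works in Ruby; only the database drivers and JSON calls change.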
You can do it with LINQ and Entity Framework in Microsoft .NET.

MySQL distributed database with mysql access to each node

I have a task to implement a particular database structure:
Multiple MySQL servers with data using the same schema. Each server can see and edit only its own part of the data.
And
One master server with its own data that can run queries using data from all the previously mentioned servers, but cannot edit them.
An example would be multiple hospitals, each with data on its own patients, and a master server that can use the combined data from all hospitals.
The previous system was written using MySQL Cluster, so naturally I tried that first. I can create a MySQL Cluster with multiple nodes and maybe even partition the data so that a particular node holds a particular set of data, but as far as I know I can't connect to a single node with MySQL, because it is already part of the cluster.
Can this be done with MySQL Cluster? Is there another framework that can do it easily?
You could try Galera Cluster (http://galeracluster.com/). You can perform updates on any node and every server holds all the data, so it doesn't match your restrictions exactly, but it might still meet your requirements.