Deleting records in a database through the Kafka JDBC connector - MySQL

I am trying to connect Kafka Connect to a database using the Confluent Kafka JDBC connector.
I can insert and update records from one database to another, but when I delete records, the deletions are not reflected in the target database; the deleted rows remain in the target.

You won't be able to until this pull request gets merged:
https://github.com/confluentinc/kafka-connect-jdbc/pull/282
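In connector versions where that change has landed, the JDBC sink can propagate deletes from tombstone records via `delete.enabled`. A sketch of such a sink configuration follows; the connector name, connection URL, topic, and key field are illustrative, not taken from the question:

```json
{
  "name": "jdbc-sink-mysql",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:mysql://target-host:3306/targetdb",
    "topics": "my-table-topic",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "pk.fields": "id",
    "delete.enabled": "true"
  }
}
```

Note that `delete.enabled=true` requires `pk.mode=record_key`, and the source must actually emit tombstone (null-value) records for deleted rows.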

Related

Updating Debezium MySQL connector to resume with database moved to RDS

I have moved my MySQL database to RDS.
I would like to update the connector that publishes messages from an outbox table to a specific Kafka topic to point to the RDS instance.
What steps would I have to take so that the messages are not resent and the connector just continues from the point it had reached on the previous instance?
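One caveat when re-pointing the connector: the stored offsets reference binlog coordinates from the old server, which will not exist on the RDS instance. A common approach is to start a connector against RDS with a schema-only snapshot so existing outbox rows are not re-published, accepting that changes made during the migration window may need separate handling. A sketch of such a config, with all hostnames and names illustrative (and noting that valid `snapshot.mode` values vary by Debezium version):

```json
{
  "name": "outbox-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "my-db.abc123.us-east-1.rds.amazonaws.com",
    "database.port": "3306",
    "database.user": "debezium",
    "database.password": "****",
    "database.server.name": "outbox-server",
    "table.include.list": "mydb.outbox",
    "snapshot.mode": "schema_only"
  }
}
```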

Insert data from one MySQL to another MySQL instance

I have two servers running MySQL, and they have the same database/table but with different entries. There was a reason for that, but now I need to merge these databases. I've made a dump of the whole database on server 1 and I want to insert that data into the database on server 2. The problem is: how do I insert this data so that it only adds to the database on server 2? I can't lose the data that is already on server 2.
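One way to do this is to replay the dump with inserts that skip rows whose primary key already exists on server 2. The sketch below demonstrates the idea with SQLite's `INSERT OR IGNORE`, which behaves like MySQL's `INSERT IGNORE` for primary-key conflicts; the table and data are made up for illustration:

```python
import sqlite3

# Two in-memory databases standing in for server 1 and server 2.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

src.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])
dst.executemany("INSERT INTO users VALUES (?, ?)", [(2, "bob"), (3, "carol")])

# "Dump" server 1 and replay it on server 2, skipping rows whose
# primary key already exists, so server 2's existing data is preserved.
rows = src.execute("SELECT id, name FROM users").fetchall()
dst.executemany("INSERT OR IGNORE INTO users VALUES (?, ?)", rows)

merged = dst.execute("SELECT id, name FROM users ORDER BY id").fetchall()
print(merged)  # [(1, 'alice'), (2, 'bob'), (3, 'carol')]
```

With an actual MySQL dump, `mysqldump --no-create-info --insert-ignore` generates `INSERT IGNORE` statements (and skips `CREATE TABLE`), so the dump from server 1 can be replayed directly on server 2 without clobbering it. This assumes the two servers agree on primary keys; rows with the same key but different contents will keep server 2's version.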

Real-time update of data (CDC approach) from MySQL to HDFS or a Hive table

I have installed CDH 5.16 on a RHEL 7 server and installed Kafka separately.
I am trying to load data from MySQL into HDFS or a Hive table in real time (a CDC approach). That is, if some data is updated or added in a MySQL table, it should be immediately reflected in HDFS or the Hive table.
The approach I have come up with:
use Kafka Connect to connect to the MySQL server and push table data to a Kafka topic,
and write consumer code in Spark Streaming which reads the data from the topic
and stores it in HDFS.
One problem with this approach is that the Hive table on top of these files has to
be refreshed periodically for the updates to be reflected.
I also came to know of the Kafka-Hive integration in HDP 3.1. Unfortunately I am using Hadoop 2.6.0, so I can't leverage this feature.
Is there any other, better way to achieve this?
I am using Hadoop 2.6.0 and CDH 5.16.1.
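For the first step of the approach above (pushing MySQL table changes to a Kafka topic), a JDBC source connector in `timestamp+incrementing` mode is a common choice. A sketch of such a config; the connection details, column names, and topic prefix are illustrative, and the table must have a monotonically increasing ID column and an update-timestamp column for this mode to capture inserts and updates (it will not capture deletes):

```json
{
  "name": "mysql-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://mysql-host:3306/mydb",
    "connection.user": "connect",
    "connection.password": "****",
    "mode": "timestamp+incrementing",
    "incrementing.column.name": "id",
    "timestamp.column.name": "updated_at",
    "table.whitelist": "my_table",
    "topic.prefix": "mysql-"
  }
}
```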

Database sync between SQL Server and MySQL

I have two different applications which run on different databases, SQL Server and MySQL. Now I want real-time data sync between SQL Server and MySQL; can anybody please advise? I have already tried data migration, but it only copies data once rather than keeping it in sync.
Take a look at StreamSets. It is open source and can read the SQL Server change tracking (CT) tables and turn them into inserts, updates, and deletes against any database with a JDBC driver, including MySQL. It can also use the MySQL binlog to build data pipelines in the other direction.

MSSQL to MySQL one-way data syncing

I have one MySQL server.
On that server there are 15 databases, and I need to sync data from MSSQL to MySQL. The condition is that data is transmitted only from MSSQL to MySQL, on every update. Please guide me on how to perform this task and how to set up a cron job.
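For the cron part: if you go with a batch approach (a script that extracts changed rows from MSSQL and applies them to MySQL), the schedule is a single crontab entry. The script path and interval below are hypothetical; a CDC pipeline such as Kafka Connect or StreamSets runs continuously and would not need cron at all:

```
# Run the (hypothetical) sync script every 5 minutes, logging output
*/5 * * * * /opt/sync/mssql_to_mysql.sh >> /var/log/mssql_sync.log 2>&1
```

Add the line with `crontab -e` under a user that has permission to run the script and reach both database servers.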