Currently, I have a requirement to sync data from an Apache Directory LDAP server to an RDBMS (MySQL or PostgreSQL). The directory holds a few million records for now and may grow in the future. The LDAP directory is the primary data source at the moment, but the goal is to have real-time data in both LDAP and the RDBMS, since we plan to use the RDBMS for real-time analytics.
Option 1:
I am thinking of using Spring Cloud Data Flow. A source Spring Boot app reads the LDAP entries that have changed since the last sync run and pushes them to a queue (RabbitMQ for now). The sink would be another Spring Boot app that consumes the data from the queue and persists it into the RDBMS. We would also be able to track and manage the sync jobs using the Spring Cloud Data Flow dashboard.
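Below is a minimal sketch of what the source and sink could look like with Spring Cloud Stream's functional model (the RabbitMQ binder configuration is not shown). The bean names, the PersonRecord DTO, the SyncStateRepository for tracking the last run, and the MySQL upsert statement are all assumptions for illustration; in practice the Supplier and Consumer would live in the two separate Spring Boot apps registered with Spring Cloud Data Flow.

```java
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Supplier;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.ldap.core.AttributesMapper;
import org.springframework.ldap.core.LdapTemplate;
import org.springframework.ldap.query.LdapQueryBuilder;

@Configuration
public class LdapSyncStreams {

    // Source app: poll LDAP for entries modified since the last sync run and
    // emit them to the output binding (RabbitMQ, via the stream binder).
    @Bean
    public Supplier<List<PersonRecord>> ldapSource(LdapTemplate ldapTemplate,
                                                   SyncStateRepository syncState) {
        return () -> {
            String lastRun = syncState.lastRunTimestamp(); // e.g. "20240101000000Z"
            AttributesMapper<PersonRecord> mapper = attrs -> new PersonRecord(
                    (String) attrs.get("uid").get(),
                    (String) attrs.get("cn").get(),
                    (String) attrs.get("mail").get());
            return ldapTemplate.search(
                    LdapQueryBuilder.query()
                            .where("objectclass").is("inetOrgPerson")
                            .and("modifyTimestamp").gte(lastRun),
                    mapper);
        };
    }

    // Sink app: consume records from the input binding and upsert into the RDBMS
    // (MySQL upsert syntax shown).
    @Bean
    public Consumer<PersonRecord> rdbmsSink(JdbcTemplate jdbc) {
        return person -> jdbc.update(
                "INSERT INTO person (uid, cn, mail) VALUES (?, ?, ?) "
                        + "ON DUPLICATE KEY UPDATE cn = VALUES(cn), mail = VALUES(mail)",
                person.uid(), person.cn(), person.mail());
    }

    // Hypothetical DTO and state store, only to keep the sketch self-contained.
    public record PersonRecord(String uid, String cn, String mail) {}
    public interface SyncStateRepository { String lastRunTimestamp(); }
}
```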
Option 2:
Spring's LdapTemplate lets our application talk to the LDAP directory. One approach would be to intercept the LdapTemplate calls wherever applicable and push the data to a queue (RabbitMQ); an intermediate app would then read from the queue and convert the LDAP response into the format required to update the RDBMS.
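If you go this way, one option that avoids touching every call site is a Spring AOP aspect around the LdapTemplate write methods that publishes the affected entry's DN to RabbitMQ; the intermediate app can then re-read the full entry by DN and map it to the RDBMS schema. A rough sketch, where the exchange and routing-key names are assumptions:

```java
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.AfterReturning;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.stereotype.Component;

// Publishes every successful LDAP write made through the LdapTemplate bean to
// RabbitMQ so a downstream app can replay the change into the RDBMS.
// "ldap.changes" and "entry.updated" are hypothetical exchange/routing-key names.
@Aspect
@Component
public class LdapChangePublisher {

    private final RabbitTemplate rabbitTemplate;

    public LdapChangePublisher(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    // Intercept bind/rebind/modifyAttributes/unbind calls. Spring AOP can only
    // advise calls that go through the proxied LdapTemplate bean.
    @AfterReturning("execution(* org.springframework.ldap.core.LdapTemplate.bind(..)) || "
            + "execution(* org.springframework.ldap.core.LdapTemplate.rebind(..)) || "
            + "execution(* org.springframework.ldap.core.LdapTemplate.modifyAttributes(..)) || "
            + "execution(* org.springframework.ldap.core.LdapTemplate.unbind(..))")
    public void publishChange(JoinPoint joinPoint) {
        // For these methods the first argument is the DN (a Name or String) of the entry.
        Object dn = joinPoint.getArgs().length > 0 ? joinPoint.getArgs()[0] : null;
        if (dn != null) {
            rabbitTemplate.convertAndSend("ldap.changes", "entry.updated", dn.toString());
        }
    }
}
```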
I am new to LDAP and Spring Cloud Data Flow. So far I have only come up with these two approaches, given my project's existing technology and system landscape. Any other suggestions or approaches are really appreciated. Thanks in advance.
Another approach, if the LDAP server is Microsoft Active Directory: create a Windows service in C# that connects to your LDAP server, fetches data every day, and sends it to your RDBMS over a socket connection. This is reliable and consistent.
I have an application on SQL Server that sends change data to a target database using SQL Service Broker. I capture the data in a trigger and push it into a Service Broker queue. Now I want to make my application compatible with MySQL, and the problem is how to achieve exactly the same implementation there, because Service Broker is not supported. If I use an external message broker like RabbitMQ, how can a MySQL table trigger communicate directly with RabbitMQ?
Thanks in advance.
If you can use Kafka as an alternative to RabbitMQ, there are some tutorials on how to accomplish a similar goal:
Using Kafka Connect
Using GCP services
Change data capture (CDC) is a term used to classify this pattern of reacting to data changes and then delivering those changes in real-time to a downstream process.
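Once a CDC/Kafka Connect source publishes the change events to a topic, the piece that applies them downstream can be a plain Kafka consumer. A minimal sketch, assuming a hypothetical topic name and that the events arrive as JSON strings:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

// Reads change events published by a CDC source (e.g. a Kafka Connect connector
// capturing the MySQL binlog) and hands them to the downstream target.
// "dbserver.mydb.orders" is a hypothetical topic name.
public class ChangeEventRelay {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "change-relay");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("dbserver.mydb.orders"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Each value is a change event (typically JSON with before/after
                    // state); parse it and apply it to the target database here.
                    System.out.printf("key=%s change=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```

This keeps the MySQL trigger out of the picture entirely: the CDC source reads the binlog, so the table trigger never has to talk to RabbitMQ or Kafka itself.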
I was wondering whether custom metadata for Google Compute Engine VM instances is an appropriate place to store sensitive information for configuring apps that run on the instance.
We use Container-Optimized OS images to run microservices. We configure the containers with environment variables for things like credentials for DB connections and other systems we integrate with.
The VMs are treated as ephemeral for each CD deployment, and the best I have come up with so far is to create an instance template whose custom metadata is loaded with config values from a file I keep on my local machine; that metadata is then made available to a systemd unit when the VM starts up (via cloud-config).
In essence, the environment variable values (some containing credentials) are uploaded by me (they don't change very much) and are then pulled from the VM instance metadata server when a new VM is fired up. So I'm just wondering if there are any significant security concerns with this approach...
Many thanks for your help
According to the Compute Engine documentation:
Is metadata information secure?
When you make a request to get information from the metadata server, your request and the subsequent metadata response never leave the physical host running the virtual machine instance.
Since the request and response never leave the physical host, you will not be able to access the metadata from another VM or from outside Google Cloud Platform. However, any user with access to the VM will be able to query the metadata server and retrieve the information.
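To illustrate that last point: any process on the instance can read a custom metadata value with a plain HTTP request to the metadata server, no credentials required. A small sketch ("db-password" is a hypothetical custom metadata key):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Reads a custom metadata value from the GCE metadata server. Only the
// Metadata-Flavor header is required; no authentication is involved.
public class MetadataReader {

    public static void main(String[] args) throws Exception {
        // "db-password" is a hypothetical custom metadata key set on the instance.
        HttpRequest request = HttpRequest.newBuilder(URI.create(
                "http://metadata.google.internal/computeMetadata/v1/instance/attributes/db-password"))
                .header("Metadata-Flavor", "Google")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.body()); // the plain-text metadata value
    }
}
```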
Based on the information you provided, storing credentials for a test or staging environment in this manner would be acceptable. However, if this is a production system holding customer data or information important to the business, I would keep the credentials in a secure store that tracks access. The data in the metadata server is not encrypted, and accesses are not logged.
We have a legacy system that talks to an on-premise MySQL database. We are building a new system hosted in Azure, and the database it uses to store data is Azure Storage. Since the legacy application will remain functional for some time, we would like to sync the data from MySQL to Azure Table Storage. Is there any tool that can help with synchronizing the data from MySQL to Azure Table Storage?
You could use the Data-in Replication feature for synchronization of data between MySQL and an Azure database.
Please refer to this document for more information.
Hope it helps.
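If a built-in replication feature does not cover Azure Table Storage specifically, a small custom sync job is another option. A rough sketch using JDBC and the azure-data-tables client, where the table name, column names, and the updated_at watermark column are assumptions and the target table is assumed to already exist:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import com.azure.data.tables.TableClient;
import com.azure.data.tables.TableClientBuilder;
import com.azure.data.tables.models.TableEntity;

// Copies rows changed since the last run from the on-premise MySQL database
// into an Azure Table Storage table.
public class MySqlToTableStorageSync {

    public static void main(String[] args) throws Exception {
        TableClient table = new TableClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .tableName("customers")
                .buildClient();

        // Watermark of the previous run, passed in by the scheduler (assumption).
        String lastRun = args.length > 0 ? args[0] : "1970-01-01 00:00:00";

        try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://onprem-host:3306/legacy",
                     "sync_user", System.getenv("MYSQL_PASSWORD"));
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT id, name, email FROM customers WHERE updated_at > ?")) {

            ps.setString(1, lastRun);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // Partition/row key choice is up to you; here a single partition
                    // with the primary key as the row key.
                    TableEntity entity = new TableEntity("customers", rs.getString("id"))
                            .addProperty("name", rs.getString("name"))
                            .addProperty("email", rs.getString("email"));
                    table.upsertEntity(entity);
                }
            }
        }
    }
}
```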
I'm working with an API app on Azure, deploying an API written in Node.js that stores data in MongoDB. The steps were:
Make a new API for web & mobile apps from Azure portal.
Choose MongoDB by adding the MongoLab module from Azure.
Create a table (collection) and populate it with a few entries.
Prepare our Git repository from the Azure portal and link it to our local Git on our computers.
Decide which NodeJs modules to use to set the dependencies.
Edit the configuration file for NodeJs.
Make the main API file with the following functionalities:
Connection to the database.
Running the service.
Making CRUD operation services (CREATE, READ, DELETE, ...).
Testing our API on a browser.
Making an application using our API.
My question: how can I use these steps to store data in a MySQL database (Azure)?
I cannot fully understand your requirement: do you want to use a MySQL database to store the data of your Node.js API application? If not, please clarify your purpose.
To connect to MySQL in Node.js, you can use third-party MySQL handler modules.
For example:
node-mysql - a pure Node.js JavaScript client implementing the MySQL protocol
or
Sequelize - A promise-based ORM for Node.js and io.js
If you have any further concerns, please feel free to let me know.
I have a small problem. I am working on a project where I need to synchronize databases so that the same data is maintained at different locations. Each location has its own database, and the structure of all the databases, tables, stored procedures, etc. is the same; they are copies of a single database, but data is entered from the different locations, and I need it synchronized so that we can see the data from anywhere. Will creating a Windows service work?
C# code would be a nice help.
You can do this in two ways.
1st way: using a combination of services
i) A WCF service with a Windows service, or a web service with a Windows service.
You will need a Windows service installed on every client machine; it checks for new data in the local DB and sends that data to a web or WCF service installed on the server machine for the LAN.
But if the client connects over a wider network, the web service or WCF service should be hosted at a static IP address where the server database is installed.
And if the client is offline in some cases, you should use MSMQ, RabbitMQ, or any other queuing mechanism to handle that.
2nd way: using replication in SQL Server 2008. Go through these links:
http://www.codeproject.com/Articles/215093/Replication-in-MS-SQL-Server
http://www.informit.com/guides/content.aspx?g=sqlserver&seqNum=313 (database synchronization)
Replication is what you need, I guess; let SQL Server manage it where you don't have to write any code, and it is easy to configure.
Have a look at http://www.codeproject.com/Articles/215093/Replication-in-MS-SQL-Server, or search for SQL replication; there is plenty of material available.