How can I automatically sync updated data from a local database into a cloud database (like AWS RDS) whenever a Wi-Fi network is available? Is there any intermediate tool I need to use?
Related
Is it possible to run a local DB with a cloud backup, where if the local DB fails the system fails over to the cloud, new data is stored in the cloud until the failure is resolved, and once resolved the new data is copied back to the local DB and the system resumes using it? Any solution is acceptable (e.g. AWS or Azure), but how do I set it up so that the DB login address and configuration within the application stay the same?
I searched a lot, and in my experience there isn't any tool or Azure service that can do that.
I think it's impossible for now.
I was wondering if custom metadata for Google Compute Engine VM instances is an appropriate place to store sensitive information for configuring apps that run on the instance.
So we use container-optimised OS images to run microservices. We configure the containers with environment variables for things like creds for db connections and other systems we integrate with.
The VMs are treated as ephemeral for each CD deployment, and the best I have come up with so far is to create an instance template whose custom metadata is populated from a config file I keep on my local machine; that metadata is then made available to a systemd unit when the VM starts up (via cloud-config).
In essence, the environment variable values (some containing creds, and which don't change very much) are uploaded by me and are then pulled from the VM instance metadata server when a new VM is fired up. So I'm just wondering if there are any significant security concerns with this approach...
Many thanks for your help
According to the Compute Engine documentation:
"Is metadata information secure? When you make a request to get information from the metadata server, your request and the subsequent metadata response never leaves the physical host running the virtual machine instance."
Since the request and response never leave the physical host, you will not be able to access the metadata from another VM or from outside Google Cloud Platform. However, any user with access to the VM will be able to query the metadata server and retrieve the information.
Based on the information you provided, storing credentials for a test or staging environment in this manner would be acceptable. However, if this is a production system holding customer data or other information important to the business, I would keep the credentials in a secure store that tracks access. The data in the metadata server is not encrypted, and accesses are not logged.
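To illustrate the point about access from inside the VM: any process or user on the instance can read a custom metadata value with a single unauthenticated HTTP call to the metadata server. A minimal Java sketch (the attribute name db-password is just an illustrative example):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Sketch: read a custom metadata attribute from inside the VM.
    public class MetadataReader {
        public static void main(String[] args) throws Exception {
            String url = "http://metadata.google.internal/computeMetadata/v1/instance/attributes/db-password";
            HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                    .header("Metadata-Flavor", "Google")   // required by the metadata server
                    .GET()
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body());           // prints the stored credential in clear text
        }
    }

No credentials are needed for this call, which is exactly why a secure store with access logging is the safer option for production secrets.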
We have a legacy system which talks to an on-premise MySQL database. We are building a new system which is hosted in Azure, and it stores its data in Azure Table Storage. Since the legacy application will remain functional for some time, we would like to sync the data from MySQL to Azure Table Storage. Is there any tool that can help with this synchronization?
You could use the Data-in Replication feature to synchronize data between MySQL and an Azure database.
Please refer to this document for more information.
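For reference, with Data-in Replication the on-premise MySQL server acts as the source and the Azure Database for MySQL server is configured as the replica via stored procedures. The sketch below shows the general shape using JDBC; the procedure names and parameters should be verified against the linked document, and all hosts, credentials, and binlog coordinates are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Rough sketch: configure Data-in Replication on an Azure Database for MySQL
    // server so it replicates from the on-premise MySQL source.
    public class ConfigureReplication {
        public static void main(String[] args) throws Exception {
            try (Connection azure = DriverManager.getConnection(
                    "jdbc:mysql://myserver.mysql.database.azure.com:3306/",
                    "adminuser@myserver", "admin-password");
                 Statement stmt = azure.createStatement()) {

                // Point the Azure replica at the on-premise source (binlog file and
                // position come from SHOW MASTER STATUS on the source server).
                stmt.execute("CALL mysql.az_replication_change_master("
                        + "'onprem.example.com', 'repl_user', 'repl_password', 3306, "
                        + "'mysql-bin.000001', 154, '')");

                // Start replication.
                stmt.execute("CALL mysql.az_replication_start");
            }
        }
    }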
Hope it helps.
Currently, I have a requirement to sync data from an Apache Directory LDAP server to an RDBMS (MySQL or PostgreSQL). The directory holds approximately a few million records for now and may grow in the future. The LDAP directory is the primary data source for now, but the goal is to have real-time data in both LDAP and the RDBMS, since we plan to use the RDBMS for real-time analytics.
Option 1:
Use Spring Cloud Data Flow. A source Spring Boot app reads the LDAP data that has changed since the last sync run and pushes it to a queue (RabbitMQ for now). The sink would be another Spring Boot app that consumes data from the queue and persists it into the RDBMS. We would be able to track and manage the sync jobs using the Spring Cloud Data Flow dashboard.
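To make the sink side of Option 1 concrete, a minimal Spring Cloud Stream consumer might look like the sketch below. It assumes a simple LdapChange payload and a JdbcTemplate, uses a MySQL-style upsert, and omits the stream binding configuration; all names are illustrative.

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jdbc.core.JdbcTemplate;

    import java.util.function.Consumer;

    // Sketch of a Spring Cloud Stream sink: consumes LDAP change events from the
    // queue and upserts them into the RDBMS. Table and column names are illustrative.
    @Configuration
    public class LdapSinkConfig {

        // Hypothetical payload produced by the source app.
        public record LdapChange(String dn, String uid, String mail) {}

        @Bean
        public Consumer<LdapChange> persistLdapChange(JdbcTemplate jdbc) {
            return change -> jdbc.update(
                    "INSERT INTO ldap_users (dn, uid, mail) VALUES (?, ?, ?) "
                  + "ON DUPLICATE KEY UPDATE uid = VALUES(uid), mail = VALUES(mail)",
                    change.dn(), change.uid(), change.mail());
        }
    }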
Option 2:
Spring's LdapTemplate lets us talk to the LDAP directory in our application. One approach would be to intercept the LdapTemplate calls wherever applicable and push the data to a queue; an intermediate app then reads the data from the queue (RabbitMQ), converts the LDAP response to the required format, and writes it into the RDBMS.
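Roughly, the interception in Option 2 could be a thin wrapper around LdapTemplate that publishes to RabbitMQ after each write. A minimal sketch, assuming Spring LDAP and Spring AMQP, with the exchange, routing key, and method names purely illustrative:

    import org.springframework.amqp.rabbit.core.RabbitTemplate;
    import org.springframework.ldap.core.LdapTemplate;
    import org.springframework.stereotype.Component;

    import javax.naming.Name;
    import javax.naming.directory.Attributes;

    // Sketch: write to LDAP as usual, then publish the same change to a queue so an
    // intermediate app can persist it into the RDBMS.
    @Component
    public class SyncingLdapWriter {

        private final LdapTemplate ldapTemplate;
        private final RabbitTemplate rabbitTemplate;

        public SyncingLdapWriter(LdapTemplate ldapTemplate, RabbitTemplate rabbitTemplate) {
            this.ldapTemplate = ldapTemplate;
            this.rabbitTemplate = rabbitTemplate;
        }

        public void createEntry(Name dn, Attributes attrs) {
            ldapTemplate.bind(dn, null, attrs);                          // normal LDAP write
            rabbitTemplate.convertAndSend("ldap.sync", "entry.created",  // publish change event
                    dn.toString());
        }
    }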
I am new to LDAP and Spring Cloud Data Flow. So far I have come up with only these two approaches, considering my project's existing technology and system landscape. Any other suggestions or approaches are really appreciated. Thanks in advance.
Another approach, if the LDAP server is Microsoft Active Directory: create a Windows service in C# which connects to your LDAP server, fetches the data every day, and sends it to your RDBMS over a socket connection. This is reliable and consistent.
BACKGROUND-
I am planning to make a website that will accept data from users and store it in a database (MySQL). The website would be served from Google Cloud servers. I have installed MAMP on my Mac for web development.
PROBLEM-
Google Cloud services also provide Cloud SQL. Now I have a few doubts:
1) Once I finish designing my website on MAMP and want to deploy it on cloud servers, I would have the database settings of my local machine. Does this mean that before putting it on the cloud, and in order to use Cloud SQL as the database, I would have to change the back-end code that specifies the database settings? If yes, how tedious is it to do so (changing the database from MySQL in the testing environment to Cloud SQL in the deployment environment)?
2) Also, is there a way to use the cloud and not use Cloud SQL?
3) What other database combinations can be chosen to deploy a website on the cloud?
Usually, changing the database needs a lot of effort (testing and some config changes), as every database provides additional features which don't work directly on another database.
You can use the cloud without using Cloud SQL (Cloud SQL is just one part of it).
But Cloud SQL is MySQL-only, as per the information given by Google at the link below:
https://cloud.google.com/products/
So, it should not be a big deal for you to migrate the project to the cloud from your local system. You only have to configure the connection details (it will not simply be localhost).
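To illustrate how small that change usually is: only the connection details differ between the local MAMP setup and Cloud SQL. The sketch below uses Java/JDBC purely as an illustration (a MAMP stack would typically be PHP, but the shape of the change is the same); the host, database name, and credentials are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;

    // Sketch: the only difference between local and Cloud SQL is the connection details.
    public class DbConnect {
        public static void main(String[] args) throws Exception {
            // Local development (MAMP's bundled MySQL, default port 8889):
            // String url = "jdbc:mysql://localhost:8889/mywebsite";

            // Cloud SQL: connect to the instance's public IP (or use the Cloud SQL
            // connector / Unix socket options described in the Cloud SQL docs).
            String url = "jdbc:mysql://203.0.113.5:3306/mywebsite"; // placeholder IP

            try (Connection conn = DriverManager.getConnection(url, "appuser", "app-password")) {
                System.out.println("Connected to: " + conn.getCatalog());
            }
        }
    }

The application code that runs queries stays the same; only the URL and credentials change between environments.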