I want to migrate from MySQL to PostgreSQL to get better control over my data, but I have run into several problems:
I have two applications that can only run on MySQL, and a new application that I want to migrate to PostgreSQL.
I need to redesign several parts of the old database (MySQL) and apply the new design in the new database (PostgreSQL).
If I insert a new row into certain tables of the old database, that row should be synchronized to the new database under the new design.
To do this, I've been thinking of using a foreign data wrapper and synchronizing the data with triggers (I only need to synchronize in one direction, MySQL to PostgreSQL).
Do you think that's a good choice for solving this problem?
Any ideas/advice?
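For reference, the foreign-data-wrapper half of that idea might look roughly like this on the PostgreSQL side, assuming the mysql_fdw extension is installed; the server, table, and column names here are only placeholders:

    -- Point PostgreSQL at the old MySQL server.
    CREATE EXTENSION mysql_fdw;

    CREATE SERVER legacy_mysql
        FOREIGN DATA WRAPPER mysql_fdw
        OPTIONS (host '127.0.0.1', port '3306');

    CREATE USER MAPPING FOR postgres
        SERVER legacy_mysql
        OPTIONS (username 'app_user', password 'secret');

    -- Expose one of the old tables as a foreign table...
    CREATE FOREIGN TABLE legacy_orders (
        id       integer,
        customer text,
        total    numeric
    )
    SERVER legacy_mysql
    OPTIONS (dbname 'old_db', table_name 'orders');

    -- ...and pull any rows not yet copied into the redesigned local table.
    INSERT INTO new_orders (id, customer_name, total_amount)
    SELECT id, customer, total
    FROM   legacy_orders
    WHERE  id > (SELECT COALESCE(MAX(id), 0) FROM new_orders);

Note that a plain MySQL trigger cannot write into PostgreSQL directly, so with this setup the copy step would typically be run on a schedule from the PostgreSQL side.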
My task is to move a particular set of data from one database to another; both are MySQL but run on different instances. In the old database I used mappings (one-to-one and one-to-many). Is it possible to move the data in this scenario?
How do I move a particular set of data from one database to another in MySQL?
Have you tried exporting the data from the old database and importing it into the new one? As a visual tool, you can use MySQL Workbench for export/import.
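MySQL Workbench (or mysqldump) handles whole tables well; if you only need a particular subset of rows, plain SQL with SELECT ... INTO OUTFILE and LOAD DATA INFILE is one way to do it. A minimal sketch, assuming you can copy the file between the two servers; the table, filter, and path are only examples:

    -- On the old instance: write the rows you care about to a file.
    SELECT id, customer_id, total
      FROM orders
     WHERE created_at >= '2024-01-01'
      INTO OUTFILE '/tmp/orders_subset.csv'
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
     LINES TERMINATED BY '\n';

    -- Copy the file to the new instance, then load it there.
    LOAD DATA INFILE '/tmp/orders_subset.csv'
    INTO TABLE orders
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
     LINES TERMINATED BY '\n';

(Both statements are subject to the server's secure_file_priv setting, so the file paths may need to live under that directory.)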
I want to pull data from two different MySQL databases into a new MySQL database.
The two databases contain a lot of irrelevant data, and I want to create what is essentially a data warehouse in which only the relevant data from both databases is present.
Right now all data is sent to the two old databases, but I would like scheduled updates so that the new database stays up to date. There is a shared key between the two databases, so ideally all of the data would end up in a single table, though this is not crucial.
I have done similar work with Logstash and Elasticsearch, but I do not know how to do it with MySQL.
The best way to do that is to create an ETL process with Pentaho Data Integration or any other ETL tool. Your sources will be the two databases; in the transformation step you can apply whatever business logic you need, then load the data into the new database.
If you build this ETL you can schedule it to run once a day so that your new database stays up to date.
If you want to do this without an ETL tool, the databases must be on the same host. Then you can simply prefix the table name with the database name in your query, e.g. SELECT * FROM database.table_name, as sketched below.
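A rough sketch of that same-host approach, assuming both source databases are reachable from one MySQL server and share a customer key; db_a, db_b, warehouse and all column names are placeholders:

    -- One-off load: join the two source databases on the shared key
    -- and copy the relevant columns into the warehouse table.
    INSERT INTO warehouse.customer_orders (customer_id, customer_name, order_total)
    SELECT a.customer_id, a.name, b.total
    FROM   db_a.customers a
    JOIN   db_b.orders    b ON b.customer_id = a.customer_id;

    -- For the daily refresh you could wrap the same statement in a MySQL event
    -- (requires the event scheduler to be enabled, and customer_id to be the
    -- warehouse table's primary key so REPLACE overwrites stale rows).
    CREATE EVENT refresh_customer_orders
    ON SCHEDULE EVERY 1 DAY
    DO
      REPLACE INTO warehouse.customer_orders (customer_id, customer_name, order_total)
      SELECT a.customer_id, a.name, b.total
      FROM   db_a.customers a
      JOIN   db_b.orders    b ON b.customer_id = a.customer_id;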
I have to compare the tables and fields of two databases (not their data).
Say a live DB and a development DB. The live DB has 200 tables and the development DB has 218 tables, with some new columns added to the old tables as well.
In the final stage I have to update the live DB by adding the new columns and tables taken from the development DB.
I must not lose any data in the old DB.
I have tried many ways to do this, but it is taking me a very long time. One of the tools I used is MySQL Workbench.
Are there any queries to do this using the information schema?
To make things easier I exported the development DB from the development server and loaded it into the live server under a different name. Both databases are now on the same server but with different names, e.g. sitedb and sitedevdb.
Whenever you update the development DB with changes that will eventually be put live, you should write a .sql script that will repeat the changes on the live database.
Otherwise this operation is very tricky: you may have to figure out the added columns manually, or you may be able to use each database's INFORMATION_SCHEMA.COLUMNS to compare the old tables.
For the new tables, SHOW CREATE TABLE table_name is really useful.
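Since both schemas now sit on the same server (sitedb and sitedevdb above), the INFORMATION_SCHEMA comparison can be done with a couple of queries along these lines; this is only a sketch of the idea:

    -- Columns that exist in the dev schema but not in the live schema.
    SELECT dev.TABLE_NAME, dev.COLUMN_NAME, dev.COLUMN_TYPE
    FROM   INFORMATION_SCHEMA.COLUMNS dev
    LEFT JOIN INFORMATION_SCHEMA.COLUMNS live
           ON  live.TABLE_SCHEMA = 'sitedb'
           AND live.TABLE_NAME   = dev.TABLE_NAME
           AND live.COLUMN_NAME  = dev.COLUMN_NAME
    WHERE  dev.TABLE_SCHEMA = 'sitedevdb'
      AND  live.COLUMN_NAME IS NULL;

    -- Tables that exist only in the dev schema.
    SELECT dev.TABLE_NAME
    FROM   INFORMATION_SCHEMA.TABLES dev
    LEFT JOIN INFORMATION_SCHEMA.TABLES live
           ON  live.TABLE_SCHEMA = 'sitedb'
           AND live.TABLE_NAME   = dev.TABLE_NAME
    WHERE  dev.TABLE_SCHEMA = 'sitedevdb'
      AND  live.TABLE_NAME IS NULL;

For each table that exists only in dev, SHOW CREATE TABLE sitedevdb.table_name then gives you the DDL to run against the live schema; the new columns still need hand-written ALTER TABLE statements.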
A MySQL comparison tool is probably what you need for this. The company I work for, Redgate, offers MySQL Compare (for the database schema) and MySQL Data Compare (for data).
These tools are free for non-commercial use.
I have two databases, one for production and another for staging.
I have made many changes to the staging database, and I want production to have the same structure as staging without dropping any tables or losing any data.
I want a way to generate ALTER TABLE statements from the staging database and apply them to the production database. Note that the tables have many columns, so I don't want to do this manually.
You can use the SQLyog trial or SQLyog Community edition; with this tool you will be able to apply the changes to the production database using its Schema and Data Synchronization feature.
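Whatever tool generates the diff, what ultimately runs against production is a script of ALTER TABLE and CREATE TABLE statements along these lines (only a sketch; the table and column names are made up):

    -- New columns added to an existing table, without touching its data.
    ALTER TABLE customers
        ADD COLUMN loyalty_points INT NOT NULL DEFAULT 0,
        ADD COLUMN last_login     DATETIME NULL;

    -- A table that exists only in staging.
    CREATE TABLE IF NOT EXISTS audit_log (
        id         BIGINT AUTO_INCREMENT PRIMARY KEY,
        table_name VARCHAR(64) NOT NULL,
        changed_at DATETIME    NOT NULL
    );

Reviewing the generated script before running it is the easiest way to make sure nothing in it drops a table or column.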
We are handling a data aggregation project in which several Microsoft SQL Server databases are combined into one MySQL database. All of the MSSQL databases have the same schema.
The requirements are:
Each MSSQL database can be imported into MySQL independently.
Before importing each record into MySQL, we need to validate it against specific criteria via PHP.
Each imported MSSQL database can be rolled back: even after it has been imported into MySQL, all of its data can be removed from MySQL again.
We would also like to know which MSSQL database each record imported into MySQL came from.
The whole import process will be done with PHP.
We are having difficulty with many aspects of this and don't know the best approach to solve the problem.
Your help would be highly appreciated.
PS: each MSSQL database has around 60 tables, and each table can have a few hundred thousand rows.
Don't use PHP as a database administration utility. Any time you build a quick PHP script to transfer records directly from one database to another, you're going to cause yourself a world of hurt when that script becomes required for production operation.
You have a number of problems that you need solved:
You have multiple MSSQL databases with similar if not identical tables.
You have a single MySQL database that you want to merge the data into.
The imported data must be altered in a specific way before being merged.
You want to prevent all duplicate records in your import.
You want to know what database each record originally came from.
The solution?
1. Analyze the source MSSQL databases and create a merge strategy for them.
2. Create a database structure on the MySQL database that fits the merge strategy in #1, including all the new key constraints (like unique and foreign keys) required for the consolidation.
At this point you have two options left:
Dump the data from each source database to raw files using your RDBMS administration utility of choice, alter that data to fit your merge strategy and constraints, document the process, and then merge all of the data into your new database structure.
Use a tool like opendbcopy to map columns from one database to another and run a mass import.
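One schema detail worth sketching, since the question asks for per-source rollback and provenance: tag every imported row with the MSSQL database it came from. A minimal sketch on the MySQL side; the table and column names are only examples:

    -- One row per imported MSSQL database.
    CREATE TABLE import_source (
        source_id   INT AUTO_INCREMENT PRIMARY KEY,
        source_name VARCHAR(100) NOT NULL UNIQUE,   -- e.g. the MSSQL database name
        imported_at DATETIME     NOT NULL
    );

    -- Every consolidated table carries a reference to its source.
    CREATE TABLE customer (
        customer_id BIGINT AUTO_INCREMENT PRIMARY KEY,
        source_id   INT          NOT NULL,
        source_pk   BIGINT       NOT NULL,           -- the row's key in the source database
        name        VARCHAR(200) NOT NULL,
        UNIQUE KEY uq_source_row (source_id, source_pk),   -- blocks duplicate imports
        FOREIGN KEY (source_id) REFERENCES import_source (source_id)
    );

    -- Rolling back one imported MSSQL database is then a single delete per table.
    DELETE FROM customer WHERE source_id = 3;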
Hope this helps.