Need help exporting a MySQL database and re-importing it? - mysql

I'm running into some trouble exporting my Magento 1.7 database (MySQL). I've fully exported everything as-is using MySQL Workbench; however, when I try to import it into another Magento installation, it fails.
Is there anything specific I should be looking for? And do the database names on the old and new Magento installations need to be the same? If so, would the keys need to match?
For info: I'm running Magento 1.7 with a broken core installation (our last tech guy wasn't too good) and trying to re-import the database into a 1.9 installation. I've tested individual tables and they import just fine, but exporting and importing every table through phpMyAdmin is something that WILL take a long time, especially as we have a constantly growing user base and customer base.
Any help would be really, really helpful.
Thank you in advance!
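For what it's worth, a whole-database dump from the command line usually sidesteps the table-by-table problem: mysqldump writes a header that disables foreign-key checks during the load, so constraint order stops mattering, and because a single-schema dump contains no CREATE DATABASE or USE statement, the old and new database names do not need to match (the keys travel with the data either way). A rough sketch, with hypothetical credentials and schema names:

# Sketch: dump the whole Magento schema and restore it elsewhere.
# Credentials and schema names below are placeholders.
import subprocess

SRC_DB, DST_DB = "magento17", "magento19"

# Dump schema + data (plus routines/triggers) into one file. The dump's
# standard header temporarily disables foreign-key checks on load.
with open("magento.sql", "w") as out:
    subprocess.run(
        ["mysqldump", "--user=root", "--password=secret",
         "--single-transaction", "--routines", "--triggers", SRC_DB],
        stdout=out, check=True)

# There is no CREATE DATABASE/USE inside the dump, so the target schema
# name is free to differ from the source.
with open("magento.sql") as dump:
    subprocess.run(
        ["mysql", "--user=root", "--password=secret", DST_DB],
        stdin=dump, check=True)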

Related

Migrate Table Data Only from MSSQL to MySQL

Been tasked with moving a code-first database from MSSQL to MySQL. After a few hours of kung fu, I was able to get the ASP.NET Core project to properly deploy all migrations to MySQL. Now I need to migrate the data inside the existing MSSQL tables. I saw posts mentioning the MySQL Migration Toolkit, but that appears to be old. I also attempted it with MySQL Workbench and DBLoad's Data Loader but haven't had any luck.
The table structure is pretty simple: incremental integer keys plus the usual crap from the ASP.NET Core Identity framework (GUIDs). I just need to keep those consistent during the migration. What is the best way to migrate the data now that the table structure is set up in MySQL? Any recommendations would be greatly appreciated!
Update: Some more details...
I attempted a direct migration of the database from MSSQL to MySQL using MySQL Workbench and DBLoader, but it failed big time on the ASP.NET Identity tables, plus other issues; the .NET Core API took a huge dump in multiple places, so that idea is out.
From that point, I migrated the API controller over to MySQL and then had to fix a myriad of issues, mostly MSSQL foreign-key names being too long, plus a few others.
So at this point, the controller works on mysql. I just need to dump all of the data into MySQL and keep the FKs consistent.
I have had a few thoughts, such as CSV export/import, and/or trying a few other things. Any recommendations?
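Since the schema is already deployed by the migrations, one low-tech option is a row-by-row copy over two connections, inserting the existing identity and GUID key values verbatim (MySQL accepts explicit values for auto_increment columns). A sketch, where the connection details and table list are placeholders:

# Sketch: copy rows table by table from MSSQL to MySQL, preserving
# integer identity and Identity-framework GUID keys exactly as-is.
# Connection details and the table list are placeholders.
import pyodbc
import mysql.connector

src = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                     "SERVER=localhost;DATABASE=app;UID=sa;PWD=secret")
dst = mysql.connector.connect(user="root", password="secret", database="app")

# List parents before children so foreign keys resolve, or run
# SET FOREIGN_KEY_CHECKS=0 on the MySQL session first.
TABLES = ["AspNetRoles", "AspNetUsers", "AspNetUserRoles", "Orders"]

for table in TABLES:
    rows = src.cursor().execute(f"SELECT * FROM [{table}]")
    cols = ", ".join(c[0] for c in rows.description)
    marks = ", ".join(["%s"] * len(rows.description))
    ins = dst.cursor()
    while True:
        batch = rows.fetchmany(1000)
        if not batch:
            break
        ins.executemany(
            f"INSERT INTO `{table}` ({cols}) VALUES ({marks})",
            [tuple(r) for r in batch])
    dst.commit()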
You can use the Data Export wizard built into SQL Server Management Studio (Tasks -> Export Data),
or use an SSIS package to migrate the data.
I tried MySQL Workbench, DBLoad from DBLoad.com, etc., to migrate the data directly, and they all failed.
So I ended up finding a "solution"...
First off, I modified the ASP.NET Core project with:
// The original SQL Server registration, commented out:
//services.AddDbContext<ApplicationDbContext>(options =>
//    options.UseSqlServer(
//        Configuration.GetConnectionString("DefaultConnection")));

// Registered the MySQL provider instead (same connection-string key):
services.AddDbContext<ApplicationDbContext>(options =>
    options.UseMySql(
        Configuration.GetConnectionString("DefaultConnection")));
Then went to Package Manager Console and ran: update-database
This created all of the tables in MySQL.
Then I opened Microsoft SQL Server Management Studio, right-clicked the database, and chose Tasks > Generate Scripts. I saved everything to one file with the Advanced option set to script both schema and data.
Then I opened the db script in Notepad++ and applied the following edits using Replace with Extended search mode enabled:
GO -> blank
[dbo]. -> blank
[ -> blank
] -> blank
)\r\n -> );\r\n
\r\n' -> '
DateTime2 -> DATETIME
After the edits were made in Notepad++, I removed all of the SET, ALTER, and CREATE related stuff from the text file, then copied all of the INSERT lines into MySQL Workbench in an order that ensured each table's foreign-key targets were populated before that table's data was inserted. Thank goodness there were no stored procedures to deal with! What a pain in the tuchus!
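For anyone who has to repeat this, the Notepad++ passes are easy to script. A rough equivalent of the edits above (file names are placeholders, and note that the blind GO replace also hits "GO" occurring inside data, exactly like the manual pass did):

# Sketch: the same find/replace passes as the Notepad++ workflow above.
# File names are placeholders.
REPLACEMENTS = [
    ("GO", ""),            # strip batch separators (also hits "GO" in data!)
    ("[dbo].", ""),        # drop the SQL Server schema prefix
    ("[", ""), ("]", ""),  # MySQL does not use bracket quoting
    (")\r\n", ");\r\n"),   # terminate each INSERT with a semicolon
    ("\r\n'", "'"),        # rejoin string literals split across lines
    ("DateTime2", "DATETIME"),
]

with open("mssql_script.sql", newline="") as f:  # newline="" keeps \r\n
    text = f.read()
for old, new in REPLACEMENTS:
    text = text.replace(old, new)

# Keep only the INSERT statements; the SET/ALTER/CREATE lines were
# deleted by hand in the original workflow.
inserts = [l for l in text.splitlines() if l.lstrip().startswith("INSERT")]
with open("mysql_inserts.sql", "w") as f:
    f.write("\n".join(inserts) + "\n")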
Also, on a side note: the app is hosted on Azure, and I spent a couple of hours fighting the API not connecting to the database. This was not apparent at first because the app was spuriously returning a 404 error to Postman. When I first attempted to wire the API controllers up to the MySQL DB, I entered the database connection string into Azure's App Service Configuration. It didn't work at all, even though running the app locally worked fine. I eventually found another post on this site saying to remove the database connection string from the App Service > Configuration window. Worked like a champ after that, and all of the data with its auto-incremented keys linked up without issue.
I am very pleased with the results and hope I never have to go through this process again! It is always a nice feeling to know an app now runs on a completely open source infrastructure. Hope this helps someone. Good luck.

Migrating FileMaker to MySQL problems

I'm having great difficulty migrating a data set of about 4,750 records from FileMaker to MySQL. I have tried exporting via CSV, but it misses out large chunks of data from a couple of columns, and I have tried exporting via ODBC, but I cannot get the records to copy over to the shadow MySQL table.
I have also tried going the other direction with ODBC, attempting to get FileMaker data imported into MySQL using MySQL Workbench, but I run into a 'UserID missing' error each time.
Any suggestions will be greatly appreciated before I take a sledgehammer to the thing.
Thanks!
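One thing worth checking before reaching for the sledgehammer: FileMaker exports in-field line breaks as vertical-tab characters (0x0B), which many CSV parsers silently mangle, and that can look exactly like large chunks of a column going missing. A sketch of a cleanup-and-load pass (the file, table, and column names are placeholders):

# Sketch: FileMaker exports in-field line breaks as vertical tabs (0x0B),
# which many CSV loaders mishandle. Convert them back to newlines while
# importing. File, table, and column names are placeholders.
import csv
import mysql.connector

conn = mysql.connector.connect(user="root", password="secret", database="app")
cur = conn.cursor()

with open("export.csv", newline="", encoding="utf-8") as f:
    for row in csv.reader(f):
        row = [field.replace("\x0b", "\n") for field in row]
        cur.execute(
            "INSERT INTO contacts (name, address, notes) VALUES (%s, %s, %s)",
            row)
conn.commit()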

Exporting from FileMaker to MySQL... in 2014

I have FM11A (client only for now, but the project should run on a FileMaker Server, either 11 or 12/13) on a WinXP machine where I also run a MySQL server (5.5) for testing purposes.
I have a database fully working in FileMaker, and I am developing a mirror MySQL database to go online.
My aim is to be able to perform bidirectional sync between FM and MySQL while leaving the two databases as entirely independent entities (so I would avoid having FM write directly to MySQL tables, i.e., having FM as front-end and MySQL as back-end).
I have been able to import the MySQL table (demographics) into the FM database (where another 'demographics' table is present). The two tables have exactly the same fields, and importing from MySQL-Demogr into FM-Demogr using ODBC/ESS works very well.
When I open FM and import records from MySQL using the shadow table, everything goes smoothly and I can see the new records in the original FM table, as I wished.
Of note, I am also able to write data directly to the MySQL-Demogr table from FM by writing to the shadow table.
The problem comes when I try to export FM data to MySQL: apparently the ODBC/ESS system works very well in one direction (FM importing from *SQL) but not the other (FM exporting to *SQL). I am still trying to figure out the most efficient (i.e., easy/quick and scalable) way to export records originally entered in FM into MySQL.
The old way would be to script an export to a .csv file from FM, then load the new data into MySQL, maybe using a temporary table inside MySQL (a sketch of this route appears after this question). This is supposed to be very quick and is absolutely doable, although I would rather use ODBC/ESS if at all possible.
The easiest way would be to export directly from FM into MySQL using the shadow table, but it does not work:
a. Exporting from FM to the same file or to an ODBC source (MySQL) is apparently not possible (can you please confirm?).
b. When I open the MySQL shadow table (MySQL-Demogr) from inside FM and import new records (this time going from FM-Demogr --> MySQL-Demogr), it says records have been added in MySQL, but in fact nothing happens, and when I check MySQL the table is unchanged.
Another option is to use FileMaker, with or without a specific plugin, to run an SQL query and let it access the MySQL-Demogr shadow table through ODBC. I have looked into some examples available online, and this is not entirely clear-cut to me, but I was reviewing posts from 2003-2009, apparently from the pre-ESS era. Maybe with the new ExecuteSQL script step things are a little more straightforward now? If you have any advice on specific plugins (under $100) that could help me with this, I am also interested in making the investment.
Finally, I could use a third package (either Excel or SQLyog) to run the SQL for me, connecting the two databases (FM and MySQL), and make the script run on a regular basis. No issues with that, but I would rather keep everything inside FM-MySQL if at all possible.
Thank you very much in advance for your help.
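A sketch of the CSV-plus-temporary-table route from the first option above, assuming LOCAL INFILE is enabled on both client and server; the upsert column names (name, dob) are placeholders:

# Sketch of the staging-table route: load the FileMaker CSV export into
# a temporary MySQL table, then upsert into the real table so re-runs
# are safe. Column names in the upsert are placeholders; the server must
# also have local_infile enabled.
import mysql.connector

conn = mysql.connector.connect(user="root", password="secret",
                               database="app", allow_local_infile=True)
cur = conn.cursor()

cur.execute("CREATE TEMPORARY TABLE demogr_stage LIKE demographics")
cur.execute("""
    LOAD DATA LOCAL INFILE 'fm_export.csv'
    INTO TABLE demogr_stage
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\r\\n'
""")
# Insert new rows; update existing ones on primary-key collision.
cur.execute("""
    INSERT INTO demographics
    SELECT * FROM demogr_stage AS s
    ON DUPLICATE KEY UPDATE name = s.name, dob = s.dob
""")
conn.commit()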
Since you mentioned you're open to 3rd party software, I'll mention MirrorSync (http://mirrorsync.com), which can do what you need. Disclaimer: I am the author of the software, so I am obviously biased ;-)

Merge two databases with the missing contents

I recently migrated a website to a new Linux server, but the database we imported was seven days older than the live one. Four days have now passed since the migration, so the new server's database contains four days of new updates written on top of that old copy. We simply forgot about the missing seven days of updates on the old server at the time of the DNS change.
Now the website has a big problem because those seven days of database content are missing, and if we import the old database from the old server onto the new server, the latest four days of updates on the new server will be lost. I am stuck in the middle of this.
Can you please suggest a good way to merge these databases into one without anything being lost or overwritten, so that we can load the result onto the new server and the site will run fine? In short, we need to merge the two databases, recovering the missing content, and the site should keep working. Help me please.
This is going to depend a lot on your data structure (how many auto_increment fields you have, foreign keys, etc.). You will need to dump out everything that was added to the database after the export but prior to the migration, then find a way to import each record.
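If the tables carry an insert timestamp, that replay can be scripted. A sketch under that assumption (the hosts, credentials, cutoff, and table list are all placeholders, and any rows whose auto_increment ids were meanwhile reused on the new server still need manual remapping):

# Sketch: pull the rows the old server gained during the missing window
# and replay them on the new server. Assumes each table has a created_at
# column to select on (a placeholder assumption); tables without a
# timestamp need manual diffing.
import mysql.connector

old = mysql.connector.connect(host="old.example.com", user="root",
                              password="secret", database="site")
new = mysql.connector.connect(host="new.example.com", user="root",
                              password="secret", database="site")

CUTOFF = "2014-01-01 00:00:00"   # time of the stale export, placeholder

src, dst = old.cursor(), new.cursor()
for table in ["posts", "comments"]:             # placeholder table list
    src.execute(f"SELECT * FROM {table} WHERE created_at >= %s", (CUTOFF,))
    cols = ", ".join(c[0] for c in src.description)
    marks = ", ".join(["%s"] * len(src.description))
    # INSERT IGNORE skips rows whose auto_increment id was already
    # reused on the new server; those collisions must be fixed by hand.
    dst.executemany(
        f"INSERT IGNORE INTO {table} ({cols}) VALUES ({marks})",
        src.fetchall())
new.commit()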
What is stored in the database?

How to import a .dmp file (Oracle) into a MySQL DB?

The .dmp file is a dump of a table built in Oracle 10g (Express Edition), and one of the fields is of CLOB type.
I was trying to simply export the table to XML/CSV files and then import those into MySQL, but the export simply ignored the CLOB field... (I was using SQL Developer for that.)
I noticed this post explaining how to extract the CLOB to a text file, but it seems to miss the handling of the other fields, or at least the primary-key fields. Can it be adapted to create a CSV of the complete table? (I am not familiar with PL/SQL at all.)
As the brute-force approach, I can use my Python interface to simply query all the records and spool them to a flat file, but I'm afraid it will take a LOOOONG time (query all records, replace all native commas with the ASCII escape...).
Thanks guys!
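For the record, the brute-force Python route is less painful than it sounds if the csv module does the quoting (it handles embedded commas and newlines itself), and cx_Oracle can be told to fetch CLOBs as plain strings. A sketch, with hypothetical connection details and table/column names:

# Sketch: spool an Oracle table, CLOB included, to a CSV file.
# Connection string and table/column names are placeholders.
import csv
import cx_Oracle

conn = cx_Oracle.connect("user/secret@localhost/XE")
cur = conn.cursor()

# Fetch CLOB columns as plain strings instead of LOB locators.
def clob_as_string(cursor, name, default_type, size, precision, scale):
    if default_type == cx_Oracle.CLOB:
        return cursor.var(cx_Oracle.LONG_STRING, arraysize=cursor.arraysize)

cur.outputtypehandler = clob_as_string
cur.execute("SELECT id, title, body FROM documents")  # body is the CLOB

with open("documents.csv", "w", newline="") as f:
    writer = csv.writer(f)  # quotes embedded commas/newlines for us
    writer.writerow(d[0] for d in cur.description)  # header row
    writer.writerows(cur)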
If you can get the MySQL server and the Oracle server on the same network, you might want to look at the MySQL administrator tools, which include the Migration Toolkit. You can connect to the Oracle server with the Migration Toolkit, and it will automatically create tables and move the data for you.
Here is a white paper explaining the migration process: http://www.mysql.com/why-mysql/white-papers/mysql_wp_oracle2mysql.php
And you can use Data Wizard for MySQL. The trial version is fully usable for 30 days.
After about 2 hours of installing and uninstalling MySQL on the same machine (my laptop) in order to use the Migration Toolkit as suggested by longneck, I decided to simply implement the dump myself. Here it is, for the likes of me who have minimal admin experience and a hard time making both DBs work together (errors 1130, 1045, and more).
Surprisingly, it is not as slow as I expected: OraDump
Any comments and improvements are welcomed.