Migrating FileMaker to MySQL problems

I'm having great difficulty migrating a data set of about 4,750 records from FileMaker to MySQL. I have tried exporting via CSV, but it drops large chunks of data from a couple of columns, and I have tried exporting via ODBC, but cannot get the records to copy over to the shadow MySQL table.
I have also tried going the other direction with ODBC, attempting to get FileMaker to import into MySQL using MySQL Workbench, but I run into a "UserID missing" error each time.
Any suggestions will be greatly appreciated before I take a sledgehammer to the thing.
Thanks!
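A plausible culprit for the missing CSV chunks, offered as an assumption rather than a diagnosis: FileMaker text fields often contain embedded line breaks, which naive CSV handling silently truncates. Python's csv module copes with quoted line breaks, so a small loader doubles as a sanity check; the file, table, and column names below are hypothetical:

```python
import csv
import mysql.connector  # pip install mysql-connector-python

# newline="" is required so the csv module can handle line breaks
# embedded inside quoted fields -- a common feature of FileMaker exports.
with open("filemaker_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.reader(f))

print(f"parsed {len(rows)} records")  # expect ~4750 if nothing was truncated

conn = mysql.connector.connect(host="localhost", user="me",
                               password="secret", database="target")
cur = conn.cursor()
# Hypothetical two-column table; match the placeholders to your export.
cur.executemany("INSERT INTO contacts (name, notes) VALUES (%s, %s)", rows)
conn.commit()
conn.close()
```

If the parsed record count already falls short of ~4,750, the truncation is happening at export time rather than import time.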

Related

Need help to export SQL and re-import?

I'm running into some trouble exporting my Magento 1.7 database (MySQL). I've fully exported everything as-is using MySQL Workbench; however, when I try to import it into another Magento installation, it fails.
Is there anything specific I should be looking for? And do the database names on the old and new Magento installations need to be the same? If so, would the keys need to match?
For info: I'm coming from a Magento 1.7 installation with a broken core (our last tech guy wasn't too good) and trying to re-import the database into a 1.9 installation. I've tested individual tables and they import just fine; however, using phpMyAdmin to export and import every table one by one will take a long time, especially as we have a constantly growing user base and customer base.
Any help would be really, really helpful.
Thank you in advance!
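As a starting point, the usual route for this is a full mysqldump piped straight into the new database; the database names do not need to match, because you name the target at load time. Here is a minimal sketch in Python, where the host names, credentials, and database names are all hypothetical:

```python
import subprocess

# Dump the old 1.7 database and pipe it straight into the new one.
# The target name ("magento19") is chosen here, so the names need not match.
dump = subprocess.Popen(
    ["mysqldump", "--single-transaction", "-h", "oldhost",
     "-u", "magento", "-pSECRET", "magento17"],
    stdout=subprocess.PIPE,
)
subprocess.run(
    ["mysql", "-h", "newhost", "-u", "magento", "-pSECRET", "magento19"],
    stdin=dump.stdout, check=True,
)
dump.stdout.close()
dump.wait()
```

Note that mysqldump's output disables foreign-key checks during the load by default, which avoids the most common import-order failures; and restoring a 1.7 schema under a 1.9 codebase still leaves Magento's own upgrade scripts to run afterwards.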

Publishing FileMaker (database software) data on a WordPress website

We would like to be able to publish FileMaker data on our WordPress website. The website is up and running and the FileMaker database is set up. We do not need a live connection between the two systems, so we chose to export the FM data to .csv so we can import it into the MySQL database on the server, and from there we would like to display it on the website.
Now for my questions, since this kind of development is new to us:
Can I set up an automated import into the MySQL database from a source like Dropbox? For example, can we make MySQL import and overwrite the existing data every 24 hours from a .csv file located somewhere? We need this automated overwrite option because the FM data changes often and we need up-to-date info on the website. (A sketch of such a script follows below.)
How can we display the data from the MySQL database on the WP front end?
I've been looking into this myself and couldn't find any clear answers or guides. Can you guys point me in the right direction?
(BTW, I know there are table plugins I can use for WP, but they do not fulfill our needs, and I think it's exciting to do it all ourselves with help from this great community.)
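For the automated 24-hour overwrite, one approach is a small Python script run from cron that downloads the .csv and reloads the table. This is a hedged sketch, not a drop-in solution: the URL, file path, table name, and credentials are all hypothetical, and the MySQL server must have local_infile enabled.

```python
import urllib.request
import mysql.connector  # pip install mysql-connector-python

CSV_URL = "https://example.com/fm_export.csv"  # hypothetical direct-download link
CSV_PATH = "/tmp/fm_export.csv"

# Fetch the latest FileMaker export.
urllib.request.urlretrieve(CSV_URL, CSV_PATH)

conn = mysql.connector.connect(host="localhost", user="wp", password="secret",
                               database="wordpress", allow_local_infile=True)
cur = conn.cursor()
cur.execute("TRUNCATE TABLE fm_data")  # overwrite: discard yesterday's rows
cur.execute(
    f"LOAD DATA LOCAL INFILE '{CSV_PATH}' INTO TABLE fm_data "
    "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
    "IGNORE 1 LINES"  # assumes the export has a header row
)
conn.commit()
conn.close()
```

Scheduled with a crontab entry such as `0 3 * * * /usr/bin/python3 /opt/refresh_fm.py`, this refreshes the table nightly. Displaying the result on the WP front end is then an ordinary database query from the theme or a small plugin.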
Update 01
I've successfully connected FM with my MySQL DB using ODBC and can now select tables from the MySQL DB in FM's relationships graph.
I was wondering how I can write the data from my existing FM file to the MySQL DB using ODBC. Can anybody help me with this?
I would like to get the data into some MySQL tables so I can fetch it using PHP on my website.
Thanks!
It is possible to write directly into (and read from) a remote MySQL database from FileMaker via ODBC.
You need a MySQL account that allows remote access; there are providers where this is not allowed.
The ODBC driver needs to be installed on the local box. On Windows you can use the open-source version (http://dev.mysql.com/downloads/connector/odbc/); on Mac it works better with the Actual Technologies drivers (http://www.actualtech.com/de/product_opensourcedatabases.php).
Set up an ODBC system DSN (not a user DSN). Be sure to use the 32-bit ODBC manager on Windows.
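Before going back into FileMaker, it can save time to sanity-check the DSN from the local box. A minimal sketch using pyodbc, where the DSN name and credentials are hypothetical:

```python
import pyodbc  # pip install pyodbc

# "mysql_dsn" is a hypothetical system DSN name; use whatever you configured.
conn = pyodbc.connect("DSN=mysql_dsn;UID=webuser;PWD=secret")
cur = conn.cursor()
cur.execute("SELECT VERSION()")  # a trivial round trip proves the DSN works
print(cur.fetchone()[0])
conn.close()
```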
Now you can create the external data source within FileMaker and read from and write to the MySQL tables.
Once you have made the connection to the MySQL database and can see the shadow tables, you can write to the fields directly via FileMaker layouts. It's as simple as that.
Once the layout contains the fields from the MySQL database, you can move through records and find data just as if it were native to your FM database. Of course, for more automated processing, you can create scripts, relationships, etc., and manipulate or synchronise data. Be warned, though: the connection speed can limit complex relationships and large databases. I would advise baby steps.

How to get tables, views, etc. in a MySQL database server from a Firebird database server

There are two different database servers on different platforms: MySQL and Firebird.
I need to get database tables from the Firebird server (synchronously or asynchronously) to the MySQL server.
Timing is not that important.
How do I get, for example, a specific database table from the Firebird server to the MySQL server?
Any ideas would be very helpful, thank you.
So essentially you would need to develop some sort of process that will extract, transform, and load the data from Firebird into MySQL. The problem is that neither Firebird nor MySQL includes a real ETL tool by default (MSSQL, for example, has SSIS). Since you can't access Firebird directly from MySQL code, you will need some external tool to do the job.
Thus you will either have to code this tool yourself or use an open-source one.
There are several ETL tools that are open source and free which you might want to investigate, such as Pentaho and CloverETL; see this link for more information.
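If you do code it yourself, the extract-and-load core can be quite small. A hedged sketch assuming the fdb driver for Firebird and mysql-connector-python for MySQL; the DSN, table, and column names are hypothetical:

```python
import fdb              # pip install fdb (Firebird driver)
import mysql.connector  # pip install mysql-connector-python

# Extract: read the rows out of Firebird.
src = fdb.connect(dsn="fbhost:/data/mydb.fdb", user="SYSDBA", password="masterkey")
src_cur = src.cursor()
src_cur.execute("SELECT id, name, amount FROM sales")  # hypothetical table
rows = src_cur.fetchall()
src.close()

# Load: insert the rows into MySQL in one batch.
dst = mysql.connector.connect(host="mysqlhost", user="etl",
                              password="secret", database="target")
dst_cur = dst.cursor()
dst_cur.executemany(
    "INSERT INTO sales (id, name, amount) VALUES (%s, %s, %s)", rows)
dst.commit()
dst.close()
```

Since timing is not important here, running something like this from cron gives a simple one-way sync; any transformation step slots in between the fetch and the insert.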
You said you had already found a solution for importing CSV into MySQL, so the missing piece is Firebird CSV export.
Well, those three words entered into Google give the free FBExport tool as the immediate top result. See http://www.firebirdfaq.org/fbexport.php

Copy data from PostgreSQL to MySQL

I have encountered a problem where I need to copy only the data from a PostgreSQL database to a MySQL database. I already have the MySQL database with empty tables. Using pgAdmin I made a backup (data only, without the database schema). I tried using the psql tool, but it keeps giving a segmentation fault which I can't fix at the moment. I am using Ubuntu. Any simple guide to copying the data would be highly appreciated.
Use Postgres COPY and MySQL LOAD DATA INFILE.
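A minimal sketch of that route in Python, using psycopg2 on the Postgres side and mysql-connector-python on the MySQL side; the connection details and the table name are hypothetical, and the MySQL server must allow local_infile:

```python
import psycopg2         # pip install psycopg2-binary
import mysql.connector  # pip install mysql-connector-python

CSV_PATH = "/tmp/items.csv"

# Export: Postgres COPY writes the table out as CSV.
pg = psycopg2.connect(host="pghost", dbname="source", user="me", password="secret")
with pg.cursor() as cur, open(CSV_PATH, "w") as f:
    cur.copy_expert("COPY items TO STDOUT WITH (FORMAT csv)", f)  # hypothetical table
pg.close()

# Import: MySQL LOAD DATA reads the same file back in.
my = mysql.connector.connect(host="myhost", user="me", password="secret",
                             database="target", allow_local_infile=True)
cur = my.cursor()
cur.execute(
    f"LOAD DATA LOCAL INFILE '{CSV_PATH}' INTO TABLE items "
    "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'"
)
my.commit()
my.close()
```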
psql will crash with an out-of-memory error if you try to display a few million rows, because it fetches all the data up front to determine the column widths for a prettier display. If you intend to use psql to fetch lots of data, disable this behaviour (for example with \set FETCH_COUNT 1000, which makes psql retrieve rows in batches instead).
You could try:
http://www.lightbox.ca/pg2mysql.php
It looks like you might be trying to load data into MySQL with the Postgres client tool. Use the MySQL client tools to manipulate data in the MySQL server.
mysql client programs
How can you move data into MySQL if you have errors reading from psql?
Fix that error first, then ask this question.

How to import a .dmp file (Oracle) into a MySQL DB?

The .dmp is a dump of a table built in Oracle 10g (Express Edition), and one of the fields is of CLOB type.
I was trying to simply export the table to XML/CSV files and then import them into MySQL, but the export simply ignored the CLOB field... (I was using SQL Developer for that.)
I noticed this post explaining how to extract a CLOB to a text file, but it seems to miss the handling of the other fields, or at least the primary key fields. Can it be adapted to create a CSV of the complete table? (I am not familiar with PL/SQL at all.)
As a brute-force approach, I can use my Python interface to simply query all the records and spool them to a flat file, but I'm afraid it will take a LOOOONG time (query all records, replace all native commas with the ASCII escape...).
Thanks guys!
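For what it's worth, the brute-force Python route need not be painful: the csv module quotes embedded commas and newlines automatically, and cx_Oracle can be told to fetch CLOBs as plain strings. A hedged sketch in which the credentials and the table are hypothetical:

```python
import csv
import cx_Oracle  # pip install cx_Oracle

# Fetch CLOB columns as ordinary strings instead of LOB locators.
def clob_as_string(cursor, name, default_type, size, precision, scale):
    if default_type == cx_Oracle.CLOB:
        return cursor.var(cx_Oracle.LONG_STRING, arraysize=cursor.arraysize)

conn = cx_Oracle.connect("scott", "tiger", "localhost/XE")  # hypothetical credentials
conn.outputtypehandler = clob_as_string
cur = conn.cursor()
cur.execute("SELECT id, title, body FROM documents")  # hypothetical table

# csv.writer quotes embedded commas/newlines, so no manual escaping is needed.
with open("documents.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([d[0] for d in cur.description])  # header row
    writer.writerows(cur)
conn.close()
```

The resulting file can then go into MySQL with LOAD DATA INFILE, much as in the PostgreSQL answer above.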
If you can get the MySQL server and the Oracle server on the same network, you might want to look at the MySQL administrator tools, which include the Migration Toolkit. You can connect to the Oracle server with the Migration Toolkit and it will automatically create the tables and move the data for you.
Here is a white paper explaining the migration process: http://www.mysql.com/why-mysql/white-papers/mysql_wp_oracle2mysql.php
You can also use Data Wizard for MySQL. The trial version is fully usable for 30 days.
After about two hours of installing and uninstalling MySQL on the same machine (my laptop) in order to use the Migration Toolkit as suggested by longneck, I decided to simply implement the dump myself. Here it is, for the likes of me who have minimal admin experience and a hard time making both DBs work together (errors 1130, 1045 and more).
Surprisingly, it is not as slow as I expected: OraDump
Any comments and improvements are welcome.