I have FM11A (client only for now, but the project should eventually run on an FM Server, either 11 or 12-13) on a WinXP machine where I also run a MySQL server (5.5) for testing purposes.
I have a database fully working on filemaker and I am developing a mirror MySQL database to go online.
My aim is to be able to perform bidirectional sync between FM and MySQL while leaving the two databases as entirely independent entities (so I would avoid having FM write directly to the MySQL tables, i.e. having FM as front-end and MySQL as back-end).
I have been able to import the MySQL table (demographics) into the FM database (where another 'demographics' table is present). The two tables have exactly the same fields, and importing from MySQL-Demogr into FM-Demogr using ODBC/ESS works very well.
When I open FM and import records from MySQL using the shadow table, everything goes smoothly and I can see the new records in the original FM table, as I wished.
Of note, I am also able to write data directly to the MySQL-Demogr table from FM by writing to the shadow table.
The problem comes when I try to export FM data to MySQL: apparently the ODBC/ESS system works very well in one direction (FM importing from *SQL) but not the other (FM exporting to *SQL). I am still trying to figure out the most efficient (i.e. easy/quick and scalable) way to export records, originally inserted in FM, into MySQL.
The old way would be to script an export to a .csv file from FM, then load the new data into MySQL, maybe using a temporary table inside MySQL. This is supposed to be very quick and is absolutely doable, although I would rather use ODBC/ESS if at all possible.
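For what it's worth, here is roughly what that CSV + temporary-table route could look like when scripted outside FileMaker. This is only a sketch: the table, column, credential, and file names are all invented, and it assumes patient_id is the primary key on both sides.

```python
# Rough sketch (not tested against a live server): load a FileMaker CSV export
# into a MySQL staging table, then merge only the new rows into the real table.
import csv
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(host="localhost", user="fm_sync",
                               password="secret", database="clinic")
cur = conn.cursor()

# 1. Empty the staging table and fill it from the CSV exported by FileMaker.
cur.execute("TRUNCATE TABLE demographics_staging")
with open("fm_export.csv", newline="", encoding="utf-8") as f:
    rows = [(r["patient_id"], r["last_name"], r["first_name"], r["dob"])
            for r in csv.DictReader(f)]
cur.executemany(
    "INSERT INTO demographics_staging (patient_id, last_name, first_name, dob) "
    "VALUES (%s, %s, %s, %s)", rows)

# 2. Copy only the rows MySQL has not seen yet into the real table.
cur.execute(
    "INSERT INTO demographics (patient_id, last_name, first_name, dob) "
    "SELECT s.patient_id, s.last_name, s.first_name, s.dob "
    "FROM demographics_staging s "
    "LEFT JOIN demographics d ON d.patient_id = s.patient_id "
    "WHERE d.patient_id IS NULL")

conn.commit()
cur.close()
conn.close()
```

The staging table keeps the load itself dumb and fast, and the final INSERT ... SELECT decides what is actually new.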
The easiest way would be to export directly from FM into MySQL using the shadow table but it does not work:
a. Exporting from FM to the same file or to an ODBC source (MySQL) is apparently not possible (can you please confirm?)
b. When I open the MySQL shadow table (MySQL-Demogr) from inside FM and import new records (this time going from FM-Demogr --> MySQL-Demogr), it says records have been added in MySQL, but in fact nothing happens, and when I go to MySQL the table is unchanged.
Another option is to use FileMaker, with or without a specific plugin, to run an SQL query and let it access the MySQL-Demogr shadow table through ODBC. I have looked into some examples available online and this is not entirely clear-cut to me, but the posts I was reviewing were from 2003-2009, apparently in the pre-ESS era. Maybe with the new ExecuteSQL script step things are a little more straightforward now? If you have any advice on specific plugins (under $100) that could help me with this, I am also interested in making the investment.
Finally, I could use a third package (either Excel or SQLyog) to run the SQL for me, connecting the two databases (FM and MySQL), and schedule the script to run on a regular basis. No issues with that, but I would rather keep everything inside FM and MySQL if at all possible.
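A sketch of that last option, but kept in a small Python script instead of Excel or SQLyog: read FileMaker through its ODBC driver and push the rows into MySQL. It assumes ODBC/xDBC sharing is enabled on the FM side, that a DSN named FM_Clinic exists, and that the table and column names match; all of those are hypothetical.

```python
import pyodbc            # pip install pyodbc
import mysql.connector   # pip install mysql-connector-python

fm = pyodbc.connect("DSN=FM_Clinic;UID=admin;PWD=secret")
my = mysql.connector.connect(host="localhost", user="fm_sync",
                             password="secret", database="clinic")

fm_cur = fm.cursor()
my_cur = my.cursor()

# Pull everything from the FileMaker side (filter on a modification
# timestamp field instead if the table is large).
fm_cur.execute("SELECT patient_id, last_name, first_name, dob FROM Demographics")

for patient_id, last_name, first_name, dob in fm_cur.fetchall():
    # INSERT ... ON DUPLICATE KEY UPDATE keeps repeated runs idempotent,
    # assuming patient_id is the primary key on the MySQL side.
    my_cur.execute(
        "INSERT INTO demographics (patient_id, last_name, first_name, dob) "
        "VALUES (%s, %s, %s, %s) "
        "ON DUPLICATE KEY UPDATE last_name=VALUES(last_name), "
        "first_name=VALUES(first_name), dob=VALUES(dob)",
        (patient_id, last_name, first_name, dob))

my.commit()
fm.close()
my.close()
```

Run on a schedule (e.g. with Windows Task Scheduler), this keeps the sync logic outside both databases while still going through the ODBC layer only.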
Thank you very much in advance for your help.
Since you mentioned you're open to 3rd party software, I'll mention MirrorSync (http://mirrorsync.com), which can do what you need. Disclaimer: I am the author of the software, so I am obviously biased ;-)
I am using SSIS to move data between a local MSSQL Server table and a remote MySQL table (data flow, OLE DB source and ODBC destination). This works fine if I'm only moving 2 rows of data, but it is very slow with the table I actually want, which has 5000 rows and fits into a CSV of about 3 MB; this currently takes about 3 minutes using SSIS's options, whereas performing the steps below takes 5 seconds at most.
I can export the data to a CSV file, copy it to the remote server, then run a script to import it straight into the DB, but this requires a lot more steps than I would like, as I have multiple tables I wish to do this for.
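For comparison, the whole export-and-import route can be scripted in one place so it scales to several tables without copying files around by hand; the sketch below skips the intermediate CSV and just batches the rows across. Server names, credentials, and the table list are placeholders.

```python
import pyodbc            # pip install pyodbc
import mysql.connector   # pip install mysql-connector-python

TABLES = ["customers", "orders"]   # hypothetical tables to copy

src = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                     "SERVER=localhost;DATABASE=staging;Trusted_Connection=yes")
dst = mysql.connector.connect(host="remote-mysql", user="loader",
                              password="secret", database="reporting")

for table in TABLES:
    src_cur = src.cursor()
    src_cur.execute(f"SELECT * FROM {table}")
    rows = src_cur.fetchall()
    placeholders = ", ".join(["%s"] * len(src_cur.description))

    dst_cur = dst.cursor()
    dst_cur.execute(f"TRUNCATE TABLE {table}")   # full refresh of each table
    dst_cur.executemany(                         # assumes identical column order
        f"INSERT INTO {table} VALUES ({placeholders})",
        [tuple(r) for r in rows])
    dst.commit()

src.close()
dst.close()
```

If you want to keep the SSIS GUI as the orchestrator, a script like this could still be launched from an Execute Process task in the package.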
I have tried row by row and batch processing but both are very slow in comparison.
I know I can use the above steps but I like using the SSIS GUI and would have thought there was a better way of tackling this.
I have googled away multiple times but have not found anything that fits the bill, so I am calling on external opinions.
I understand SSIS has its limitations, but I would hope there is a better and faster way of achieving what I am trying to do. If SSIS is so bad I may as well just rewrite everything into a script and be done with it, but I like the look and feel of the GUI and would like to move my data in this nice, friendly way of seeing things happen.
Any suggestions or opinions would be appreciated.
thank you for your time.
As above, I have tried the SSIS options, including a third-party option (CozyRoc), but that sent some data with errors now and again (the column delimiting seemed off), copied different numbers of rows, and caused enough problems that I don't trust the data.
We are finally moving from Excel and .csv files to databases. Currently, most of my Tableau files are connected to large .csv files (.twbx).
Are there any performance differences between PostgreSQL and MySQL in Tableau? Which would you choose if you were starting from scratch?
Right now, I am using pandas to join files together and creating a new .csv file based on the join. (For example, I take a 10-million-row file, drop duplicates and create a primary key, then join it on the same key with a 5-million-row file, then export the new 'consolidated' file to .csv and connect Tableau to it. Sometimes the joins are complicated, involving dates or times and several columns.)
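For context, the current consolidation step looks roughly like this (file and column names are made up):

```python
import pandas as pd

big = pd.read_csv("transactions_10m.csv")    # ~10M rows
small = pd.read_csv("customers_5m.csv")      # ~5M rows

# Deduplicate and use the resulting column as the join key.
big = big.drop_duplicates(subset=["customer_id"])

consolidated = big.merge(small, on="customer_id", how="left")
consolidated.to_csv("consolidated.csv", index=False)   # Tableau connects to this
```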
I assume I can create a view in a database and then connect to that view rather than creating a separate file, correct? Each of my files could instead be a separate table which should save space and allow me to query dates rather than reading the whole file into memory with pandas.
Some of the people using the RDBMS would be completely new to databases in general (dashboards here are just Excel files, no normalization, formulas in the raw data sheet, etc. It's a mess), so hopefully either choice has good documentation to lessen the learning curve (mainly inserting new data and selecting data, not the actual database design).
Both will work fine with Tableau. In fact, Tableau Server's internal repository runs on Postgres.
Between the two, I think Postgres is more suitable for a central data warehouse. MySQL (prior to version 8.0) doesn't support certain SQL features such as Common Table Expressions and Window Functions.
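As a hedged illustration of what those features buy you: the dedup-and-join step described in the question could live in a view built with a CTE and a window function, and Tableau would then connect straight to the view. The table and column names below are invented.

```python
import psycopg2  # pip install psycopg2-binary

conn = psycopg2.connect("dbname=analytics user=tableau password=secret host=localhost")
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE OR REPLACE VIEW consolidated AS
        WITH latest AS (
            -- keep only the newest row per customer_id
            SELECT t.*,
                   ROW_NUMBER() OVER (PARTITION BY customer_id
                                      ORDER BY updated_at DESC) AS rn
            FROM transactions t
        )
        SELECT l.customer_id, l.amount, l.updated_at,
               c.segment, c.signup_date
        FROM latest l
        LEFT JOIN customers c USING (customer_id)
        WHERE l.rn = 1
    """)
conn.close()
```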
Also, if you’re already using Pandas, Postgres has a built-in Python extension called PL/Python.
However, if you’re looking to store a small amount of data and get to it really fast without using advanced SQL, MySQL would be a fine choice but Postgres will give you a few more options moving forward.
As stated, either database will work and Tableau is basically agnostic to the type of database that you use. Check out https://www.tableau.com/products/techspecs for a full list of all native (inbuilt & optimized) connections that Tableau Server and Desktop offer. But, if your database isn't on that list you can always connect over ODBC.
Personally, I prefer Postgres over MySQL (I find it really easy to use psycopg2 to write to Postgres from Python), but your mileage may vary.
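For example, a minimal psycopg2 write might look like this (the connection details and the sales table are placeholders):

```python
import psycopg2
from psycopg2.extras import execute_values

rows = [(1, "2015-01-05", 19.99), (2, "2015-01-06", 4.50)]  # e.g. built with pandas

conn = psycopg2.connect("dbname=analytics user=etl password=secret host=localhost")
with conn, conn.cursor() as cur:
    execute_values(
        cur,
        "INSERT INTO sales (customer_id, sale_date, amount) VALUES %s",
        rows)   # expands %s into a single multi-row VALUES list
conn.close()
```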
I have made some changes to the design of a backend database which is being used by some clients. Now I need to replace the older backend database file with the new one, but at the same time import all the older records into the newer one too. The design changes are not major and no field names were changed. When I try to import tables from the older to the newer database through External Data->Access, it imports the complete tables instead of just their data. How do I import ONLY the data from the older tables in the older database into the newer database? I tried to design an append query, but couldn't find a way to fetch data from a different database. I am using Access 2010, if it matters.
Please help. Thanks!
I would strongly suggest creating code within your new FE file which creates the new tables, relationships, indexes and new fields on existing tables, and runs any update queries required (although it sounds like in your current situation there weren't any).
Updating an Access Backend MDBs structure using VBA code
The Compare'Em utility has made this process much easier.
I would also suggest keeping a version number of the FE and BE in a table; this helps you figure out when to run the code, and lets you stop a new FE from being run against an old-format BE.
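For the data-only copy the question asks about, note that Jet/ACE SQL can also read straight from the old file with an IN clause. The sketch below runs those append statements through pyodbc rather than VBA; the file paths and table list are placeholders, and the same INSERT statements could instead be pasted into ordinary Access append queries.

```python
import pyodbc  # pip install pyodbc

NEW_BE = r"C:\data\backend_new.accdb"   # redesigned backend (hypothetical path)
OLD_BE = r"C:\data\backend_old.accdb"   # clients' current backend with the data

conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=" + NEW_BE)
cur = conn.cursor()

# Works as-is only for tables whose column lists still match exactly;
# if the redesign added fields, list the shared columns explicitly instead of *.
for table in ["Customers", "Orders"]:
    cur.execute(f"INSERT INTO [{table}] SELECT * FROM [{table}] IN '{OLD_BE}'")

conn.commit()
conn.close()
```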
I have inherited a legacy Access app that has a TON of code running before the main form comes up. This code imports data, deletes data, and changes data a million ways.
Is there a way (after the startup stuff is finished) to list the tables and see when each of them last had data affected?
Thanks!
Sorry, but I'm afraid the answer to your question is simply: No.
Here's an idea though:
Make a backup of the database file.
Open the app so it runs the code you are concerned about.
Compare the DB to the backup using a tool like Red Gate SQL Data Compare.
BTW: I'm not sure if the Red Gate tool works against Access databases, but FMS Inc. has one that claims to.
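If a commercial compare tool turns out not to fit, a rough do-it-yourself version of step 3 is to snapshot row counts and a crude per-table hash from the backup and the live file and diff them. The sketch below uses pyodbc with the Access ODBC driver; the file paths are placeholders, and it reads whole tables, so it only suits modest data sizes.

```python
import hashlib
import pyodbc  # pip install pyodbc

DRIVER = "{Microsoft Access Driver (*.mdb, *.accdb)}"

def snapshot(path):
    """Return {table_name: (row_count, content_hash)} for every user table."""
    conn = pyodbc.connect(f"DRIVER={DRIVER};DBQ={path}")
    cur = conn.cursor()
    names = [t.table_name for t in cur.tables(tableType="TABLE")]
    info = {}
    for name in names:
        rows = cur.execute(f"SELECT * FROM [{name}]").fetchall()
        digest = hashlib.md5(
            "\n".join(sorted(str(r) for r in rows)).encode()).hexdigest()
        info[name] = (len(rows), digest)
    conn.close()
    return info

before = snapshot(r"C:\data\backup.accdb")   # copy taken before the startup code ran
after = snapshot(r"C:\data\live.accdb")      # database after the startup code

for name in sorted(set(before) | set(after)):
    if before.get(name) != after.get(name):
        print(f"{name}: {before.get(name)} -> {after.get(name)}")
```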
I'm considering a MySQL to PostgreSQL migration for my web application, but I'm having a really hard time converting my existing MySQL database to PostgreSQL.
I tried :
mysqldump with --compatible=postgresql
migration wizard from EnterpriseDB
Postgresql Data Wizard from EMS
DBConvert from DMSoft
and NONE of the above programs do a good job converting my database!
I saw some Perl and Python scripts for converting mysql to postgresql, but I can't figure out how to use them....(I installed ActivePerl and don't understand what I'm supposed to do next to run that script!)
I use Auto Increment fields (as a primary key) all the time, and these are just ignored... I understand that Postgresql does auto-increments in another way (with sequences), but it can't be THAT hard for MIGRATION software to implement that, or is it?
Did anybody have better luck converting a MySQL database with auto-increments as primary keys?
I know this is probably not the answer you are looking for, but: I don't believe in "automated" migration tools.
Take your existing SQL scripts that create your database schema, do a search and replace for the necessary data types (AUTO_INCREMENT maps to serial, which does all the sequence handling automagically for you), remove all the "engine=" stuff, and then run the new script against Postgres.
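A quick-and-dirty sketch of that search-and-replace step, run over a schema-only dump (mysqldump --no-data). It only covers the substitutions mentioned above; a real schema will usually need more massaging by hand.

```python
import re

with open("schema_mysql.sql", encoding="utf-8") as f:
    ddl = f.read()

# INT ... AUTO_INCREMENT columns -> serial (Postgres creates the sequence itself).
ddl = re.sub(r"\bint(\(\d+\))?\s+(unsigned\s+)?not\s+null\s+auto_increment",
             "serial", ddl, flags=re.IGNORECASE)
# Drop MySQL-only table options such as ENGINE=InnoDB ... ;
ddl = re.sub(r"\)\s*ENGINE=[^;]*;", ");", ddl, flags=re.IGNORECASE)
# Strip backtick quoting, which Postgres does not understand.
ddl = ddl.replace("`", "")

with open("schema_postgres.sql", "w", encoding="utf-8") as f:
    f.write(ddl)
```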
Dump the old database into flat files and import them into the target.
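And a sketch of the dump-and-import step, streaming each table out of MySQL as CSV and loading it into Postgres with COPY. Table names and credentials are placeholders, and it assumes the converted schema already exists on the Postgres side.

```python
import csv
import io
import mysql.connector   # pip install mysql-connector-python
import psycopg2          # pip install psycopg2-binary

TABLES = ["users", "orders"]

my = mysql.connector.connect(host="localhost", user="root",
                             password="secret", database="appdb")
pg = psycopg2.connect("dbname=appdb user=postgres password=secret host=localhost")

for table in TABLES:
    my_cur = my.cursor()
    my_cur.execute(f"SELECT * FROM {table}")

    buf = io.StringIO()
    writer = csv.writer(buf)
    for row in my_cur:
        writer.writerow(row)   # NULLs and empty strings are conflated -- fine for a sketch
    buf.seek(0)

    with pg, pg.cursor() as pg_cur:
        pg_cur.copy_expert(f"COPY {table} FROM STDIN WITH (FORMAT csv)", buf)

my.close()
pg.close()
```

After the load you still have to bump each serial column's sequence past the imported maximum (for example with setval), since COPY does not touch the sequences.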
I have done this several times with sample databases that were intended for MySQL and it really doesn't take that long.
Probably just as long as trying all the different "automated" tools.
Why not use an ETL tool? You don't have to worry about dumps or anything like that.
I have done migrations to both PostgreSQL and MySQL and have had no problems with the auto-increment fields.
You just need to know the connection credentials and that's it. I personally use Pentaho (it's open source).
Download Pentaho ETL from http://kettle.pentaho.org/
Unzip and run Pentaho (using .bat file spoon.bat)
Create a new Job:
Create a DB connection for the source database (MySQL) using the menu: Tools → Wizard → Create Database Connection (F3). Create a DB connection for the destination database (PostgreSQL) using the same technique.
Run the Wizard: Tools → Wizard → Copy Tables (Ctrl-F10).
Select the source (left dialog panel) and the destination (right dialog panel). Click Finish.
The Job will be generated - Run the job.
If you need any help let me know.
Even when you are familiar with all the "PostgreSQL gotchas", doing every step by hand can take a lot of time, especially when your DB is big.
Try some other scripts/tools.
I know this is an old question, but I just ran into the same problem migrating from MySQL to Postgres. After trying several migration tools, the very best one I could find, which will migrate your database structure as cleanly as possible, was pgloader (https://github.com/dimitri/pgloader/). It takes care of changing AUTO_INCREMENT columns to Postgres sequences with no problem, and it's super fast.
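For completeness, the basic pgloader invocation is just a source and a target connection string; if you want to keep it inside a Python workflow, something like the following should do (the URIs are placeholders; check the pgloader docs for the exact syntax your version accepts):

```python
import subprocess  # pgloader must be installed and on PATH

subprocess.run(
    ["pgloader",
     "mysql://root:secret@localhost/appdb",            # source (placeholder URI)
     "postgresql://postgres:secret@localhost/appdb"],  # target (placeholder URI)
    check=True)
```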