FileMaker Pro export and import to MySQL via PHP

Could anyone advise me or direct me to a site that explains the best way to go about this? I'm sure I could figure it out with a lot of time invested, but I'm just looking for a jump start. I don't want to use the migration tool either; I just want to put FMP XML files on the server and have it create new MySQL databases based on the FMPXML results provided.
Thanks.

Technically you can write an XSLT to transform the XML files into SQL. It's fairly straightforward for data (except data in container fields), and with some effort you can even transfer the schema from DDR reports (though I doubt it's worth it for a single project).
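If XSLT isn't your thing, the same transform is only a few lines in a scripting language. A rough Python sketch, assuming the standard FMPXMLRESULT layout and namespace; the table name is a placeholder, and container fields are ignored:

    # Rough sketch: turn a FileMaker FMPXMLRESULT export into INSERT statements.
    # Assumes the standard FMPXMLRESULT namespace; table name is a placeholder.
    import xml.etree.ElementTree as ET

    NS = {"fm": "http://www.filemaker.com/fmpxmlresult"}

    def fmpxml_to_sql(path, table):
        root = ET.parse(path).getroot()
        # Field names come from the METADATA block.
        fields = [f.get("NAME") for f in root.findall("fm:METADATA/fm:FIELD", NS)]
        cols = ", ".join("`%s`" % f for f in fields)
        for row in root.findall("fm:RESULTSET/fm:ROW", NS):
            values = []
            for col in row.findall("fm:COL", NS):
                data = col.find("fm:DATA", NS)
                text = (data.text or "") if data is not None else ""
                values.append("'%s'" % text.replace("'", "''"))
            yield "INSERT INTO `%s` (%s) VALUES (%s);" % (table, cols, ", ".join(values))

    for stmt in fmpxml_to_sql("export.xml", "my_table"):
        print(stmt)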

Which version of MySQL? The 6.0 preview introduced LOAD XML (it later shipped in MySQL 5.5), which will make things easy for you.
If you're on an older version, you are dealing with stored procedures, which can be a pain. If you need v5, it might make sense to install MySQL 6, get the data in there using LOAD XML, and then do a mysqldump, which you can import into v5.
Here is a good link:
http://dev.mysql.com/tech-resources/articles/xml-in-mysql5.1-6.0.html
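For reference, a minimal sketch of what that looks like from a script (file, table and credentials are placeholders; the server must allow LOCAL INFILE). Note that LOAD XML matches row children or attributes to column names, so FMPXMLRESULT's <COL><DATA> layout would need to be reshaped into <row><colname>value</colname></row> form first:

    # Minimal sketch: run LOAD XML from Python (all names are hypothetical).
    import mysql.connector  # pip install mysql-connector-python

    conn = mysql.connector.connect(
        host="localhost", user="root", password="secret",
        database="mydb", allow_local_infile=True,
    )
    cur = conn.cursor()
    cur.execute(
        "LOAD XML LOCAL INFILE 'rows.xml' "
        "INTO TABLE my_table ROWS IDENTIFIED BY '<row>'"
    )
    conn.commit()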

Related

Can Parse be used in place of my MySQL database? (after converting to NoSQL)

Can I dump my SQL database and upload each table to Parse so that it can serve as a multi-table (whatever the NoSQL terminology is) database for my project?
Parse can be used with PHP, JS, and many other languages of the web, so in theory, yes. It also has an import feature so you can import data. I'm not sure how well this feature works, but it is definitely worth a try. Here is the documentation.
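If the built-in import falls short, you can also script the upload yourself. A rough sketch against the classic parse.com REST API (keys, class name and rows below are placeholders):

    # Rough sketch: push rows dumped from SQL into a Parse class via the
    # classic parse.com REST API (keys, class name and rows are placeholders).
    import json
    import urllib.request

    APP_ID = "your-app-id"
    REST_KEY = "your-rest-api-key"
    rows = [{"name": "Alice", "score": 42}]  # e.g. rows read from a SQL dump

    for row in rows:
        req = urllib.request.Request(
            "https://api.parse.com/1/classes/MyTable",
            data=json.dumps(row).encode("utf-8"),
            headers={
                "X-Parse-Application-Id": APP_ID,
                "X-Parse-REST-API-Key": REST_KEY,
                "Content-Type": "application/json",
            },
        )
        print(urllib.request.urlopen(req).read())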

MySQL and Core Data in iOS

I want to retrieve data from a remote MySQL database and store it in my iOS app (creating a "local" database, so the information can still be accessed even when there is no connection). After doing some research, Apple's Core Data API seems to be the answer. However, it uses SQLite.
Can I use Core Data with MySQL? If the answer is no, is there any way to develop a "local" database other than Core Data? I tried looking for the answer, but no luck. This is the closest one that I can get, but I don't really understand the answer. I am new to iOS development, so any help is greatly appreciated.
Yes, you can use Core Data with MySQL if you like, but you need to write the persistent store functionality yourself, which is a fairly advanced undertaking, and it doesn't seem to have any benefits. I think it would be better to retrieve the data from the MySQL server and then store it locally in SQLite. MySQL requires a separate server, so obviously it cannot be run locally on iOS anyway.
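The pattern itself is language-agnostic: fetch from a web service that fronts MySQL, then cache in a local SQLite file. A sketch of the idea (the endpoint and schema are made up; on iOS the same flow would go through NSURLSession plus Core Data or SQLite):

    # Sketch of the fetch-remote-then-cache-locally pattern (endpoint and
    # schema are made up; on iOS you'd use NSURLSession + Core Data/SQLite).
    import json
    import sqlite3
    import urllib.request

    rows = json.load(urllib.request.urlopen("https://example.com/api/items"))

    db = sqlite3.connect("cache.db")
    db.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT)")
    db.executemany(
        "INSERT OR REPLACE INTO items (id, name) VALUES (:id, :name)", rows
    )
    db.commit()
    # Later, with no connectivity, the app reads from cache.db instead.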
You can't use Core Data with MySQL directly, because Core Data is a local store inside the app while MySQL is a web-server database, so the two can't simply be combined. Why don't you like Core Data? It is the most powerful and simple persistence option for mobile apps, and I think it suits your purpose. If your data is lightweight, you can also use:
Plist
http://hayageek.com/plist-tutorial/
http://www.theappcodeblog.com/2011/05/30/property-list-tutorial-using-plist-to-store-user-data/
NSCoder
http://www.raywenderlich.com/1914/nscoding-tutorial-for-ios-how-to-save-your-app-data
Core Data is the way to go. It is built on top of SQLite, but it adds a relational-database-to-object mapping, which makes it really convenient.
There's a graphical editor which will let you define your Core Data model the way you require it.

Data dump filetype for not-yet-existent SQL database

A friend wants to start scraping data for a data-heavy site he wants me to try to build. I'm a (relatively new) Rails developer and don't know much about the data side of all this. If he's contracting out the scraping, any idea what sort of format I can/should get the data in so it's easy to import into a PostgreSQL database once the site is up?
Hope this isn't too vague a question. I don't know where to start looking for this.
The CSV file format is compatible with almost any database system, and it is a good starting point. Even if you change your mind later about which database system to use, you won't have to worry much about changing the format.
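For instance, if the scraper hands over CSV, the later load into PostgreSQL is nearly a one-liner. A sketch (file, table and column names are made up):

    # Sketch: write scraped records to CSV, then bulk-load into PostgreSQL.
    # File/table/column names are made up for illustration.
    import csv

    records = [("Widget", 9.99), ("Gadget", 19.99)]  # whatever the scraper yields
    with open("products.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["name", "price"])  # header row
        writer.writerows(records)

    # Then, in psql:
    #   \copy products(name, price) FROM 'products.csv' CSV HEADER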
If you're thinking about data mining, NoSQL database systems (MongoDB, CouchDB, etc.) may be a better fit, in which case the file format could be JSON instead.

How to convert data stored in XML files into a relational database (MySQL)?

I have a few XML files containing data for a research project which I need to run some statistics on. The amount of data is close to 100GB.
The structure is not so complex (it could be mapped to perhaps 10 tables in a relational model), and given the nature of the problem, this data will never be updated. I only need it available somewhere it's easy to run queries on.
I've read about XML databases and the possibility of running XPath-style queries on them, but I've never used them and I'm not comfortable with the idea. Having the data in a relational database would be my preferred choice.
So, I'm looking for a way to convert the data stored in XML into a relational database (think of a big .sql file similar to the one generated by mysqldump, but anything else would do).
The ultimate goal is to be able to run SQL queries for crunching the data.
After some research I'm almost convinced I have to write it on my own.
But I feel this is a common problem, and therefore there should be a tool which already does that.
So, do you know of any tool that would transform XML data into a relational database?
PS1:
My idea would be something like this (it can work differently, but just to make sure you get my point; see the rough sketch after the list):
1. Analyse the data structure (based on the XML files themselves, or on an XSD)
2. Build the relational database (tables, keys) based on that structure
3. Generate SQL statements to create the database
4. Generate SQL statements to fill in the data
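In rough Python terms (everything below is made up, purely to illustrate the pipeline):

    # Rough illustration of the pipeline above (all names are made up).
    import xml.etree.ElementTree as ET

    tree = ET.parse("data.xml")
    records = tree.findall(".//record")                 # 1. analyse the structure

    columns = sorted({child.tag for rec in records for child in rec})

    # 2./3. build and create the schema from that structure
    print("CREATE TABLE records (%s);" % ", ".join("%s TEXT" % c for c in columns))

    # 4. generate statements to fill in the data
    for rec in records:
        vals = ["'%s'" % rec.findtext(c, "").replace("'", "''") for c in columns]
        print("INSERT INTO records (%s) VALUES (%s);" %
              (", ".join(columns), ", ".join(vals)))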
PS2:
I've seen some posts here on SO, but I still couldn't find a solution.
Microsoft's "Xml Bulk Load" tool seems to do something in that direction, but I don't have a MS SQL Server.
Databases are not the only way to search data. I can highly recommend Apache Solr. The strategy for implementing search on XML files: keep your raw data as XML and search it using a Solr index.
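For example, after flattening each XML record into a document, indexing into a local Solr core can look like this (the core name, URL and field names are placeholders; assumes a recent Solr with the JSON update handler):

    # Sketch: index flattened XML records into a local Solr core
    # (core name, URL and fields are placeholders).
    import json
    import urllib.request

    docs = [{"id": "1", "title_t": "First record"}]  # flattened from the XML

    req = urllib.request.Request(
        "http://localhost:8983/solr/xmldata/update?commit=true",
        data=json.dumps(docs).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)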
Importing XML files of the right format into a MySql database is easy:
https://dev.mysql.com/doc/refman/5.6/en/load-xml.html
This means you typically have to transform your XML data into that kind of format. How you do this depends on the complexity of the transformation, which programming languages you know, and whether you want to use XSLT (which is most probably a good idea).
From your former answers it seems you know Python, so http://xmlsoft.org/XSLT/python.html may be the right thing for you to start with.
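If you'd rather not wire up the raw libxslt bindings, lxml (which wraps libxml2/libxslt) makes applying a stylesheet a few lines. A sketch, with placeholder file names, for reshaping the source XML into LOAD XML's expected row format:

    # Sketch: apply an XSLT that reshapes the source XML into LOAD XML's
    # expected <row> format (stylesheet and file names are placeholders).
    from lxml import etree  # pip install lxml

    transform = etree.XSLT(etree.parse("to_rows.xsl"))
    result = transform(etree.parse("source.xml"))
    result.write_output("rows.xml")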
Take a look at StAX instead of XSD for analyzing/extracting the data. It's stream-based and can deal with huge XML files.
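StAX is a Java API; the equivalent streaming idea in Python is ElementTree's iterparse, which keeps memory flat even on a 100 GB file as long as you clear elements as you go. A sketch with a made-up tag name:

    # Sketch of the streaming approach in Python: iterparse reads the file
    # incrementally, and clearing each element keeps memory usage flat.
    # (The 'record' tag name is made up.)
    import xml.etree.ElementTree as ET

    for event, elem in ET.iterparse("huge.xml", events=("end",)):
        if elem.tag == "record":
            print(elem.findtext("name"))  # hand the record to your loader here
            elem.clear()                  # free memory before moving on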
If you feel comfortable with Perl, I've had pretty good luck with the XML::Twig module for processing really big XML files.
Basically, all you need is to set up a few twig handlers and import your data into MySQL using DBI/DBD::mysql.
There is a pretty good example on xmltwig.org.
If you are comfortable with commercial products, you might want to have a look at Data Wizard for MySQL by the SQL Maestro Group.
This application is targeted especially at exporting and, of course, importing data from/to MySQL databases, and it includes XML import. You can download a 30-day trial to check whether this is what you are looking for.
I have to admit that I have not used their MySQL product line yet, but I had a good user experience with their Firebird Maestro and SQLite Maestro products.

Easy data conversion tool

I often have data in Excel or text files that I need to get into SQL Server. I can use ODBC to query the Excel file and I can parse the text file. What I want, though, is a tool that will just grab the data and put it into tables with little or no effort. Does anyone know of such a tool?
Have you tried the SQL Server Import/Export Wizard?
In SQL Server Management Studio, right-click your database name and select Tasks, then Import Data. For Data Source, select Microsoft Excel, browse to the .XLS...
If you are using SQL Server, look at Integration Services (SSIS).
You can also take a look at parse-o-matic
Use DTS or SSIS depending on which version of SQL Server you have. There is an import wizard which can get you started, but data imports are rarely simple and usually involve some sort of data cleanup so that your incoming data is acceptable to the table where you intend to store it. Excel data, in my experience, is usually particularly bad in this respect because it often isn't stored properly in Excel to begin with.
I haven't seen commercial tools that do this. I create these kinds of tools at work all the time, and the data validation is not trivial. It just makes sure that you don't have bad data making it into your database.
I found that for simple data conversion needs something like FileHelpers is pretty good. It still needs programming though. This framework is fairly easy to use, and somebody with a little bit of experience could bang something out for you.
On further thought, you can use the SQL Server bcp utility to upload the contents of a text file. This is a command-line utility and has a lot of switches. I would suggest you experiment on a test table before you use this in a production table.
It's been a while since I used it, so I can't remember if you can directly use an Excel spreadsheet. Text files are always the easiest to deal with in any case.
Seems like it'd be pretty easy to write a script that reads the text file and converts it to "INSERT INTO table ... VALUES ..." SQL statements. I suspect this has already been done, but a simple implementation would be less than 100 lines of code in your favorite scripting language.
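Something like this, for a tab-delimited file with a header row (the table and file names are placeholders, and the escaping here is the bare minimum):

    # Bare-bones sketch: turn a tab-delimited text file (header row first)
    # into INSERT statements. Table and file names are placeholders.
    import csv

    with open("data.txt", newline="") as f:
        reader = csv.reader(f, delimiter="\t")
        header = next(reader)
        for row in reader:
            vals = ", ".join("'%s'" % v.replace("'", "''") for v in row)
            print("INSERT INTO MyTable (%s) VALUES (%s);" %
                  (", ".join(header), vals))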
Hey, Google says SQL Server comes with such a tool: BULK INSERT.
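Its basic shape, run from a script via pyodbc here (the connection string, table and file path are placeholders; note the path is resolved on the SQL Server machine, not the client):

    # Sketch: kick off BULK INSERT from Python via pyodbc (connection string,
    # table and file path are placeholders; the path is resolved server-side).
    import pyodbc  # pip install pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
        "DATABASE=mydb;Trusted_Connection=yes", autocommit=True
    )
    conn.execute(
        "BULK INSERT dbo.MyTable FROM 'C:\\data\\data.txt' "
        "WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n', FIRSTROW = 2)"
    )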