import very large XML file (60GB) into MySQL

I have an XML file of almost 60 GB that I would like to import into a MySQL database. I have root access to my server, but I don't know how to handle a file of this size.
Do you guys have any ideas?
Normally I use Navicat, but it gave up...
Thanks

This is a little out of my area of knowledge, but would this work?
LOAD XML LOCAL INFILE '/pathtofile/file.xml'
INTO TABLE my_tablename(name, date, etc);
I know this sort of thing works with files under 1 GB, but I've yet to try it with anything this large.
Hope this helps!
EDIT
If that doesn't work for you, take a look at the LOAD DATA documentation: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
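If the plain statement chokes on a file this size, one thing worth trying (a minimal sketch, not something I've run against 60 GB) is issuing the same LOAD XML from a small script, so you can turn on LOCAL INFILE on the client side and tell MySQL which tag wraps one record. The table name, column list, connection details, and the <row> tag below are placeholders - adjust them to your schema and XML layout, and the server may also need local_infile enabled.

# Sketch: run LOAD XML LOCAL INFILE from Python with mysql-connector-python.
# Table/column names, credentials, and the '<row>' tag are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost",
    user="root",
    password="secret",
    database="mydb",
    allow_local_infile=True,  # lets the client send a LOCAL INFILE
)
cur = conn.cursor()
cur.execute(
    "LOAD XML LOCAL INFILE '/pathtofile/file.xml' "
    "INTO TABLE my_tablename "
    "ROWS IDENTIFIED BY '<row>' "
    "(name, date)"
)
conn.commit()
conn.close()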

You could use a command-line XML splitter to break it into manageably sized files first. Google to find one.
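If you'd rather not hunt for a tool, here is a rough Python sketch of such a splitter - it streams the file with iterparse and writes a fixed number of records per chunk. It assumes the records are direct children of the root element; the wrapper tag <rows>, the record tag <row>, the chunk size, and the output file names are all assumptions to adjust to your actual XML.

# Rough sketch of an XML splitter: stream the big file and write
# RECORDS_PER_CHUNK records per output file. Tags and paths are placeholders.
import xml.etree.ElementTree as ET

SOURCE = "/pathtofile/file.xml"
RECORD_TAG = "row"            # assumed tag wrapping one record
RECORDS_PER_CHUNK = 100000

def write_chunk(n, records):
    with open("chunk_%05d.xml" % n, "w", encoding="utf-8") as out:
        out.write("<rows>\n")
        out.writelines(records)
        out.write("</rows>\n")

chunk_no = 0
records = []
context = ET.iterparse(SOURCE, events=("start", "end"))
_, root = next(context)       # grab the root so we can clear it as we go
for event, elem in context:
    if event == "end" and elem.tag == RECORD_TAG:
        records.append(ET.tostring(elem, encoding="unicode"))
        root.clear()          # keep memory flat while streaming
        if len(records) >= RECORDS_PER_CHUNK:
            write_chunk(chunk_no, records)
            chunk_no += 1
            records = []
if records:
    write_chunk(chunk_no, records)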

Related

Large CSV impossible to handle

I am trying to access a 2.2 GB csv file. Excel and R are useless for this. SAS could have worked, but it seems that the file is corrupted and SAS cannot handle that. I am trying to do something with Python, but no luck so far. Any advice would be welcome, thanks.
Just for accessing the file: vim and gvim have large file plugins, depending on your OS.
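Since Python is already on your list: the standard csv module will walk a file this size row by row without loading it all into memory, which is usually all you need to inspect or filter it. A minimal sketch (the file name, encoding, and the per-row work are placeholders):

# Sketch: stream a large CSV row by row with the standard csv module.
# Only the current row is in memory at any time.
import csv

path = "bigfile.csv"          # placeholder
rows = 0
with open(path, newline="", encoding="utf-8", errors="replace") as f:
    reader = csv.reader(f)
    header = next(reader)     # assumes the first line is a header
    for row in reader:
        rows += 1
        # ...inspect, filter, or write out the rows you care about here...
print(header)
print("data rows:", rows)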

Split huge MySQL insert into multiple files: suggestions

I have a huge MySQL dump I need to import. I managed to split the 3 GB file by table insert, and one of the table inserts is 600 MB; I want to split it into 100 MB files. So my question is: is there a script or an easy way to split a 600 MB INSERT statement into multiple 100 MB inserts without having to open the file (as this kills my PC)?
I tried SQLDumpSplitter, but it does not help.
Here is the reason I cannot just run the 600 MB file:
the MySQL import responds with 'killed'.
Please help
On Linux, the easiest way to split files is split -l N, which splits a file into pieces of N lines each.
On Windows, I've had pretty good luck with HxD - it works well with huge files.
You can easily open a 1 GB file in TextPad. Use this software to open the file and split your queries however you want.
Link for downloading the TextPad software: TextPad
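If split -l drops a cut in the middle of a statement, a few lines of script can split on size instead while only breaking at line ends, which keeps each statement whole as long as the dump has one INSERT per line (an extended INSERT written as one giant line would need splitting at value boundaries instead). A rough Python sketch; the file names and the 100 MB target are assumptions:

# Sketch: split a dump into ~100 MB pieces, cutting only at line boundaries
# so no statement is chopped in half (assumes one INSERT per line).
CHUNK_BYTES = 100 * 1024 * 1024
SOURCE = "big_table_insert.sql"   # placeholder

part = 0
written = 0
out = open("part_%03d.sql" % part, "wb")
with open(SOURCE, "rb") as src:
    for line in src:
        if written and written + len(line) > CHUNK_BYTES:
            out.close()
            part += 1
            written = 0
            out = open("part_%03d.sql" % part, "wb")
        out.write(line)
        written += len(line)
out.close()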

How to handle a big CSV file?

I am planning to add the list of all cities in the world to my application (BTW: I am using Ruby, Ruby on Rails and MySQL) so I thought to use the CSV file downloaded from the www.maxmind.com website.
However, I am worried and doubtful because the unpacked file is about 151.1 MB on disk (!) and I would have to put all those values into my database. How would you advise proceeding (also regarding MySQL indexes...)?
Using LOAD DATA INFILE is the only practical way to import it, but index and performance considerations will be dependent on what you import and how you're going to use it. Research, research, research... a good starting point is Large Files with LOAD DATA INFILE
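As a concrete starting point, something like the statement below - sketched here from Python just to show the clause layout, though the same LOAD DATA line works from any client. The table name, column list, and CSV layout are guesses and will need adjusting to the actual MaxMind file; adding indexes after the load rather than before generally makes the import itself faster.

# Sketch: bulk-load the cities CSV with LOAD DATA INFILE, then add an index.
# Table name, columns, credentials, and CSV layout are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="root", password="secret",
    database="myapp", allow_local_infile=True,
)
cur = conn.cursor()
cur.execute(
    "LOAD DATA LOCAL INFILE '/path/to/cities.csv' "
    "INTO TABLE cities "
    "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
    "LINES TERMINATED BY '\\n' "
    "IGNORE 1 LINES "
    "(country, city, region, latitude, longitude)"
)
cur.execute("CREATE INDEX idx_cities_city ON cities (city)")
conn.commit()
conn.close()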

Exporting a MySQL database through PHPMyAdmin

I'm hoping to export a database that I only have access to through phpMyAdmin so that I can make a copy of it on my localhost. I've never done this before and the database is fairly large at 200 tables. Does anyone have experience doing this? I'm just unsure if the web interface of phpMyAdmin is a reliable way to export that much data or if I'd be causing some performance issues by attempting to export the data.
Thanks for any advice. The phpMyAdmin version is 2.1, if that helps any.
In the table, select Export, tick the "Save as file" option, and keep the selection as "Compression: None".
You will be able to download huge data tables like this.
The issue you will come up against is the max_execution_time setting.
What I have found is that large databases take longer to dump than the time allowed there (it defaults to 30 seconds).
This will cause your export to fail.
Also make sure you are not trying to dump to the browser; I have found that option unreliable. Choose the save-to-a-file option, and download the dump via FTP.
But as Col. Shrapnel said, try it first!

How to open a problematic data dump and convert it to MySQL (or some other practical format)?

I'm trying to work with a pretty interesting data set. Unfortunately, I have problems opening it and converting it into any useful format. It's a collection of archived txt files. When I decompress them and try to open a txt file, I get 'it's a binary file, saving it may result in a corrupt file' and it's unreadable - there are just 'weird characters', nothing I could read manually. I tried to use grep, but it also complains that the file is binary. I tried to import it into a MySQL database, but when I tried to execute LOAD DATA LOCAL INFILE ‘/path/to/file.txt’ INTO TABLE tablename I got ERROR 1064 (42000). I don't know what to do to open it.
I'm going to take a wild guess here and say they're not really text files :-)
And until you figure what they are, trying to import them via different methods is not going to be the most productive course of action.
Can you post the (hex-encoded) headers of those files? If they have a magic sequence at the front, it should help with figuring out what they are.
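For example, a few lines of Python will print the first bytes of each file in hex, which you can compare against known magic numbers (gzip starts with 1f 8b, zip with 50 4b, bzip2 with 42 5a 68). The directory and extension below are placeholders:

# Sketch: print the first 16 bytes of each file in hex to spot a magic number.
# Adjust the glob pattern to wherever the decompressed files live.
import glob

for path in glob.glob("/path/to/dump/*.txt"):
    with open(path, "rb") as f:
        header = f.read(16)
    print(path, " ".join("%02x" % b for b in header))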