Split huge MySQL insert into multiple files: suggestions - mysql

I have a huge MySQL dump that I need to import. I managed to split the 3 GB file by table insert, but one of the table inserts is 600 MB, and I want to split it into 100 MB files. So my question is: is there a script or an easy way to split a 600 MB INSERT statement into multiple 100 MB inserts without having to open the file (opening it kills my PC)?
I tried SQLDumpSplitter, but it did not help.
Here is the reason I cannot just run the 600 MB file as-is: the MySQL import responds with 'killed'.
Please help.

On Linux, the easiest way to split files is split -l N, which cuts a file into pieces of N lines each, e.g. split -l 5000 table_dump.sql chunk_. As long as no statement spans a line break (mysqldump writes each INSERT statement on its own line), every piece remains valid SQL.
On Windows, I've had pretty good luck with HxD - it works well with huge files.
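If you want pieces that hit the 100 MB target more precisely than a fixed line count, a small script can cut by size while still breaking only at line boundaries. Here is a minimal Python sketch (the file names and the threshold are illustrative, and it assumes each statement ends at a newline, as mysqldump output does); it streams the file, so the full 600 MB is never held in memory:

# split_dump.py - cut a big .sql file into ~100 MB pieces, breaking
# only at line boundaries so no statement is split in half.
CHUNK_BYTES = 100 * 1024 * 1024

part, written = 1, 0
out = open('table_dump.part001.sql', 'wb')
for line in open('table_dump.sql', 'rb'):   # hypothetical input file
    out.write(line)
    written += len(line)
    if written >= CHUNK_BYTES:              # roll over to the next piece
        out.close()
        part, written = part + 1, 0
        out = open('table_dump.part%03d.sql' % part, 'wb')
out.close()

Each piece can then be imported in order with the mysql command-line client.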

You can easily open a 1 GB file in TextPad. Use that software to open the file and split your queries however you want.
Download link for the TextPad software: TextPad

Related

How to view a .sql file that is too large to open in an editor?

I have a .sql file that is the result of backing up my whole website's data. All I want is to view the data as tables. The file is quite big and weighs 700 MB. I have the MySQL software with me. When I try to open the file, it first asks whether to open it or just run it, warning that it is quite a big file to handle. Selecting run makes the software hang immediately; eventually it reports a problem and closes.
If I select open, after a long time it opens, showing many lines of code with INSERT statements and so on. If I choose to run the SQL from there, it gets stuck again, so it is still impossible to view the tables. Is there an alternative way to view the SQL file as a table, using any software or any online tool?
I suggest you import the dump into a local database (for example, mysql -u root -p mydb < backup.sql from the command line); then you will be able to navigate it and run queries against it.

Import a very large XML file (60 GB) into MySQL

I have an XML file of almost 60 GB that I would like to import into a MySQL database. I have root access to my server, but I don't know how to handle a file of this size.
Do you have any ideas?
Normally I use Navicat, but it gave up...
Thanks
This is a little outside my area of knowledge, but would this work?
LOAD XML LOCAL INFILE '/pathtofile/file.xml'
INTO TABLE my_tablename(name, date, etc);
I know this sort of thing works with files under 1 GB, but I've yet to work with files this large.
Hope this helps!
EDIT
If that doesn't work for you, take a look at the LOAD DATA documentation: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
You could also use a command-line XML splitter to break it into files of manageable size first; a web search will turn up several. Alternatively, stream it, as in the sketch below.
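If LOAD XML cannot cope with a file this size in one go, another option is to stream the XML and insert rows in batches, so memory use stays flat. Here is a rough Python sketch using the standard library's iterparse together with the mysql-connector-python driver; the <row> tag, the column and table names, and the connection details are all assumptions you would adapt to your schema:

# stream_import.py - stream a huge XML file and batch-insert its rows.
import xml.etree.ElementTree as ET
import mysql.connector  # assumes mysql-connector-python is installed

conn = mysql.connector.connect(user='root', password='...', database='mydb')
cur = conn.cursor()
sql = 'INSERT INTO my_tablename (name, date) VALUES (%s, %s)'
batch = []

for event, elem in ET.iterparse('/pathtofile/file.xml', events=('end',)):
    if elem.tag == 'row':                       # hypothetical record tag
        batch.append((elem.findtext('name'), elem.findtext('date')))
        elem.clear()                            # discard the parsed element
        if len(batch) >= 1000:                  # insert in chunks of 1000
            cur.executemany(sql, batch)
            conn.commit()
            batch = []

if batch:                                       # flush the final partial batch
    cur.executemany(sql, batch)
    conn.commit()
conn.close()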

Load many .txt files into a MySQL DB

I want to store 7,000 records consisting of .txt files as BLOB data in my database, using MySQL Workbench.
But:
1. How do I do this automatically? Should I put all the files in one directory and then write a script that takes them one by one and inserts them into the appropriate rows?
2. Is BLOB suitable for this type of file storage? Later I want to connect the database to a GUI, so that clicking a record opens its .txt file in a new window.
Could you advise me on how to solve this?
You should write a script, yes. If it's hard for you to put them all in one folder, there are scripts and tools for that too.
You can use C#, PHP, or any other language to scan those files and then insert them into the database; a minimal sketch follows after the links below.
Bunch of tutorials:
http://www.csharp-examples.net/get-files-from-directory/
Inserting record in to MySQL Database using C#
http://www.codeproject.com/Articles/43438/Connect-C-to-MySQL
http://net.tutsplus.com/articles/news/scanning-folders-with-php/
http://www.homeandlearn.co.uk/php/php13p3.html
As for the column type: a plain BLOB holds only up to 64 KB. If any of the files are bigger than that, use MEDIUMBLOB (up to 16 MB) or LONGBLOB (up to 4 GB).
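Since the tutorials above cover C# and PHP, here is the same idea as a minimal Python sketch; the folder, the table and column names, and the connection details are assumptions. It walks a directory and inserts each .txt file as one row:

# load_txt_files.py - insert every .txt file in a folder as one BLOB row.
import os
import mysql.connector  # assumes mysql-connector-python is installed

conn = mysql.connector.connect(user='root', password='...', database='mydb')
cur = conn.cursor()

for name in sorted(os.listdir('txt_files')):    # hypothetical folder
    if name.endswith('.txt'):
        with open(os.path.join('txt_files', name), 'rb') as f:
            data = f.read()
        cur.execute('INSERT INTO documents (filename, content) VALUES (%s, %s)',
                    (name, data))               # content: a MEDIUMBLOB column

conn.commit()
conn.close()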

How to import a Notepad .txt file into phpMyAdmin (MySQL)

I have a 31 MB .txt file in Notepad, and I'm unsure how to import it into phpMyAdmin. I've looked at similar questions on Stack Exchange to no avail. Every time I try to upload it I get the error "You probably tried to upload too large file. Please refer to documentation for ways to workaround this limit." Thinking the file was too large, I even cut it down to only 3 lines as a test, and I got an entirely different error! I'm sure part of my problem is that there is no import option that says .txt; I can only choose from CSV, SQL, DocSQL, Open Document Spreadsheet, and XML. So I'm assuming I must convert my .txt file to one of those formats to upload it, unless of course there's an entirely different way of importing that I am not aware of.
Some of the code in my .txt file reads as:
CREATE TABLE IF NOT EXISTS occupationalinfo (
`ID` VARCHAR(255),
`Title` VARCHAR(255),
`DESCRIPTION` VARCHAR(255)) TYPE=MyISAM;
INSERT IGNORE INTO occupationalinfo (`ID`,`Title`,`DESCRIPTION`) VALUES
('1','Architects, Except Landscape and Marine','CODE: 22302 TITLE: Architects, Except Landscape and Marine DEFINITION: Plan and design structures, such as private residences, office buildings, theaters, factories, and other structural property. TASKS KNOWLEDGE SKILLS ABILITIES WORK ACTIVITIES WORK CONTEXT INTERESTS WORK VALUES CROSSWALKS TASKS: 1. Prepares information regarding design, structure specifications, materials, color, equipment, estimated costs, and construction time. 2. Plans layout of project. 3. Integrates engineering element into unified design.
I am still in the early stages of learning to program, so any tips followed by explanations would be greatly appreciated.
You can change the extension of the file from .txt to .sql. It's no mystery; they're both text files. As long as it contains SQL like you have above, you will be fine. (One caveat: TYPE=MyISAM is legacy syntax that was removed in MySQL 5.5; current servers expect ENGINE=MyISAM.)
If you have command-line access to the server, upload the file separately and run it from the command line using mysql (mysql -u user -p dbname < file.sql).
You can also try temporarily increasing the upload file-size limit (upload_max_filesize and post_max_size in php.ini), although 31 MB is really big for an upload.
If you have a modern installation of phpmyadmin with the PHP zip extension, you can change the file extension to .sql, zip it and then upload it. It should be radically smaller than 31 MB.
The most reliable solution, however, is to break it up: go through your file and separate out each table, and each collection of, say, 500 INSERT statements. Save them as separate files, then upload and run them in order in phpMyAdmin; a sketch of the splitting step follows.
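That splitting step can be scripted with the same streaming approach as the size-based splitter near the top of this page, but counting statements instead of bytes. A minimal Python sketch, assuming each statement ends with a ';' at the end of its own line, as in typical dump files (the batch size and file names are illustrative):

# split_by_statements.py - start a new output file every N statements.
N = 500
part, count = 1, 0
out = open('part001.sql', 'w')
for line in open('dump.txt'):        # hypothetical input file
    out.write(line)
    if line.rstrip().endswith(';'):  # a statement just ended
        count += 1
        if count >= N:
            out.close()
            part, count = part + 1, 0
            out = open('part%03d.sql' % part, 'w')
out.close()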

insert csv file into MySQL with user id

I'm working on a membership site where users can upload a CSV file containing sales data. The file will then be read and parsed, and the data will be charted, which will let me generate charts dynamically.
My question is how to handle this CSV upload: should the file be saved to a folder and stored for later, or should the data be inserted directly into a MySQL table?
It depends on how much processing needs to be done, I'd say. If it's "short" data and processing is quick, then your upload-handling script should be able to take care of it.
If it's a large file and you'd rather not tie up the user's browser/session while the data's parsed, then do the upload-now-and-deal-with-it-later option.
It depends on how you think the users will use this site.
What do you estimate the size of the files for these users to be?
How often would they (if ever) upload a file twice? Can they download the charts?
If the files are small and meant for one-off use, you could upload and process them on the fly; if they require repeated access and analysis, you will save your users time by importing the data into the database.
The LOAD DATA INFILE command in MySQL handles uploads like that really nicely. If you create the table you want to load into and then use that command, it works great and is super quick; I've loaded several thousand rows of data in under 5 seconds with it.
http://dev.mysql.com/doc/refman/5.5/en/load-data.html
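For example, here is a minimal Python sketch that runs LOAD DATA LOCAL INFILE on an uploaded file and tags each row with the uploading member's id; the table name, columns, file path, and user id are all assumptions based on the question, and both the server and the client must allow local infile:

# import_csv.py - bulk-load an uploaded CSV and record which user sent it.
import mysql.connector  # assumes mysql-connector-python is installed

conn = mysql.connector.connect(user='root', password='...', database='mydb',
                               allow_local_infile=True)
cur = conn.cursor()
cur.execute("""
    LOAD DATA LOCAL INFILE 'uploads/sales.csv'
    INTO TABLE sales
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (sale_date, amount)
    SET user_id = 42
""")                    # 42 is a hypothetical member id
conn.commit()
conn.close()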