I have a .txt file in Notepad that's 31 MB, and I'm unsure how to import it into phpMyAdmin. I've looked at similar questions on Stack Exchange to no avail. Every time I try to upload it I get the error "You probably tried to upload too large file. Please refer to documentation for ways to workaround this limit." Thinking that the file was too large, I even cut it down to only 3 lines as a test, and I got an entirely different error! I'm sure part of my problem is that there is no option that says .txt; I can only choose from CSV, SQL, DocSQL, Open Document Spreadsheet and XML. So I'm assuming I must convert my .txt file to one of those formats to upload it, unless of course there's an entirely different way of importing that I am not aware of.
Some of the code in question in my .txt file reads as follows:
CREATE TABLE IF NOT EXISTS occupationalinfo (
`ID` VARCHAR(255),
`Title` VARCHAR(255),
`DESCRIPTION` VARCHAR(255)) TYPE=MyISAM;
INSERT IGNORE INTO occupationalinfo (`ID`,`Title`,`DESCRIPTION`) VALUES
('1','Architects, Except Landscape and Marine','CODE: 22302 TITLE: Architects, Except Landscape and Marine DEFINITION: Plan and design structures, such as private residences, office buildings, theaters, factories, and other structural property. TASKS KNOWLEDGE SKILLS ABILITIES WORK ACTIVITIES WORK CONTEXT INTERESTS WORK VALUES CROSSWALKS TASKS: 1. Prepares information regarding design, structure specifications, materials, color, equipment, estimated costs, and construction time. 2. Plans layout of project. 3. Integrates engineering element into unified design.
I am still in the early learning phases of programming, so any tips, along with explanations, would be greatly appreciated.
You can change the extension of the file from .txt to .sql. It's no mystery; both are plain text files. As long as it contains SQL like you have above, you will be fine.
If you have command-line access to the server, upload the file separately and run it from the command line using mysql.
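For example, a minimal sketch of that, assuming the dump has already been uploaded to the server and that your database is called mydb (substitute your own user name, database and path):

mysql -u youruser -p mydb < /path/to/yourfile.sql

This bypasses the web upload limit entirely, since the file never goes through PHP.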
You can also try temporarily increasing the upload file-size limit, although 31 MB is really big.
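If you go that route, the relevant settings live in php.ini; the values below are only illustrative, and the web server needs a restart afterwards:

upload_max_filesize = 40M
post_max_size = 40M
max_execution_time = 300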
If you have a modern installation of phpmyadmin with the PHP zip extension, you can change the file extension to .sql, zip it and then upload it. It should be radically smaller than 31 MB.
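For example, after renaming the file (dump.sql here is a placeholder name):

zip dump.zip dump.sql

On Windows you can also right-click the file and choose "Send to > Compressed (zipped) folder". SQL dumps are very repetitive text, so they usually compress well.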
The most definite solution however is to break it up: go through your file and separate out each table, and each collection of say 500 insert statements. Save them as separate files, upload and run them in order in phpmyadmin.
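If the dump happens to have one INSERT statement per line, GNU split can do the mechanical part of that for you (file name and prefix are placeholders):

split -l 500 dump.sql chunk_

That produces files chunk_aa, chunk_ab, and so on, each holding 500 lines; rename them to .sql before importing. Multi-line statements such as the CREATE TABLE at the top would need to be kept together by hand.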
I have got a .sql file as the result of backing up my whole website's data. All I want is to view the data as tables. The file is quite big: about 700 MB. I have the MySQL software with me. When I try to open the file, it first asks whether to open it or just run it, warning that it is quite a big file to handle. Selecting run makes the software hang immediately, and it eventually reports a problem and closes.
If I select open, after a long time it opens up showing many lines of code with INSERT statements and so on. If I choose to run the SQL from there, it gets stuck again, so it is impossible to view the tables that way either. Is there an alternative way to view the SQL file as a table, using any software or any online tool?
I suggest you import your dump into a local database; then you will be able to navigate it and run queries against it.
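A rough sketch of that, assuming MySQL is installed locally (the database name and file name are placeholders):

mysql -u root -p -e "CREATE DATABASE site_backup"
mysql -u root -p site_backup < backup.sql

After that, any MySQL client (the mysql command line, MySQL Workbench, phpMyAdmin) can browse the tables.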
I asked a question here a while back and, using the answers, made some headway in figuring out how my DOS-based legacy software works.
My problem: the software uses Btrieve to read and store data in .dbk files. I know this because the DDF files reference these .dbk files. I found a number of ways to open Btrieve data, but only if it is stored in .btr files.
Does anyone have any hints? I've spent a considerable amount of time digging through resources, but to no avail. All I need right now is to see the data stored in the .dbk files in a readable format.
If your DDFs reference .DBK files, you should be able to use ODBC to read the data, provided you are using a version of Btrieve / Pervasive that supports it.
Create the ODBC DSN pointing to your DDFs and Data Files.
Once created, use your favorite export tool to export the data to your favorite format.
I have a huge MySQL dump I need to import. I managed to split the 3 GB file by table insert, but one of the table inserts is 600 MB, and I want to split it into 100 MB files. So my question is: is there a script or an easy way to split a 600 MB INSERT statement into multiple 100 MB inserts without having to open the file (as doing that kills my PC)?
I tried SQLDumpSplitter, but it does not help.
Here is the reason I cannot just run the 600 MB file:
MySQL import response 'killed'
Please help.
On Linux, the easiest way to split files is split -l N, which splits a file into pieces of N lines each.
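For example (file name and prefix are placeholders):

split -l 100000 bigtable.sql part_

With GNU split you can also cap the pieces by size instead of line count while still keeping whole lines intact, which matches the 100 MB target more directly:

split -C 100M bigtable.sql part_

Either way, a piece is only valid SQL on its own if no single statement spans a split boundary.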
On Windows, I've had pretty good luck with HxD - it works well with huge files.
You can easily open a 1 GB file in TextPad. Use this software to open the file and split your queries however you want.
TextPad can be downloaded from the TextPad website.
I inherited the maintenance of a small web forum. Near as I can tell, it is powered by a MySQL database on the backend (the frontend is all PHP).
I need to extract some of the data (which also involves searching for the data I need to extract), but I don't want to touch the production database. I exported a database backup, which produced a several-hundred-megabyte .sql file.
What's the best way to mine these data? I can see several options:
Grep through the .sql script in text mode, trying to extract the relevant data
Load it up in sqlite3 (I tried doing this, but it barfed on some of the statements in the script and didn't produce any tables. I have no database experience whatsoever though, so I haven't ruled it out as a dead end just yet).
Install MySQL on my home box, create a database, and execute the .sql script to recreate the data. Then just attach some database explorer tool.
Find some (Linux) app which can understand the .sql file natively (seems unlikely after a bit of Googling).
Any pointers to which of these options (or one I haven't thought of yet) would be the most productive?
I would say any option might work, but for data mining you definitely want to load it into a new database so you can start querying the data and building reports on it. I would load it up on your home box; there's no need for it to be remote.
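Once the dump is loaded, the kind of ad-hoc query you can then run looks like this (forum_copy, posts, and the column names are made-up placeholders; substitute the forum's actual schema):

mysql -u root -p forum_copy -e "SELECT author, COUNT(*) AS post_count FROM posts GROUP BY author ORDER BY post_count DESC LIMIT 10"

That sort of question is painful to answer with grep, but trivial once the data is back in a database.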
I'm working on a membership site where users are able to upload a CSV file containing sales data. The file will then be read and parsed, and the data will be charted, which will allow me to create charts dynamically.
My question is: how should I handle this CSV upload? Should it be uploaded to a folder and stored for later, or should it be inserted directly into a MySQL table?
Depends on how much processing needs to be done, I'd say. If it's "short" data and processing is quick, then your upload-handling script should be able to take care of it.
If it's a large file and you'd rather not tie up the user's browser/session while the data's parsed, then do the upload-now-and-deal-with-it-later option.
It depends on how you think the users will use this site.
What do you estimate the size of the files for these users to be?
How often would they (if ever) upload a file twice? Can they download the charts?
If the files are small and intended more for one-off use, you could upload them and process them on the fly; if they require repeated access and analysis, then you will save your users time by importing the data into the database.
The LOAD DATA INFILE command in MySQL handles uploads like that really nicely. If you create the table you want to load the data into and then use that command, it works great and is super quick; I've loaded several thousand rows of data in under 5 seconds with it.
http://dev.mysql.com/doc/refman/5.5/en/load-data.html
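A minimal sketch of that approach; the table name, column list, and file path are placeholders for whatever your sales CSV actually contains, and LOCAL assumes the file sits on the client side:

LOAD DATA LOCAL INFILE '/path/to/sales.csv'
INTO TABLE sales
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(sale_date, product, quantity, amount);

The IGNORE 1 LINES clause skips a header row; drop it if your CSV has none.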