Difference between CSV import and CSV using LOAD DATA? - mysql

In phpMyAdmin there are two options to import a CSV file.
One is CSV. The other is CSV using LOAD DATA.
What's the difference between these two? Is there an advantage to using one over the other?

LOAD DATA INFILE is a MySQL statement that works completely independently of phpMyAdmin.
The CSV import probably involves uploading the file to the phpMyAdmin server, where it parses the file and builds a series of INSERT statements to be run against the server.
Personally, I wouldn't trust anything phpMyAdmin does ;-) - however, actual performance will probably depend on your table structure and the data.
I will note, however, that MySQL takes some very efficient shortcuts when inserting data from a LOAD DATA INFILE command.
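Roughly, the two paths amount to something like this (the table, columns and file path below are just placeholders):

    -- Plain "CSV" import: phpMyAdmin parses the file in PHP and sends ordinary INSERTs
    INSERT INTO my_table (col_a, col_b) VALUES ('a', 1), ('b', 2);

    -- "CSV using LOAD DATA": one statement, and the MySQL server parses the file itself
    LOAD DATA INFILE '/tmp/my_file.csv'
    INTO TABLE my_table
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;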

As stated above, the LOAD DATA option actually tells phpMyAdmin to use the MySQL command, letting MySQL parse and load the file rather than phpMyAdmin parsing it first.
As also stated above, giving MySQL access to load the file can be dangerous if you don't feel 100% secure about the source and accuracy of the file itself. It's like using a PHP form with no SQL injection protection to insert data.
However, in some cases phpMyAdmin does not format the data correctly, or has trouble parsing it, when the regular CSV option is used. This causes unexplained errors such as "invalid format on line N" or "incorrect field count on line N" (those might not be the exact error messages since I'm not logged into phpMyAdmin at the moment). In these cases the LOAD DATA option can be used to get past the error. The extra "Use LOCAL keyword" option adds LOCAL to the statement, which makes the client (the web server where phpMyAdmin put the uploaded file) read the file and send its contents to MySQL, rather than MySQL reading the file directly from its own filesystem.
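For example, the only textual difference is the LOCAL keyword (paths and the table name here are made up):

    -- Without LOCAL, the file must already be on the MySQL server's own filesystem
    LOAD DATA INFILE '/var/lib/mysql-files/data.csv' INTO TABLE my_table FIELDS TERMINATED BY ',';

    -- With LOCAL, the client (here the web server running phpMyAdmin) reads the file
    -- and streams its contents to MySQL
    LOAD DATA LOCAL INFILE '/tmp/uploaded_data.csv' INTO TABLE my_table FIELDS TERMINATED BY ',';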
Something to keep in mind is also the size of the file (the number of lines being imported). I have had to break a 1600-line file into smaller files, even when using the LOAD DATA option, in order to get it to go through. It gave no errors, but the "affected rows" count was incorrect when the file was too big.

The first option will have phpMyAdmin parse the CSV file itself and then generate and execute the SQL to insert the data. The second option will let MySQL take care of loading, processing, and inserting the data.
Both options (should) behave the same way, but the LOAD DATA INFILE option is generally much faster, and you don't have to worry about PHP's memory/execution time limits. The only problem is that it isn't supported in all configurations: there are security implications in giving MySQL access to the uploaded files, so it is often disabled (e.g. on shared hosting).
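If the option is greyed out or failing on your setup, it's worth checking the relevant server settings; these are standard MySQL variables (a quick sketch, run from any SQL client with sufficient privileges):

    -- Will the server accept LOAD DATA LOCAL at all?
    SHOW GLOBAL VARIABLES LIKE 'local_infile';

    -- Enable it (needs admin privileges)
    SET GLOBAL local_infile = 1;

    -- For non-LOCAL loads, MySQL may also restrict which directories it will read from
    SHOW GLOBAL VARIABLES LIKE 'secure_file_priv';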

To add to the other replies: the "CSV" one insists you have exactly the same number of columns in the text file and the table. "CSV using LOAD DATA" does not.
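That's because LOAD DATA lets you list exactly which columns the file maps to (everything below is illustrative):

    -- The file has three columns, the table has more: name the targets explicitly
    -- and fill the rest with defaults or expressions.
    LOAD DATA LOCAL INFILE '/tmp/partial.csv'
    INTO TABLE my_table
    FIELDS TERMINATED BY ','
    (col_a, col_b, col_c)
    SET imported_at = NOW();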

CSV and CSV using LOAD DATA. The first method is implemented internally by phpMyAdmin and is the recommended one for its simplicity. With the second method, phpMyAdmin receives the file to be loaded and passes it to MySQL. In theory, this method should be faster. However, it has more requirements due to MySQL itself.

Related

exporting mysql table with bulk data in phpmyadmin

I have a MySQL table with a large amount of data. I need to export this table to another database with all the data. But when I try to export the table as an SQL file from phpMyAdmin, it shows this error:
The webpage at https://i.p.adress/domains/databases/phpMyAdmin/export.php might be temporarily down or it may have moved permanently to a new web address.
I tried exporting as CSV also, but the same error happens.
Does it happen because my table contains a large amount of data? Is there any other way to export this table with all the data?
I have around 1346641 records.
Does export work with a smaller database? Does it process for some time before showing that error or is it displayed as soon as you confirm your export options? I don't recall having seen that error message relating to large exports, but I may be remembering incorrectly. Any hints in your webserver or PHP error logs? Which phpMyAdmin version are you using?
Regarding large exports:
Because phpMyAdmin runs as a PHP script on your server, and also has to send the exported file to you as a download, a number of limitations are forced on it. Web servers are usually configured to keep PHP programs from running for very long, and a long download (or a long time spent processing the export) can be affected. Additionally, memory and other resources are often limited in a similar manner. Therefore, it's usually better to use some other means of exporting large databases.
The command-line utility mysqldump, provided with the MySQL server package, is the definitive standard. If you have command line/shell access, it's best to use mysqldump to export the .sql file(s) and then copy those through any normal file-transfer protocol (FTP, SCP, SSH, etc).
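A typical run looks something like this (user, database, server and file names are placeholders; --single-transaction gives a consistent snapshot for InnoDB tables):

    mysqldump -u db_user -p --single-transaction my_database > my_database.sql
    # then copy it off the server, e.g.
    scp db_user@your-server:/path/to/my_database.sql .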
That being said, phpMyAdmin has several enhancements and tweaks that can make this possible.
Use $cfg['SaveDir'] to enable writing the exported file to disk on the server, which you can then copy through any normal file transfer protocol.
If you encounter timeouts or resource restrictions, you can edit the PHP configuration directives (the linked documentation refers to imports but the same restrictions apply to exports).
Experiment with the export compression setting; in particular, using an uncompressed export format (exporting to SQL directly rather than to a zipped archive) can work around some memory restrictions.

How to upload data from zip file to MySQL database using phpMyAdmin?

I have a client who got a zipped file that contains the whole database they had in the SaaS app they were using. Now, we have a similar app, but our column names are different (obviously) and in some cases we have fewer columns. So now I want to upload all this data to my database, but I am not sure how to do it.
I run phpMyAdmin on the servers.
Extract the file on your desktop.
Login to your phpMyAdmin account.
Click the Import tab.
Select the file to import, file format, etc. and click Go.
Browse through the structure of the imported database to the columns of interest. For each column, click the pencil icon to edit the column (i.e. rename it), or click the X icon to delete it.
To merge data sets, after importing the tables you would need to run your own query in the SQL tab.
Those are two different tasks in one question.
phpMyAdmin is able to import ZIP files directly – you don't need to extract them on your local machine. Also be aware of maximum upload sizes and maximum script execution times when importing huge database dumps.
Mapping an existing database to another structure involves a lot of manual work, like renaming tables and columns and copying data from one table to another. I would suggest you import the old/original database into some "working copy" database and keep your new database separate. That way you can use MySQL features (INSERT INTO new_db.xy … SELECT … FROM old_db.xy) to copy the data where it should go.
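Spelled out a little more, the copy step is just a query like this (database, table and column names here are invented):

    -- Copy from the imported "working copy" database into the new schema,
    -- renaming columns along the way.
    INSERT INTO new_db.customers (full_name, email)
    SELECT old_name, old_email
    FROM old_db.clients;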
Well, first you need to take a look at the data files and see how the columns/tables differ. After you sort that out you can go about figuring out how to insert the data. If the files are large and there are quite a few of them, I wouldn't use phpMyAdmin. I'd SSH into the box and use the command-line client, or set the DB up for remote access and use a local copy of the client.
If you're lucky you won't have to do any processing on the data, and you can just map values from the old columns to the new columns as part of your LOAD DATA INFILE statement. Whatever you do, you'll want to test all this on a dummy DB first before you go running it in a live environment.
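For instance, the column mapping can be done inside the LOAD DATA INFILE statement itself with user variables and a SET clause; a sketch with made-up names:

    LOAD DATA LOCAL INFILE '/tmp/old_export.csv'
    INTO TABLE customers
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    IGNORE 1 LINES
    (@old_name, @old_email, @column_we_skip)
    SET full_name = @old_name,
        email     = LOWER(@old_email);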

Import of .CSV then export to Access 2007 DB via SSIS causes huge bloat

I've got a 90MB .csv extract I need to push into an Access 2007 DB.
AS A TEST - If I do a straight import of the file with default settings into a new table I end up with a DB of 134MB.
WHAT I NEED TO DO - The .csv extract contains a couple of columns I need to process before pushing them to into the Access DB. To accomplish this I am using SSIS (from SQL Server 2008 install) and using a couple of derived columns to contain the processed columns which then all get pushed into an existing Access table (which has no rows at the start of the process) via an ADO.NET Connection using the following connection string "Data Source=C:\Import\InTheGarden.accdb;Provider=Microsoft.ACE.OLEDB.12.0;". (Connection String from Connection in Connection Manager in SSIS)
When I process the data using SSIS I end up with a file of 1.16GB which when compacted comes down to a size of approximately 180MB!!!
Two things: -
Firstly, I don't understand what is causing the bloat and how I might get around it. I've read I don't know how many articles today on "Access 2007 bloat" and can't seem to find anything that explains what exactly is going on :(
Secondly, whilst I can import the .csv file above, I have another text file of 154MB which needs importing, and given that the smaller file caused the DB to bloat to 1.16GB on import, I'm not holding out much hope of the bigger file staying under the 2GB limit!
I would normally use SQL Server as a back end but there are reasons beyond my control as to why this can't happen:(
Happy to provide further information, Access is not something I use loads so there may be key information which I've missed!! :(
Thx!
Why not use a staging table in Access to do the import, process the data, and then insert into the real table? You'd use a separate temp database for that so as to avoid the bloat of importing twice. I've done dozens of imports into Access and this has always been the way I do it when I need to process the data before the final append.
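The final append from the staging copy is then just a query along these lines (Access SQL; the names and conversion functions are only an illustration, and StagingTable is assumed to be linked in from the separate temp database):

    INSERT INTO RealTable (CustomerName, Amount)
    SELECT Trim([RawName]), CDbl([RawAmount])
    FROM StagingTable;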
Try breaking the CSV into several smaller files and running Access' Compact Database command after importing each one.

How to load Excel or CSV file into Firebird?

I'm using a Firebird database and I need to load an Excel file into a database table. I need a tool that does this well. I tried some I found on Google, but all of them have some bugs.
Since the Excel data is not created by me, it would be good if the tool could scan the file, discover what kind of data is inside, and suggest a table to be created in the database.
Also, it would be nice if I could compare the file against the data that is already in the database table, and pick which data to load and which not.
Tools that load CSV files are also fine, I can "Save as" CSV from Excel before loading.
Well, if you can use CSV, then I guess XMLWizard is the right tool for you. It can load a CSV file and compare it with the database data, and you can select the changes you wish to make to the table.
Don't let the name fool you: it does work with XML, but it also works very well with CSV files. It can also estimate the column datatypes and offer a CREATE TABLE statement for your file.
Have you tried FSQL?
It's freeware, very similar to Firebird's standard ISQL, but with some extra features, like importing data from CSV files.
I've used it with DBF files and it worked fine.
There is also the EMS Data Import tool for Firebird and InterBase:
http://www.sqlmanager.net/en/products/ibfb/dataimport
It's not free, though, but it accepts a wide variety of formats, including CSV and Excel.
EDIT
Another similar payware tool is Firebird Data Wizard http://www.sqlmaestro.com/products/firebird/datawizard/
There are some online tools which can help you generate DDL/DML scripts from a CSV header/sample dump file; check out http://www.convertcsv.com/csv-to-sql.htm
You can then use sql-workbench's Data Pumper or WbImport tool from the command line.
Orbada has a GUI that also supports importing CSV files.
DBeaver Free Edition also supports importing CSV out of the box.
BULK INSERT
Another way is to build a formula in Excel, in new cells, from the data you want to export. The formula formats each value as a string, padded to the length of the corresponding field in Firebird. You can then copy all of these cells from Excel and paste them into a text editor, which makes it possible to use the bulk insert strategy in Firebird.
See more details in http://www.firebirdfaq.org/faq209/
The problem is if you have BLOB or NULL data to import, so check whether you have that kind of values and whether this approach works for you.
If you have formatted data in a text file, a bulk insert will be a quick way.
Hint: you can also disable the triggers and indexes associated with your table to accelerate the bulk insert, and re-enable them afterwards.
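In Firebird that hint translates to something like this (trigger and index names are made up):

    ALTER TRIGGER my_table_bi INACTIVE;
    ALTER INDEX idx_my_table_name INACTIVE;

    -- ... run the big batch of INSERTs here ...

    ALTER TRIGGER my_table_bi ACTIVE;
    ALTER INDEX idx_my_table_name ACTIVE;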
Roberto Novakosky
I load the Excel file into a Lazarus spreadsheet component (fpspreadsheet) and then export it to the Firebird DB. Everything is fine; the only problem is that fpspreadsheet will treat a string field containing only numbers as a number field. I can check the titles in the first row to see whether the Excel file is valid or not.
As far as I can see, all replies so far focus on tools that essentially read the Excel (or CSV) file and use SQL inserts to insert the records into the Firebird database. While this works, I have always found this approach painstakingly slow.
That's why I created a tool that reads an Excel file and writes one file that has a (text) format suitable for Firebird external table (including support for UTF8 char columns) and one DDL file to create the external table in Firebird.
I then use regular SQL to select from the external table, cast as needed, and insert into whatever normal Firebird table I want. The performance with this approach is orders of magnitude faster than SQL inserts from a client app in my experience.
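For reference, the external-table route boils down to something like this (file path, names and column widths are all invented, and the server must allow external files via the ExternalFileAccess setting in firebird.conf):

    -- Firebird external tables read fixed-width records; the last column
    -- just absorbs the line terminator (use CHAR(2) for CRLF files).
    CREATE TABLE ext_import EXTERNAL FILE '/data/import.dat' (
        cust_name CHAR(40),
        amount    CHAR(12),
        eol       CHAR(1)
    );

    INSERT INTO customers (cust_name, amount)
    SELECT TRIM(cust_name), CAST(amount AS DECIMAL(12,2))
    FROM ext_import;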
I would be willing to publish the tool. It's written in C#. Let me know if there's any interest.

How to open problematic data dump and convert it to mysql (or some other practical format)?

I'm trying to work with a pretty interesting data set. Unfortunately, I have problems opening it and converting it into any useful format. It's a collection of archived txt files. When I decompress them and try to open a txt file I get "it's a binary file, saving it may result in a corrupt file" and it's unreadable - there are just "weird characters", nothing I could read manually. I tried to use grep but it also complains that the file is binary. I tried to import it into a MySQL database, but when I tried to execute LOAD DATA LOCAL INFILE ‘/path/to/file.txt’ INTO TABLE tablename I got ERROR 1064 (42000). I don't know what to do to open it.
I'm going to take a wild guess here and say they're not really text files :-)
And until you figure what they are, trying to import them via different methods is not going to be the most productive course of action.
Can you post (a hex dump of) the headers of those files? If they have a magic sequence at the front, it should help with figuring out what they are.
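For a first look, something like this from a shell should tell you a lot (assuming the standard file and xxd utilities are installed; the filename is a placeholder):

    file extracted_file.txt          # guesses the real format from the magic bytes
    xxd extracted_file.txt | head    # dumps the first bytes in hex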