How to convert a CSV into a database table - MySQL

Is there a way to import a CSV into a SQL table without a previously-constructed table? I know how to import a CSV into an existing table, but is there a way to create one from the CSV?

You can do this using phpMyAdmin. (With this method, the values in the CSV file's first row are used as the column names of the new table.)
1) Select the database.
2) Go to the Import tab and select the CSV file.
3) (The original answer showed screenshots of the import dialog at this step.)
4) After the above steps a new table will be created. If you want to change the table names instead of having table1, table2, etc., select the table and go to the Operations tab. :)
(phpMyAdmin 4.1.14)

I am no expert in MySQL, but I don't believe there is such an import process, and there may not be in other database servers like Oracle, SQL Server, or PostgreSQL either. In fact, it may not even be a desirable automation, as a table should be user-defined and created to fit the database's relational model, with appropriate data types, indexes, and keys.
Almost all SQL dialects require setting up the database table beforehand. If not, how would the system know beforehand whether you intended an integer or a long number, a double or a decimal number, a TINYTEXT or a LONGTEXT, which fields are to be indexed or serve as the primary key, and so on?
You might argue that MS Access allows a CSV import with an optional table name. However, its ribbon wizard walks the user through setting up the field types, primary key, and table name. And going the non-wizard automation route, the DoCmd.TransferText method requires a table name when using the acImportDelim argument.
So your best approach in MySQL may be LOAD DATA INFILE, which runs a bulk import of an external CSV into an existing table.
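A minimal LOAD DATA INFILE sketch, assuming a pre-created table (the table name and file path here are hypothetical; the file must be readable by the server, e.g. under the secure_file_priv directory):

```sql
-- The target table must already exist:
-- CREATE TABLE people (id INT, name VARCHAR(100), city VARCHAR(100));
LOAD DATA INFILE '/var/lib/mysql-files/people.csv'
INTO TABLE people
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the CSV header row, if the file has one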

Related

MySQL LOAD DATA INFILE discard file like Oracle SQL*Loader does

I have to load several CSVs into some tables with MySQL LOAD DATA INFILE, and I want to save discarded records that could not be loaded (because of failed foreign keys, duplicates, etc.) in a discard file, as Oracle SQL*Loader does.
Any suggestions?
Thanks!
You can import data easily using MySQL Workbench's CSV import wizard. It has the option to import all the data and create a new table. From there you can alter the table later on, add indexes as needed, or change data types on the go.
Another option is to use the LOAD DATA statement as usual: import the data into a newly created table without the foreign keys. You can target CSV columns to specific table columns as well; see https://dev.mysql.com/doc/refman/8.0/en/load-data.html
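As a sketch of the column targeting mentioned above (table and column names are hypothetical), CSV columns can be mapped in order, skipped via user variables, or computed with SET:

```sql
LOAD DATA INFILE '/var/lib/mysql-files/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(order_id, @skip_col, customer_id)  -- second CSV column is read into a variable and discarded
SET imported_at = NOW();            -- a table column can also be computed rather than read
```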

When to use CSV storage engine for MySQL?

From the docs, it states:
The CSV storage engine stores data in text files using comma-separated
values format.
What are the advantages of this? Here are some I can think of:
You can edit the CSV files using a simple text editor (however, you can also export data easily using SELECT ... INTO OUTFILE)
Can be easily imported into Spreadsheet programs
Lightweight and maybe better performance (wild guess)
What are some disadvantages?
No indexing
Cannot be partitioned
No transactions
Cannot have NULL values
Granted this (non-exhaustive) list of advantages and disadvantages, in what practical scenarios should I consider using the CSV storage engine over others?
I seldom use the CSV storage engine. One scenario in which I have found it useful, however, is bulk data imports.
Create a table with columns matching my input CSV file.
Outside of MySQL, just using a shell prompt, mv the CSV file into the MySQL data directory, overwriting the .CSV file that belongs to the table I just created.
ALTER TABLE mytable ENGINE=InnoDB
Voilà! One-step import of a huge CSV data file using DDL instead of INSERT or LOAD DATA.
Granted, it's less flexible than INSERT or LOAD DATA, because you can't do NULLs or custom overrides of individual columns, or any "replace" or "ignore" features for handling duplicate values. But if you have an input file that is exactly what you want to import, it could make the import very easy.
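The recipe above might look like this in full (schema, table, and file paths are hypothetical; the data-directory location depends on your installation):

```sql
-- 1) Create a CSV-engine table matching the input file
--    (the CSV engine does not allow NULLable columns)
CREATE TABLE mydb.mytable (
  id   INT NOT NULL,
  name VARCHAR(100) NOT NULL
) ENGINE=CSV;

-- 2) From a shell, replace the table's data file with the input file:
--    mv /tmp/input.csv /var/lib/mysql/mydb/mytable.CSV

-- 3) Convert the table, which rewrites the data into an InnoDB tablespace
ALTER TABLE mydb.mytable ENGINE=InnoDB;
```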
This is a tad bit hacky, but as of MySQL 8, assuming you know the data structure beforehand and have permissions in the CSV-based schema directory, you can create the table definition in MySQL and then overwrite the generated CSV table file in the data directory with a symlink to the data file:
mysql --execute="CREATE TABLE TEST.CSV_TEST ( test_col VARCHAR(255) ) ENGINE=CSV;"
ln -sf /path/to/data.file /var/lib/mysql/TEST/CSV_TEST.CSV
An advantage here is that this completely obviates the need to run import operations (via LOAD DATA INFILE, etc.), as it allows MySQL to read directly from the symlinked file as if it were the table file.
Drawbacks beyond those inherent to the CSV engine:
the table will contain the header row if there is one (you'd need to filter it out of read operations)
the table metadata in INFORMATION_SCHEMA will not update using this method; it will just show the CREATE_TIME at which the initial DDL was run
Note this method is obviously more geared toward read operations, though update/insert operations could be conducted on the command line using SELECT ... INTO OUTFILE and then copying onto/appending to the source file.
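For example, such an append might be sketched as follows (the source table and file names are hypothetical; INTO OUTFILE refuses to overwrite an existing file, so the append itself happens at the shell):

```sql
-- Export the new rows to a fresh file on the server...
SELECT some_col FROM TEST.SOURCE_TABLE
INTO OUTFILE '/tmp/new_rows.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- ...then append them to the symlinked data file from a shell:
--    cat /tmp/new_rows.csv >> /path/to/data.file
```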

MySQL select from table left join with CSV export

I have tables that are on different mysql instances. I want to export some data as csv from a mysql instance, and perform a left join on a table with the exported csv data. How can I achieve this?
Quite surprisingly, that is possible with MySQL; there are several steps you need to go through.
1) Create a template table using the CSV engine and the desired table layout. This is the table into which you will import your CSV file. For example: CREATE TABLE yourcsvtable (field1 INT NOT NULL, field2 INT NOT NULL) ENGINE=CSV. Please note that NULL values are not supported by the CSV engine.
2) Perform your SELECT to extract the CSV file, e.g. SELECT * FROM anothertable INTO OUTFILE 'temp.csv' FIELDS TERMINATED BY ',';
3) Copy temp.csv into your target MySQL data directory as yourcsvtable.CSV. The location and exact name of this file depend on your MySQL setup. You cannot perform the SELECT in step 2 directly into this file, as it is already open - you need to handle this in your script.
4) Use FLUSH TABLE yourcsvtable; to reload/import the CSV table.
Now you can execute your query against the CSV file as expected.
Depending on your data, you may need to ensure that the data is correctly enclosed by quotation marks or escaped - this needs to be taken into account in step 2.
The CSV file can be created by MySQL on another server or by some other application, as long as it is well-formed.
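Put together, the steps above can be sketched as follows (localtable and its columns are hypothetical; the data-directory path depends on your installation):

```sql
-- 1) CSV-engine template table (the CSV engine does not support NULL columns)
CREATE TABLE yourcsvtable (
  field1 INT NOT NULL,
  field2 INT NOT NULL
) ENGINE=CSV;

-- 2) Extract the data on the other instance
SELECT * FROM anothertable
INTO OUTFILE 'temp.csv'
FIELDS TERMINATED BY ',';

-- 3) From a shell, copy the export into the target data directory:
--    cp temp.csv /var/lib/mysql/yourdb/yourcsvtable.CSV

-- 4) Reload the table, then join against it
FLUSH TABLE yourcsvtable;
SELECT t.*, c.field2
FROM localtable t
LEFT JOIN yourcsvtable c ON c.field1 = t.id;
```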
If you export it as CSV, it's no longer SQL; it's just plain row data. I suggest you export as SQL instead and import that into the second database.

How to load column names, data from a text file into a MySQL table?

I have a dataset with a lot of columns that I want to import into a MySQL database, so I want to be able to create tables without specifying the column headers by hand. Rather, I want to supply a filename with the column labels in it to (presumably) the MySQL CREATE TABLE command. I'm using the standard MySQL Query Browser tools in Ubuntu, but I didn't see an option for this in the create-table dialog, nor could I figure out how to write a query to do this from the CREATE TABLE documentation page. But there must be a way...
A CREATE TABLE statement includes more than just column names:
Table name*
Column names*
Column data types*
Column constraints, like NOT NULL
Column options, like DEFAULT, character set
Table constraints, like PRIMARY KEY* and FOREIGN KEY
Indexes
Table options, like storage engine, default character set
* mandatory
You can't get all this just from a list of column names. You should write the CREATE TABLE statement yourself.
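For illustration, a minimal statement touching most of the parts listed above (the table, column names, and types are purely illustrative):

```sql
CREATE TABLE person (
  id         INT NOT NULL AUTO_INCREMENT,         -- column name + data type (mandatory)
  email      VARCHAR(255) NOT NULL,               -- column constraint: NOT NULL
  created_at DATETIME DEFAULT CURRENT_TIMESTAMP,  -- column option: DEFAULT
  PRIMARY KEY (id),                               -- table constraint
  UNIQUE KEY idx_email (email)                    -- index
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;          -- table options
```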
Re your comment: many software development frameworks support ways to declare tables without writing SQL DDL. E.g., Hibernate uses XML files; YAML is supported by Rails ActiveRecord, PHP Doctrine, and Perl's SQLFairy. There are probably other tools that use other formats such as JSON, but I don't know of one offhand.
But eventually, all these "simplified" interfaces are no less complex to learn than SQL, while failing to represent exactly what SQL does. See also The Law of Leaky Abstractions.
Check out SQLFairy, because that tool may already convert from files to SQL in a way that can help you. And FWIW, MySQL Query Browser (or under its current name, MySQL Workbench) can read SQL files, so you probably don't have to copy & paste manually.

Can I import tab-separated files into MySQL without creating database tables first?

As the title says: I've got a bunch of tab-separated text files containing data.
I know that if I use CREATE TABLE statements to set up all the tables manually, I can then import the files into the waiting tables using LOAD DATA or mysqlimport.
But is there any way in MySQL to create tables automatically based on the tab files? Seems like there ought to be. (I know that MySQL might have to guess the data type of each column, but you could specify that in the first row of the tab files.)
No, there isn't. You need to CREATE a TABLE first in any case.
Automatically creating tables and guessing field types is not part of the DBMS's job. That is a task best left to an external tool or application (which then generates the necessary CREATE statements).
If you're willing to type the data types into the first row, why not type a proper CREATE TABLE statement instead?
Then you can export the Excel data as a txt file and use
LOAD DATA INFILE 'path/file.txt' INTO TABLE your_table;
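By default, LOAD DATA already expects tab-separated fields and newline-terminated lines, which suits the files described here; a slightly fuller sketch (the path and table name are placeholders) that states the defaults explicitly and skips a header row:

```sql
LOAD DATA INFILE '/path/to/data.txt'
INTO TABLE your_table
FIELDS TERMINATED BY '\t'   -- the default for LOAD DATA
LINES TERMINATED BY '\n'    -- also the default
IGNORE 1 LINES;             -- skip the header row, if the file has one
```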