MySQL LOAD DATA INFILE discard file like Oracle SQL*Loader does - mysql

I have to load several CSVs into some tables with MySQL LOAD DATA INFILE, and I want to save discarded records that could not be loaded (because of failed foreign keys, duplicates, etc.) in a discard file, as Oracle SQL*Loader does.
Any suggestions?
Thanks!

You can import data easily in MySQL Workbench using the CSV import wizard. It has the option to import all the data and create a new table. From there you can alter the tables later on, add any indexes you need, or change data types on the go.
Another option is to use LOAD DATA commands as usual. Import the data into newly created tables without the foreign keys. You can map CSV columns to specific table columns as well. See https://dev.mysql.com/doc/refman/8.0/en/load-data.html
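MySQL has no direct equivalent of SQL*Loader's discard file, but you can emulate one with a staging table: load everything unchecked, copy over only the valid rows, then write the leftovers to a file. A rough sketch, assuming hypothetical `orders`, `customers`, and `staging_orders` tables:

```sql
-- 1. Load the raw CSV into a staging table with no constraints.
CREATE TABLE staging_orders LIKE orders;

LOAD DATA INFILE '/tmp/orders.csv'
INTO TABLE staging_orders
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

-- 2. Copy only the rows that satisfy the constraints:
--    the JOIN enforces the FK, INSERT IGNORE skips duplicates.
INSERT IGNORE INTO orders
SELECT s.*
FROM staging_orders s
JOIN customers c ON c.id = s.customer_id;

-- 3. Write the rows that did NOT make it into the target
--    table out to a discard file.
SELECT s.*
FROM staging_orders s
LEFT JOIN orders o ON o.id = s.id
WHERE o.id IS NULL
INTO OUTFILE '/tmp/orders.discard';
```

Unlike SQL*Loader's discard file this won't record *why* each row was rejected, but it does give you a reloadable file of the rejects.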

Related

Importing a MySQL Table from one server to another server

I need to import a table from one MySQL server to another. The requirement looks pretty simple, but I tried two ways and was unable to import the data. Please find the details and the methods I tried below.
We have an existing table on Server A with engine type MyISAM, no primary key, and duplicate records.
Now we need to export this table to a new MySQL server (Server B), but the new MySQL DB has some rules:
Engine should be InnoDB
Every table should contain a primary key.
Here are the ways I tried to import, and how they failed.
Method 1:-
Exported the data from Server A with the OUTFILE command using MySQL Workbench
and tried to import it with the INFILE command, but due to the "mandatory primary key"
validation not all rows were inserted. To avoid this I added an auto-increment column
to the new table and tried the import again, but it failed with a column-count mismatch error.
Method 2:-
Configured both servers in MySQL Workbench, exported the table on Server A
with MANAGEMENT -> Data Export, and tried to import it with MANAGEMENT -> Data Import/Restore,
but due to the engine mismatch between the two tables it again failed to import the data.
(Tried with both the dump project folder and the self-contained file.)
AFAIK, I'm now left with only one option (which I don't want to use because of the huge amount of data):
export the data as a CSV file and import it with the Table Data Import Wizard.
Please guide me: is there any other option to import the data?
What you can try is adding the new auto-increment column to your source table (and the primary key as well, to check that there's no problem). To complete the preparation you can also convert the table to the InnoDB engine, so it is ready to be copied over with the least friction.
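As a sketch, assuming a hypothetical `legacy_table` on Server A, that preparation could look like:

```sql
-- Add a surrogate primary key; AUTO_INCREMENT fills it for
-- existing rows, so duplicate records are no longer a problem.
ALTER TABLE legacy_table
  ADD COLUMN id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST;

-- Convert the engine so it matches Server B's rules.
ALTER TABLE legacy_table ENGINE = InnoDB;
```

After this the table already satisfies both of Server B's rules, so a plain Data Export / Data Import/Restore cycle should go through without the earlier errors.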

Importing data from Excel to MySQL database using SQLyog

Is there a way to automate importing data from Excel/CSV into a MySQL database using SQLyog so that it updates automatically every time?
Use Database -> Import -> Import External Data.
Automatically each time what happens, though? It may be easier to use the MySQL LOAD DATA command in a cronjob.
For an "update" by CSV check out
http://www.softwareprojects.com/resources/programming/t-how-to-use-mysql-fast-load-data-for-updates-1753.html
Step 1: Use LOAD DATA to import the entire input file into a temporary table and update primary key ids for any existing records we already have
Step 2: Use SELECT INTO OUTFILE to write all records we couldn't find an id for (new records) into a temporary text file
Step 3: Use LOAD DATA to add all new records (from step 2) into the target table
Step 4: Use a single UPDATE query, to update all records that match an existing primary key
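The four steps above could be sketched as follows, assuming a hypothetical `prices` table keyed by `sku`. Note that steps 2 and 3 (OUTFILE plus a second LOAD DATA) can also be collapsed into a single INSERT ... SELECT when everything lives on the same server:

```sql
-- Step 1: load the whole CSV into a temporary table, then
-- attach the existing primary key id to every matching row.
CREATE TEMPORARY TABLE tmp_prices (
  sku   VARCHAR(32),
  price DECIMAL(10,2),
  id    INT NULL          -- filled in for rows we already have
);

LOAD DATA INFILE '/tmp/prices.csv'
INTO TABLE tmp_prices
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
IGNORE 1 LINES
(sku, price);

UPDATE tmp_prices t
JOIN prices p ON p.sku = t.sku
SET t.id = p.id;

-- Steps 2+3 (collapsed): rows with no id are new records.
INSERT INTO prices (sku, price)
SELECT sku, price FROM tmp_prices WHERE id IS NULL;

-- Step 4: one UPDATE for all rows that matched an existing key.
UPDATE prices p
JOIN tmp_prices t ON t.id = p.id
SET p.price = t.price;
```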
SQLyog is a standalone application and can't automate the import process.
You can create a cronjob on your server for the LOAD DATA INFILE command.

Import HDFS data file into mysql

I am trying to import a large HDFS file into a MySQL DB. The data in the file is delimited by '^A'. How do I tell MySQL to separate each column on Ctrl-A? Also, is it possible for me to specify which fields I want to import?
See the documentation here:
http://dev.mysql.com/doc/refman/5.5/en/mysqlimport.html
You are looking for the --fields-terminated-by=string option. There is no option to import only certain fields, though you can use --columns=column_list to map columns in your data file to fields in the table.
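The same thing can be done with a plain LOAD DATA statement, since ^A (Ctrl-A) is byte 0x01 and MySQL accepts a hex literal as the field terminator. A sketch with a hypothetical `events` table; assigning a field to a user variable like `@dummy` is the standard way to discard it:

```sql
LOAD DATA INFILE '/tmp/hdfs_export.txt'
INTO TABLE events
FIELDS TERMINATED BY 0x01      -- ^A / Ctrl-A delimiter
LINES TERMINATED BY '\n'
(user_id, event_type, @dummy, created_at);  -- @dummy skips the 3rd field
```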

Import multiple (100+) Excel files into MySQL Table

I have 100+ XLSX files that I need to get into a MySQL Database. Each file is a bit different, so I've created a massive table that contains a field for each possible column header across all files. This way they can auto-map on import.
Using Navicat I can import the files one at a time, but I'm wondering if there is a way to import all of the files at one time?
I'm thinking 'no', but if you're going to do this a lot I think there are ways to automate it.
Export your XLSX files as CSV files instead, and write a script to transform them from Excel-style CSV to MySQL-style CSV.
Create tables for importing with the CSV engine. Put your files in place of the ones created by MySQL and flush tables. Now your data is ready to be read inside MySQL and copied over to tables using more powerful storage engines.
Another way is to write a VBA script that exports the data in a format recognized by LOAD DATA INFILE and then load the files using MySQL.
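A minimal sketch of the CSV-engine approach (table and column names are made up; note the CSV engine requires every column to be NOT NULL):

```sql
-- The CSV engine stores the table as a plain import_raw.CSV file
-- in the database's data directory.
CREATE TABLE import_raw (
  col_a VARCHAR(255) NOT NULL,
  col_b VARCHAR(255) NOT NULL
) ENGINE = CSV;

-- After overwriting import_raw.CSV on disk with your exported file:
FLUSH TABLES;

-- Copy the rows into a real InnoDB table.
INSERT INTO final_table (col_a, col_b)
SELECT col_a, col_b FROM import_raw;
```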

Can I import tab-separated files into MySQL without creating database tables first?

As the title says: I've got a bunch of tab-separated text files containing data.
I know that if I use 'CREATE TABLE' statements to set up all the tables manually, I can then import them into the waiting tables, using 'load data' or 'mysqlimport'.
But is there any way in MySQL to create tables automatically based on the tab files? Seems like there ought to be. (I know that MySQL might have to guess the data type of each column, but you could specify that in the first row of the tab files.)
No, there isn't. You need to CREATE a TABLE first in any case.
Automatically creating tables and guessing field types is not part of the DBMS's job. That is a task best left to an external tool or application (That then creates the necessary CREATE statements).
If you're willing to type the data types in the first row, why not type a proper CREATE TABLE statement?
Then you can export the Excel data as a txt file and use
LOAD DATA INFILE 'path/file.txt' INTO TABLE your_table;
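For completeness, a minimal sketch with made-up column names: LOAD DATA already defaults to tab-separated fields, and IGNORE 1 LINES skips the header/type row suggested above.

```sql
CREATE TABLE your_table (
  name  VARCHAR(100),
  qty   INT,
  price DECIMAL(10,2)
);

LOAD DATA INFILE 'path/file.txt'
INTO TABLE your_table
FIELDS TERMINATED BY '\t'   -- the default, shown for clarity
LINES TERMINATED BY '\n'
IGNORE 1 LINES;             -- skip the header/type row
```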