Import multiple CSV files into MySQL

I have about 28 CSV files and I need to import them into a database as 28 tables.
I tried many things but just couldn't find a way.

You can have MySQL read each file directly and load the data using the following SQL syntax:
load data local infile 'uniq.csv' into table tblUniq
fields terminated by ','
enclosed by '"'
lines terminated by '\n';
Read more here: LOAD DATA INFILE
Import CSV file directly into MySQL
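Since there are 28 files, you can script that statement with a shell loop. A minimal Bash sketch, assuming each CSV is named after its target table, the tables already exist, and local_infile is enabled on both client and server (credentials and paths here are placeholders):
#!/bin/bash
# Load every CSV in csv/ into the table of the same name (hypothetical layout).
for f in csv/*.csv; do
    table=$(basename "$f" .csv)
    mysql --local-infile=1 -u username -ppassword dbname -e "
        LOAD DATA LOCAL INFILE '$f' INTO TABLE \`$table\`
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\n';"
done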

This is a good solution for Windows users. Just create a text file "import.bat" with the following code, then run it.
@ECHO OFF
FOR %%I In (db\*.sql) DO mysqlimport.exe --local -uroot -proot vipamoda %%I
PAUSE
More complex code, which first imports the SQL structure and then imports the TXT data:
@ECHO OFF
FOR %%I IN (db\*.sql) DO (
mysql.exe -uroot -proot vipamoda < %%~dpnI.sql
mysqlimport.exe --local -uroot -proot vipamoda %%~dpnI.txt
)
PAUSE
And the dump command that produces the files for this import code is:
mysqldump.exe --compact --add-drop-table --tab=db -uroot -proot vipamoda
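For non-Windows users, a rough Bash equivalent of that second batch loop, assuming the same db/ layout of paired .sql and .txt files:
#!/bin/bash
# Restore the structure from each .sql file, then load the matching .txt data file.
for f in db/*.sql; do
    base="${f%.sql}"
    mysql -uroot -proot vipamoda < "${base}.sql"
    mysqlimport --local -uroot -proot vipamoda "${base}.txt"
done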


How to export a MySQL database to CSV

What is the best way to export a MySQL database to a CSV file, without including indexes, table structures, etc.?
I just need to get all the data. I have a lot of tables, so I don't want to do it one by one.
I'm using 0xdbe and Workbench running on Linux.
mysqldump has a mode to dump tab-separated files, one per table.
mysqldump -u <username> -p<password> -T <output_directory> --no-create-info <database_name>
With a bit of tweaking this can be made to look like a CSV file:
mysqldump -u <username> -p<password> -T <output_directory> --fields-terminated-by ',' --fields-enclosed-by '"' --fields-escaped-by '\' --no-create-info <database_name>
Note that the file is written by the database, so whatever user your database is running as needs to have write access to the output directory!
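Also note that -T writes each table's data file as table_name.txt; if you want .csv extensions, a small Bash sketch (the output directory is a placeholder) can rename them afterwards:
for f in /path/to/output_directory/*.txt; do
    mv "$f" "${f%.txt}.csv"
done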
This worked well for me:
mysqldump DBNAME TABLENAME --fields-terminated-by ',' \
--fields-enclosed-by '"' --fields-escaped-by '\' \
--no-create-info --tab /var/lib/mysql-files/
I'm dumping to /var/lib/mysql-files/ to avoid this error:
mysqldump: Got error: 1290: The MySQL server is running with the --secure-file-priv option so it cannot execute this statement when executing 'SELECT INTO OUTFILE'
mysqldump might be useful in this case. Even though it's not CSV output, it has all the indexes and table structures along with the data.
mysqldump -u root -p[root_password] [database_name] > dumpfilename.sql

mysqlimport - Import CSV File in MS Windows XAMPP environment

I am trying to import a CSV file into a MySQL database from the command line. This will later be incorporated into a Windows batch file.
mysqlimport -u user -puserpw --columns=ID,CID,Alerted --fields-terminated-by=',' --local School Customer.csv
All the data loads into the first column of the Customer table.
I want the data from the CSV imported into the appropriate columns.
CSV Data format:
ID,CID,Alerted
1,CS,N
2,CS,N
3,CS,N
I would like to use mysqlimport, since that will be easier to add to a Windows batch file.
How can I do this?
I guess you have to add --lines-terminated-by='\n' to mysqlimport. I ran a quick test here and it worked:
mysqlimport --local --ignore-lines=1 --fields-terminated-by=',' --lines-terminated-by='\n' db_name table_name.csv
I had to enclose the option values in double quotes (") instead:
mysqlimport -u user -puserpw --columns=ID,CID,Alerted --ignore-lines=1 --fields-terminated-by="," --lines-terminated-by="\n" --local School Customer.csv
That fixed it.
Try using the complete set of clauses when importing the CSV file, for example:
fields terminated by ','
enclosed by '"'
lines terminated by '\n'

MySQL export issue?

I am using the following command to export the database, however I can't find the FILE.sql file after executing the command. Where is it stored?
mysqldump -u username -ppassword database_name > FILE.sql
Also, how can I check my home directory? I have checked Program Files (x86)\MySQL and the respective bin folder in it.
The other way to export data to a CSV file is by using the INTO OUTFILE syntax provided by MySQL, as shown below:
[Usual MySQL Query]
INTO OUTFILE '/var/data_exports/huge_data.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Read about practical usage and more here: ResolveBug.com
It should be either in the root directory or in the current directory where you ran the dump command.
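The > FILE.sql redirect is handled by your shell, not by mysqldump, so the file lands in whatever directory you ran the command from. A minimal sketch that sidesteps the question by giving an absolute path (the backup directory here is hypothetical):
# The redirect target is resolved by the shell relative to the current directory;
# an absolute path removes any doubt about where the file lands.
mysqldump -u username -ppassword database_name > /home/username/backups/FILE.sql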

Dump all tables in CSV format using 'mysqldump'

I need to dump all tables in MySQL in CSV format.
Is there a command using mysqldump to just output every row for every table in CSV format?
First, I can give you the answer for one table:
The trouble with all these INTO OUTFILE or --tab=tmpfile (and -T/path/to/directory) answers is that they require running mysqldump on the same server as the MySQL server, and having those access rights.
My solution was simply to use mysql (not mysqldump) with the -B parameter, inline the SELECT statement with -e, then massage the ASCII output with sed, and wind up with CSV including a header field row:
Example:
mysql -B -u username -ppassword database -h dbhost -e "SELECT * FROM accounts;" \
| sed "s/\"/\"\"/g;s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g"
"id","login","password","folder","email"
"8","mariana","xxxxxxxxxx","mariana",""
"3","squaredesign","xxxxxxxxxxxxxxxxx","squaredesign","mkobylecki#squaredesign.com"
"4","miedziak","xxxxxxxxxx","miedziak","miedziak#mail.com"
"5","Sarko","xxxxxxxxx","Sarko",""
"6","Logitrans
Poland","xxxxxxxxxxxxxx","LogitransPoland",""
"7","Amos","xxxxxxxxxxxxxxxxxxxx","Amos",""
"9","Annabelle","xxxxxxxxxxxxxxxx","Annabelle",""
"11","Brandfathers and
Sons","xxxxxxxxxxxxxxxxx","BrandfathersAndSons",""
"12","Imagine
Group","xxxxxxxxxxxxxxxx","ImagineGroup",""
"13","EduSquare.pl","xxxxxxxxxxxxxxxxx","EduSquare.pl",""
"101","tmp","xxxxxxxxxxxxxxxxxxxxx","_","WOBC-14.squaredesign.atlassian.net#yoMama.com"
Add a > outfile.csv at the end of that one-liner, to get your CSV file for that table.
Next, get a list of all your tables with
mysql -u username -ppassword dbname -sN -e "SHOW TABLES;"
From there, it's only one more step to make a loop, for example, in the Bash shell to iterate over those tables:
for tb in $(mysql -u username -ppassword dbname -sN -e "SHOW TABLES;"); do
echo .....;
done
Between the do and ; done, insert the long command I wrote in part 1 above, but substitute your table name with $tb instead, as shown in the sketch below.
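Putting the two parts together, a sketch of the complete loop (credentials and dbname are placeholders, as above):
for tb in $(mysql -u username -ppassword dbname -sN -e "SHOW TABLES;"); do
    mysql -B -u username -ppassword dbname -e "SELECT * FROM \`$tb\`;" \
    | sed "s/\"/\"\"/g;s/\t/\",\"/g;s/^/\"/;s/$/\"/" > "$tb.csv"
done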
This command will create two files in /path/to/directory: table_name.sql and table_name.txt.
The SQL file will contain the table creation schema and the txt file will contain the records of the table_name table, with fields delimited by a comma.
mysqldump -u username -p -t -T/path/to/directory dbname table_name --fields-terminated-by=','
If you are using MySQL or MariaDB, the easiest and most performant way to dump a single table to CSV is:
SELECT customer_id, firstname, surname INTO OUTFILE '/exportdata/customers.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM customers;
Now you can use other techniques to repeat this command for multiple tables. See more details here:
https://mariadb.com/kb/en/the-mariadb-library/select-into-outfile/
https://dev.mysql.com/doc/refman/5.7/en/select-into.html
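One such technique is a Bash loop that issues the statement once per table. A sketch, assuming the server may write to /exportdata (check the secure_file_priv setting) and placeholder credentials:
for tb in $(mysql -u username -ppassword -sN -e "SHOW TABLES FROM dbname;"); do
    mysql -u username -ppassword dbname -e "
        SELECT * INTO OUTFILE '/exportdata/${tb}.csv'
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\n'
        FROM \`${tb}\`;"
done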
mysqldump has options for CSV formatting:
--fields-terminated-by=name
Fields in the output file are terminated by the given string.
--fields-enclosed-by=name
Fields in the output file are enclosed by the given character.
--lines-terminated-by=name
Lines in the output file are terminated by the given string.
The name should contain one of the following: for --fields-terminated-by, typically \t or ","; for --fields-enclosed-by, typically "\""; and for --lines-terminated-by, one of \r, \n, or \r\n.
Naturally you should mysqldump each table individually.
I suggest you gather all the table names in a text file, then iterate through them running mysqldump. Here is a script that will dump and gzip 10 tables at a time:
MYSQL_USER=root
MYSQL_PASS=rootpassword
MYSQL_CONN="-u${MYSQL_USER} -p${MYSQL_PASS}"
SQLSTMT="SELECT CONCAT(table_schema,'.',table_name)"
SQLSTMT="${SQLSTMT} FROM information_schema.tables WHERE table_schema NOT IN "
SQLSTMT="${SQLSTMT} ('information_schema','performance_schema','mysql')"
mysql ${MYSQL_CONN} -ANe"${SQLSTMT}" > /tmp/DBTB.txt
COMMIT_COUNT=0
COMMIT_LIMIT=10
TARGET_FOLDER=/path/to/csv/files
for DBTB in `cat /tmp/DBTB.txt`
do
    DB=`echo "${DBTB}" | sed 's/\./ /g' | awk '{print $1}'`
    TB=`echo "${DBTB}" | sed 's/\./ /g' | awk '{print $2}'`
    DUMPFILE=${DB}-${TB}.csv.gz
    # -T makes mysqldump write ${TB}.sql and ${TB}.txt into ${TARGET_FOLDER}
    # rather than to stdout, so gzip the data file afterwards; the subshell is
    # backgrounded so up to ${COMMIT_LIMIT} dumps run in parallel before wait.
    (
        mysqldump ${MYSQL_CONN} -T ${TARGET_FOLDER} --fields-terminated-by="," --fields-enclosed-by="\"" --lines-terminated-by="\r\n" ${DB} ${TB}
        gzip -c ${TARGET_FOLDER}/${TB}.txt > ${DUMPFILE}
    ) &
    (( COMMIT_COUNT++ ))
    if [ ${COMMIT_COUNT} -eq ${COMMIT_LIMIT} ]
    then
        COMMIT_COUNT=0
        wait
    fi
done
if [ ${COMMIT_COUNT} -gt 0 ]
then
    wait
fi
This worked well for me:
mysqldump <DBNAME> --fields-terminated-by ',' \
--fields-enclosed-by '"' --fields-escaped-by '\' \
--no-create-info --tab /var/lib/mysql-files/
Or if you want to only dump a specific table:
mysqldump <DBNAME> <TABLENAME> --fields-terminated-by ',' \
--fields-enclosed-by '"' --fields-escaped-by '\' \
--no-create-info --tab /var/lib/mysql-files/
I'm dumping to /var/lib/mysql-files/ to avoid this error:
mysqldump: Got error: 1290: The MySQL server is running with the --secure-file-priv option so it cannot execute this statement when executing 'SELECT INTO OUTFILE'
It looks like others have had this problem as well, and there is now a simple Python script for converting mysqldump output into CSV files.
wget https://raw.githubusercontent.com/jamesmishra/mysqldump-to-csv/master/mysqldump_to_csv.py
mysqldump -u username -p --host=rdshostname database table | python mysqldump_to_csv.py > table.csv
You can also do it using the Data Export tool in dbForge Studio for MySQL.
It will allow you to select some or all tables and export them into CSV format.

Dump a mysql database to a plaintext (CSV) backup from the command line

I'd like to avoid mysqldump since that outputs in a form that is only convenient for mysql to read. CSV seems more universal (one file per table is fine). But if there are advantages to mysqldump, I'm all ears. Also, I'd like something I can run from the command line (linux). If that's a mysql script, pointers to how to make such a thing would be helpful.
If you can cope with table-at-a-time, and your data is not binary, use the -B option to the mysql command. With this option it'll generate TSV (tab-separated) files which you can import into Excel, etc., quite easily:
% echo 'SELECT * FROM table' | mysql -B -uxxx -pyyy database
Alternatively, if you've got direct access to the server's file system, use SELECT INTO OUTFILE which can generate real CSV files:
SELECT * INTO OUTFILE 'table.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM table;
In MySQL itself, you can specify CSV output like:
SELECT order_id,product_name,qty
FROM orders
INTO OUTFILE '/tmp/orders.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
From http://www.tech-recipes.com/rx/1475/save-mysql-query-results-into-a-text-or-csv-file/
You can dump a whole database in one go with mysqldump's --tab option. You supply a directory path and it creates one .sql file per table with the DROP TABLE IF EXISTS / CREATE TABLE statements, and a .txt file with the contents, tab separated. To create comma separated files instead, you could use the following:
mysqldump --password --fields-optionally-enclosed-by='"' --fields-terminated-by=',' --tab /tmp/path_to_dump/ database_name
That path needs to be writable by both the mysql user and the user running the command, so for simplicity I recommend chmod 777 /tmp/path_to_dump/ first.
The SELECT INTO OUTFILE option wouldn't work for me, but the roundabout way below, piping the tab-delimited output through sed, did:
mysql -uusername -ppassword -e "SELECT * from tablename" dbname | sed 's/\t/","/g;s/^/"/;s/$/"/' > /path/to/file/filename.csv
Here is the simplest command for it:
mysql -h<hostname> -u<username> -p<password> -e 'select * from databaseName.tableName' | sed 's/\t/,/g' > output.csv
If there is a comma in a column value then we can generate a .tsv instead of a .csv with the following command:
mysql -h<hostname> -u<username> -p<password> -e 'select * from databaseName.tableName' > output.tsv
If you really need a "Backup" then you also need database schema, like table definitions, view definitions, store procedures and so on. A backup of a database isn't just the data.
The value of the mysqldump format for backup is specifically that it is very EASY to use it to restore MySQL databases. A backup that isn't easily restored is far less useful. If you are looking for a method to reliably back up MySQL data so you can restore to a MySQL server, then I think you should stick with the mysqldump tool.
MySQL is free and runs on many different platforms. Setting up a new MySQL server that I can restore to is simple. I am not at all worried about not being able to set up MySQL so I can do a restore.
I would be far more worried about a custom backup/restore based on a fragile format like csv/tsv failing. Are you sure that all your quotes, commas, or tabs that are in your data would get escaped correctly and then parsed correctly by your restore tool?
If you are looking for a method to extract the data then see several in the other answers.
You can use the script below to get the output as CSV files, one file per table with headers.
for tn in `mysql --batch --skip-pager --skip-column-names --raw -uuser -ppassword -e "show tables from mydb"`
do
    mysql -uuser -ppassword mydb -B -e "select * from \`$tn\`;" | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > $tn.csv
done
Here user is your user name and password is the password (if you don't want to keep typing the password for each table), and mydb is the database name.
Explanation of the script: the first expression in sed replaces the tabs with "," so you have fields enclosed in double quotes and separated by commas. The second one inserts a double quote at the beginning and the third one inserts a double quote at the end. And the final one takes care of the \n.
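A quick way to sanity-check that sed pipeline is to feed it a single fake tab-separated row:
# printf emits one row: 1<TAB>foo<TAB>bar
printf '1\tfoo\tbar\n' | sed 's/\t/","/g;s/^/"/;s/$/"/'
# output: "1","foo","bar"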
If you want to dump the entire db (here as tab-separated .tsv files, one per table):
#!/bin/bash
host=hostname
uname=username
pass=password
port=portnr
db=db_name
s3_url=s3://buckera/db_dump/
DATE=`date +%Y%m%d`
rm -rf $DATE
echo 'show tables' | mysql -B -h${host} -u${uname} -p${pass} -P${port} ${db} > tables.txt
awk 'NR>1' tables.txt > tables_new.txt
while IFS= read -r line
do
    mkdir -p $DATE/$line
    echo "select * from $line" | mysql -B -h"${host}" -u"${uname}" -p"${pass}" -P"${port}" "${db}" > $DATE/$line/dump.tsv
done < tables_new.txt
touch $DATE/$DATE.fin
rm -rf tables_new.txt tables.txt
Check out mk-parallel-dump, which is part of the ever-useful Maatkit suite of tools. It can dump comma-separated files with the --csv option.
It can do your whole db without specifying individual tables, and you can specify groups of tables in a backupset table.
Note that it also dumps table definitions, views and triggers into separate files. In addition to providing a complete backup in a more universally accessible form, it is also immediately restorable with mk-parallel-restore.
Two line PowerShell answer:
# Store in variable
$Global:csv = (mysql -uroot -p -hlocalhost -Ddatabase_name -B -e "SELECT * FROM some_table") `
| ConvertFrom-Csv -Delimiter "`t"
# Out to csv
$Global:csv | Export-Csv "C:\temp\file.csv" -NoTypeInformation
Boom-bata-boom
-D = the name of your database
-e = query
-B = tab-delimited
There's a slightly simpler way to get all the tables into tab delimited fast:
#!/bin/bash
tablenames=$(mysql your_database -e "show tables;" -B |sed "1d")
IFS=$'\n'
tables=($tablenames)
for table in "${tables[@]}"; do
mysql your_database -e "select * from ${table}" -B > "${table}.tsv"
done
Here's a basic Python script that does the work! You can choose to export only the headers (column names) or both headers and data.
Just change the database credentials and run the script. It will output all the data to the output folder.
To run the script:
Run: pip install mysql-connector-python
Change the database credentials in the "INPUT" section
Run: python filename.py
import mysql.connector
from pathlib import Path
import csv

#========INPUT===========
databaseHost=""
databaseUsername=""
databasePassword=""
databaseName=""
outputDirectory="./WITH-DATA/"
exportTableData=True # Setting this field to False will store only the table headers (column names) in the CSV file
#========INPUT END===========

Path(outputDirectory).mkdir(parents=True, exist_ok=True)

# List all tables in the database
mydb = mysql.connector.connect(
    host=databaseHost,
    user=databaseUsername,
    password=databasePassword
)
mycursor = mydb.cursor()
mycursor.execute("USE " + databaseName)
mycursor.execute("SHOW TABLES")
tables = mycursor.fetchall()
tableNames = [table[0] for table in tables]

print("================================")
print("Total number of tables: " + str(len(tableNames)))
print(tableNames)
print("================================")

# Dump each table to <outputDirectory>/<tableName>.csv
for tableName in tableNames:
    print("================================")
    print("Processing: " + str(tableName))
    mydb = mysql.connector.connect(
        host=databaseHost,
        user=databaseUsername,
        password=databasePassword
    )
    mycursor = mydb.cursor()
    mycursor.execute("USE " + databaseName)
    if exportTableData:
        mycursor.execute("SELECT * FROM " + tableName)
    else:
        mycursor.execute("SELECT * FROM " + tableName + " LIMIT 1")
    print(mycursor.column_names)
    with open(outputDirectory + tableName + ".csv", 'w', newline='') as csvfile:
        csvwriter = csv.writer(csvfile)
        csvwriter.writerow(mycursor.column_names)
        if exportTableData:
            myresult = mycursor.fetchall()
            csvwriter.writerows(myresult)