Uploading data from CSV to MySQL Server using mysqlsh - mysql

I need to upload data from a CSV file to my MySQL server. I've used mysqlsh to do it from scheduled jobs:
"C:\Program Files (x86)\MySQL\MySQL Shell\bin\mysqlsh.exe" --sql -h x.x.x.x -u user --password -D database -e "LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE table FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n' IGNORE 1 ROWS (field1, field2)"
But when I execute the command I get this error:
The used command is not allowed with this MySQL version
I read that I need to set local_infile to TRUE; I've done that, but it still doesn't work.
What am I doing wrong?

You need to enable the local_infile option on the client side as well. The only way to do that in MySQL Shell is to pass it as a connection option:
mysqlsh.exe mysql://user@x.x.x.x/database?local-infile=1 -e "LOAD DATA..."
You can get more information about connection options by calling mysqlsh -i -e "\? connection".
If you want to load a big CSV file, you can use MySQL Shell's Parallel Data Import utility.
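Putting the pieces together, a minimal sketch of the full invocation, with host, user, database, table, and column names as placeholders from the question. It is written as a dry run that prints the command instead of connecting, since there is no live server here; drop the echo to execute for real:

```shell
#!/usr/bin/env bash
# Placeholders: user, x.x.x.x, database, mytable, field1, field2.
uri="mysql://user@x.x.x.x/database?local-infile=1"
sql="LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE mytable FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n' IGNORE 1 ROWS (field1, field2)"
# Dry run: print the command that would be executed.
cmd="mysqlsh --sql $uri -e \"$sql\""
echo "$cmd"
```

The key point is the ?local-infile=1 suffix on the connection URI, which enables the client-side half of the LOCAL INFILE handshake.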

Related

How to import a CSV with Windows Task Scheduler or schedule a task using the command prompt

I could run the following command:
mysql.exe -u root
use testdb
LOAD DATA INFILE 'C:/Users/user1/test.csv' INTO TABLE demo FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
under c:\xampp\mysql\bin
How could I run the above commands as a batch file from Task Scheduler, or schedule a task using the command prompt?
You just need to use mysql -h host -u user < batch-file
and store every command in the batch file.
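For example, a minimal sketch of such a setup, with hypothetical paths and file names: put the SQL statements (USE testdb; and the LOAD DATA statement from the question) in a file such as load_demo.sql, and wrap the mysql call in a small .bat file:

```bat
@echo off
rem load_demo.bat -- placeholder paths; adjust to your installation.
cd /d c:\xampp\mysql\bin
mysql.exe -u root testdb < C:\Users\user1\load_demo.sql
```

Then point a Task Scheduler action at the .bat file, or register it from the command prompt with schtasks, e.g. schtasks /create /tn LoadCsv /tr C:\Users\user1\load_demo.bat /sc daily.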

Run a cron job to import many csv files in docker mysql container

My database is running inside a Docker container and I want to load data from CSV files into the database every hour.
This script runs perfectly when started manually inside the container, but I can't start a cron job inside the mysql container.
#!/usr/bin/env bash
cd /import/logs/301
for f in *.csv
do
  mysql -e "LOAD DATA LOCAL INFILE '$f' IGNORE INTO TABLE \`301\` FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 2 LINES (@date, @time, R3, R2, R1) SET MeasuringTime = timestamp(str_to_date(@date,'%Y/%m/%d'), @time)" -u USER --password=PASSWORD DATABASE
done
cd /import/logs/302
for f in *.csv
do
  mysql -e "LOAD DATA LOCAL INFILE '$f' IGNORE INTO TABLE \`302\` FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 2 LINES (@date, @time, R3, R2, R1) SET MeasuringTime = timestamp(str_to_date(@date,'%Y/%m/%d'), @time)" -u USER --password=PASSWORD DATABASE
done
I'd prefer to run the cron job on the host, but how?
Or should I use an extra import container? If so, how would you build the docker-compose.yml?
Running more than one process in a single Docker container is usually bad practice (you can read about it in the Docker documentation if you're interested).
Option 1 - dedicated container for the cron job
Your best option is to create another container that does this job (of course, you'll have to link the containers and make sure the mysql client is installed in the new container, etc.).
For the new container, check Jon Ribeiro's article on how to run cron jobs inside Docker: https://jonathas.com/scheduling-tasks-with-cron-on-docker/
Option 2 - running the cron job on the host
Alternatively, if you want to run cron on your host, you'll have to expose the mysql container's ports to the host and connect to it.
Note that mysql on a Linux host will use a Unix socket if it connects to localhost, so use mysql --host=localhost --protocol=TCP, as explained in the MySQL documentation:
https://dev.mysql.com/doc/refman/5.5/en/connecting.html
From mysql documentation:
On Unix, the client connects using a Unix socket file. The --socket option or the MYSQL_UNIX_PORT environment variable may be used to specify the socket name.
To force a TCP/IP connection to be used instead, specify a --protocol option:
shell> mysql --host=localhost --protocol=TCP
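For option 2, a minimal host-side sketch, with the script path and log file as placeholders, assuming the container publishes port 3306 to the host:

```
# /etc/cron.d/csv-import -- run the import script at the top of every hour
0 * * * * root /home/user/import-csv.sh >> /var/log/csv-import.log 2>&1
```

where import-csv.sh is the script from the question, with each mysql call changed to mysql --host=127.0.0.1 --protocol=TCP --local-infile=1 ... so it connects over TCP to the published port. Alternatively, keep cron on the host but run the existing script inside the container with docker exec <container-name> /import/import.sh.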

Loading a CSV file with a bash script gives an error, although none is thrown when running the mysql command in the terminal

I'm trying to load some CSV files by calling mysql from the terminal without entering the mysql interpreter.
I created the following function, which I call when I'm ready to load all the CSV files mentioned in "$@":
function sqlConn {
  sqlLoad="$sqlConnBase $@ $dbName"
  `"$sqlLoad"`
  # I tried simply with $sqlLoad too, but the same problem occurs,
  # although everything needed for the query is present in either
  # $sqlLoad or "$sqlLoad"
}
sqlConnBase and dbName are global variables defined at the beginning of my bash script like this:
sqlConnBase="mysql -h localhost -u group8 --password=toto123"
dbName="cs322"
I call sqlConn like this:
sqlConn " --local-infile=1 < sqlLoadFile.sql"
the content of sqlLoadFile.sql is the following:
LOAD DATA LOCAL INFILE 'CSV/notes_rem.csv'
INTO TABLE Notes
CHARACTER SET UTF8
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n' STARTING BY '';
The problem I get is the following:
./loadAll.bash: line 31: mysql -h localhost -u group8
--password=toto123 --local-infile=1 < sqlLoadFile.sql cs322: command not found
The strange thing is that when I simply execute
mysql -h localhost -u group8 --password=toto123
--local-infile=1 < sqlLoadFile.sql cs322
on my terminal it does populate my cs322 database, i.e. all the rows of my csv are present in my cs322 database.
What could be the source of the error in my script?
The whole string mysql -h localhost ... is treated as a single command name, not as mysql followed by arguments.
You need to use eval instead of the backticks:
eval "$sqlLoad"
When that is said you should be really careful with escapes, word splitting and globbing, and the above approach should be avoided.
A recommended approach is to populate an array with the arguments:
declare -a args
args+=("-h" "localhost")
args+=("-u" "group8")
# ...
mysql "${args[@]}"
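A complete, runnable sketch of this array approach, with the credentials and database name taken from the question. It is written as a dry run that prints the command instead of connecting; the real invocation is shown in the comment:

```shell
#!/usr/bin/env bash
# Build the argument list element by element; quoting each element
# protects against word splitting and globbing.
declare -a args
args+=("-h" "localhost")
args+=("-u" "group8")
args+=("--local-infile=1")
args+=("cs322")
# Dry run: print what would be executed. For real use:
#   mysql "${args[@]}" < sqlLoadFile.sql
cmd="mysql ${args[*]}"
echo "$cmd"
```

Passing the password on the command line is omitted here on purpose; a ~/.my.cnf file or a login path is safer in a script.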

Pass a parameter from a bash script to a mysql script

I'm trying to pass parameter from bash script to mysql script. The bash script is
#!/bin/bash
for file in `ls *.symbol`
do
path=/home/qz/$file
script='/home/qz/sqls/load_eval.sql'
mysql -u qz -h compute-0-10 -pabc -e "set @pred = '$path'; source $script;"
done
The load_eval.sql is
use biogrid;
load data local infile @pred into table lasp
fields terminated by ','
lines terminated by '\n'
(score, symbols);
When running the bash script, I got the error message:
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '@pred into table lasp ..
It seems the value of the parameter @pred is not passed into the mysql script.
MySQL doesn't support session variables in a LOAD DATA INFILE statement like that. This has been recognized as a feature request for quite some time (http://bugs.mysql.com/bug.php?id=39115), but the feature has never been implemented.
I would recommend using mysqlimport instead of the complicated steps you're doing with mysql. The file's name must match the table's name, but you can work around that with a symbolic link:
#!/bin/bash
for file in *.symbol
do
  path="/home/qz/$file"
  ln -s -f "$path" /tmp/lasp.txt
  mysqlimport -u qz -h compute-0-10 -pabc \
    --local --columns "score,symbols" /tmp/lasp.txt
done
rm -f /tmp/lasp.txt
PS: There's no need to use `ls`. As you can see above, filename expansion works fine.
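If you'd rather keep plain mysql, another common workaround is to interpolate the file path in bash before the statement reaches the server, instead of using a session variable. A sketch with a hypothetical placeholder path, shown as a dry run that prints the statement:

```shell
#!/usr/bin/env bash
path="/home/qz/example.symbol"   # hypothetical placeholder file
# Interpolate the path into the SQL text; note the single quotes
# around $path become part of the statement.
sql="LOAD DATA LOCAL INFILE '$path' INTO TABLE lasp
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(score, symbols);"
# Dry run: print the statement. For real use:
#   mysql -u qz -h compute-0-10 -pabc --local-infile=1 biogrid -e "$sql"
echo "$sql"
```

This works because the server only ever sees a literal path; the quoting breaks if file names contain single quotes, so mysqlimport remains the more robust option.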

Insert CSV to MySQL using Ubuntu Terminal via shell script

Is it possible to insert a CSV file into MySQL using a shell script in Ubuntu?
Here's what I tried :
mysql -uroot -proot mysfdb < /home/sf/data.csv
But I am given an error
ERROR 1064 (42000) at line 1: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near
Here's a sample content from the CSV file:
showinventory_SST312V8N4615041313_1366009574txt_,800-200002.A0,00007985
Any ideas?
Maksym Polshcha's answer is correct but missing a few things. Since it's a local file, I have to declare it as local in the mysql command. The final command should be something like this:
mysql -uroot -proot --local-infile=1 3parsfdb -e "LOAD DATA LOCAL INFILE '/logfiles/Bat_res.csv' INTO TABLE Bat_res FIELDS TERMINATED BY ','"
I also made sure that the /logfiles directory and Bat_res.csv are world-readable.
Thank you for the great answers.
Try this:
mysql -uroot -proot mysfdb -e "LOAD DATA INFILE '/home/sf/data.csv' INTO TABLE mytable"
where mytable is your table for the data. If you have non-standard field/line separators in your CSV file, use FIELDS TERMINATED BY and LINES TERMINATED BY.
See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
I used this and it worked.
Log in to mysql using `mysql -uroot -ppassword --local-infile`
Then at the mysql prompt:
LOAD DATA LOCAL INFILE '.csv path' INTO TABLE table_name FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
Open an Ubuntu terminal and run the following command:
# mysql -u admin -p --local-infile=1 DATABASE_NAME -e "LOAD DATA LOCAL INFILE 'students.csv' INTO TABLE TABLE_NAME FIELDS TERMINATED BY ',' ENCLOSED BY '\"'"
Here,
DATABASE_NAME = the name of your database
students.csv = the path of the CSV file you want to upload into the database
TABLE_NAME = the table into which you want to load the data
admin = the database user name
After running this command, the system will ask for the admin user's password.
Type the password and enjoy.