MySQL: looping over a list of files in a directory

I want to insert the files in my folder into MySQL using the query below.
Directory structure:
1/
--first.csv
--second_123.csv
--second_124.csv
--second_125.csv
2/
--first.csv
--second_223.csv
--second_224.csv
--second_225.csv
LOAD DATA INFILE '/1/first.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
How can I loop over the files so that all of them get inserted? I want to write it in a .sql file.

When you want to determine the file names with code, you can't. A database is not meant to be able to scan filesystems and so on; if it could, I'd have some serious security concerns. There is the system command in the mysql client that lets you run shell commands, but that's just for convenience: you can't parse its result in SQL.
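Since the server can't enumerate files for you, one workaround is to generate the .sql file outside of MySQL and then feed it to the mysql client. A minimal sketch in Python (the directories 1/ and 2/ and the placeholder table name table_name come from the question; paths and CSV options are assumptions you'd adjust):

# generate_loads.py - write load_all.sql with one LOAD DATA statement per CSV file
from pathlib import Path

TABLE = "table_name"   # placeholder table name from the question
DIRS = ["1", "2"]      # the two directories from the question

statements = []
for d in DIRS:
    for csv_file in sorted(Path(d).glob("*.csv")):
        statements.append(
            f"LOAD DATA INFILE '{csv_file.resolve().as_posix()}'\n"
            f"INTO TABLE {TABLE}\n"
            "FIELDS TERMINATED BY ','\n"
            "LINES TERMINATED BY '\\n'\n"
            "IGNORE 1 ROWS;\n"
        )

Path("load_all.sql").write_text("\n".join(statements))
print(f"Wrote {len(statements)} LOAD DATA statements to load_all.sql")

You would then run the generated file with something like mysql -u user -p dbname < load_all.sql. Keep in mind that server-side LOAD DATA INFILE is subject to secure_file_priv and needs files readable by the server; LOAD DATA LOCAL INFILE is the client-side alternative.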

Related

LOAD DATA INFILE in phpMyAdmin 4.2.10.1 doesn't work #7890

I'm really disappointed that only one person has any idea how to solve my problem!
I have already tried thoroughly (2 hours!) to find the answer in all the other discussions, but nothing compares to my challenge!
I work with a hosted phpMyAdmin installation on Windows 10, and I'm trying to load a file into a table.
When I start phpMyAdmin's import function, the SQL script is found and started, but I always get the
error #7890 – can’t find file ‘D:\Loader\doshas.txt’.
'load.sql':
LOAD DATA LOCAL INFILE “D:\Loader\doshas.txt” INTO TABLE doshas terminated by ‘,’;
The file 'load.sql' is in the same folder as doshas.txt!
What’s wrong?
Regards and thanks
Probably you haven't specified the special characters (the field and line delimiters).
I believe a better way is to use LOAD DATA INFILE.
Let's say you want to load the file doshas.txt into my_table with columns col_a, col_b, col_c.
Put the txt file in the data directory of the MySQL installation behind phpMyAdmin, e.g. for XAMPP that is xampp\mysql\data.
Then in phpMyAdmin -> SQL, write:
LOAD DATA INFILE 'doshas.txt' INTO TABLE `my_table`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(col_a, col_b, col_c);
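If you can't copy the file into the MySQL data directory, another option is LOAD DATA LOCAL INFILE, which reads the file from the client side instead of the server side. A minimal sketch using Python and mysql-connector-python (connection details are placeholders, and the server must have local_infile enabled for this to work):

# load_local.py - load a client-side file with LOAD DATA LOCAL INFILE
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="root", password="secret",
    database="mydb", allow_local_infile=True,   # required for LOCAL loads
)
cur = conn.cursor()

# forward slashes in the Windows path avoid backslash-escaping issues
cur.execute(r"""
    LOAD DATA LOCAL INFILE 'D:/Loader/doshas.txt'
    INTO TABLE doshas
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
""")

conn.commit()
cur.close()
conn.close()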

Scheduling a MySQL procedure/macro to load CSV data

As I'm a beginner to MySQL, I'm asking this question. Please help me.
I have a .csv file and I'm loading its data into a MySQL table using the following command:
"load data infile 'D:/xampp/htdocs/test/test.csv' into table offers fields terminated by ',' enclosed by '"' lines terminated by '\n' ignore 1 rows; "
It is inserting the data into the table successfully.
Now my question is as follows:
The test.csv file (it has a huge volume of data) is going to be updated every 24 hours, so I want a stored procedure/macro (whatever it may be) that loads the updated data into the offers table and that will be called every 24 hours, so that the table data stays in sync with the .csv file.
Steps to remember:
I want to truncate the offers table before inserting into it,
and load the data using the above command.
Create a success log status in another log table (optional).
I heard that LOAD DATA is not going to work in a stored procedure (I don't know exactly). Please give me any answers/suggestions.
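You heard correctly: LOAD DATA is not permitted inside MySQL stored programs (procedures, functions, triggers, events), so the usual approach is a small external script run by cron (Linux) or Task Scheduler (Windows). A rough sketch in Python (the offers table and CSV path come from the question; the connection details and the import_log table are assumptions):

# refresh_offers.py - scheduled reload of the offers table, run every 24 hours
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="root", password="secret", database="mydb"
)
cur = conn.cursor()

# 1. empty the table before reloading
cur.execute("TRUNCATE TABLE offers")

# 2. reload from the CSV with the same statement as in the question
cur.execute(r"""
    LOAD DATA INFILE 'D:/xampp/htdocs/test/test.csv'
    INTO TABLE offers
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 ROWS
""")

# 3. optional: record the run in a hypothetical import_log table
cur.execute("INSERT INTO import_log (run_time, status) VALUES (NOW(), 'ok')")

conn.commit()
cur.close()
conn.close()

A daily crontab entry such as 0 0 * * * python3 /path/to/refresh_offers.py (or an equivalent Windows scheduled task) would run it every 24 hours.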

Regular transfer of .csv to database

I am working on a program that will regularly read the data from a .csv file and import it into my database. The CSV is a copy from a database on another server, so the table structure will be different when I upload it to the new one.
What I am unsure of is the best method to do this on a nightly basis, and hopefully automate the process. Any suggestions?
The database is MySQL on an Apache server.
Consider using a LOAD DATA INFILE query in a timed script with PHP, Python, or another language to upload into a temp table:
LOAD DATA INFILE 'filename.csv'
INTO TABLE tempname
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
Then run an append query migrating the differently structured temp table data into the final table.
INSERT INTO finaltable (Col1, Col2, Col3, ...)
SELECT Col1, Col2, Col3, ... FROM tempname;
The best solution in my opinion is to create a PHP script that manipulates the CSV data and matches the format of the file to the tables in your database. After that, you can set up a cron job (Linux) or scheduled task (Windows) to run the script automatically at your desired time and frequency. Hope this helps you.
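Putting the two steps together, a nightly job might look roughly like this Python sketch (the tempname/finaltable names and column list are the placeholders from the answer above; the connection details and CSV path are assumptions):

# nightly_import.py - load the CSV into a staging table, then append into the final table
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="root", password="secret", database="mydb"
)
cur = conn.cursor()

# start from an empty staging table each night
cur.execute("TRUNCATE TABLE tempname")

# bulk-load the raw CSV into the staging table
cur.execute(r"""
    LOAD DATA INFILE '/path/to/filename.csv'
    INTO TABLE tempname
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n'
""")

# append into the differently structured final table
cur.execute("""
    INSERT INTO finaltable (Col1, Col2, Col3)
    SELECT Col1, Col2, Col3 FROM tempname
""")

conn.commit()
cur.close()
conn.close()

Scheduled via cron or a Windows task, this reproduces the load-then-append flow described above.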

Replace contents of MySQL table with contents of csv file on a remote server

I'm a newbie here trying to import some data into my WordPress database (MySQL), and I wonder if any of you SQL experts out there can help?
Database type: MySQL
Table name: wp_loans
I would like to completely replace the data in table wp_loans with the contents of file xyz.csv located on a remote server, for example https://www.mystagingserver.com/xyz.csv
All existing data in the table should be replaced with the contents of the CSV file.
The 1st row of the CSV file contains the table headings, so it can be ignored.
I'd also like to automate the script to run daily at say 01:00 in the morning if possible.
UPDATE
Here is the SQL I'm using to try and replace the table contents:
LOAD DATA INFILE 'https://www.mystagingserver.com/xyz.csv'
REPLACE
INTO TABLE wp_loans
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
IGNORE 1 LINES
I would recommend a cron job to automate the process, and probably use BCP (bulk copy) to insert the data into a table... But seeing as you are using MySQL, instead of BCP, try LOAD DATA INFILE - https://mariadb.com/kb/en/load-data-infile/
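Note that LOAD DATA INFILE reads from a filesystem path, not from an HTTP URL, so the cron job would first download xyz.csv and then load it. A rough Python sketch (wp_loans, the URL, and the 01:00 schedule come from the question; connection details and the temporary download path are assumptions):

# replace_wp_loans.py - fetch the remote CSV and replace the table contents
import urllib.request
import mysql.connector

CSV_URL = "https://www.mystagingserver.com/xyz.csv"
LOCAL_PATH = "/tmp/xyz.csv"        # assumed temporary download location

# 1. download the CSV from the staging server
urllib.request.urlretrieve(CSV_URL, LOCAL_PATH)

conn = mysql.connector.connect(
    host="localhost", user="root", password="secret",
    database="wordpress", allow_local_infile=True,
)
cur = conn.cursor()

# 2. wipe the existing rows, then load the fresh file, skipping the header row
cur.execute("TRUNCATE TABLE wp_loans")
cur.execute(r"""
    LOAD DATA LOCAL INFILE '/tmp/xyz.csv'
    INTO TABLE wp_loans
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    IGNORE 1 LINES
""")

conn.commit()
cur.close()
conn.close()

A crontab entry like 0 1 * * * python3 /path/to/replace_wp_loans.py runs it daily at 01:00.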

Easy way to import rows into MySQL from csv file

I have a CSV file that corresponds to what our user table looks like. Essentially we need to do a bulk import; what would be a good way to do this in MySQL without writing a custom script that just issues SQL inserts?
You could use LOAD DATA INFILE, i.e.:
load data local infile 'uniq.csv' into table tblUniq
fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments)
You can use phpMyAdmin, which is one of the best tools for developers to do this kind of thing easily.
In the Import tab you choose the file, select CSV as the format, and just load it.
Very easy, right?
You can download it from here:
http://www.phpmyadmin.net/home_page/downloads.php (also install Apache along with it).
You can also use the WAMP server, which includes phpMyAdmin (best for Windows).