Regular transfer of a .csv file to a MySQL database

I am working on a program that will regularly read the data from a .csv file and import it into my database. The CSV is a copy from a database on another server, so the table structure will be different when I upload to the new one.
What I am unsure of is the best method to do this on a nightly basis, and hopefully automate the process. Any suggestions?
The database is MySQL on an Apache server.

Consider using a LOAD DATA INFILE query in a timed script (PHP, Python, or another language) to load the file into a temp table:
LOAD DATA INFILE 'filename.csv'
INTO TABLE tempname
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
Then run an append query to migrate the data from the differently structured temp table into the final table:
INSERT INTO finaltable (Col1, Col2, Col3, ...)
SELECT Col1, Col2, Col3, ... FROM tempname;
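Putting the two steps together, here is a minimal Python sketch that builds the statement pair; the table and column names are placeholders, and you would execute the results through a MySQL driver (e.g. mysql-connector-python or PyMySQL) rather than printing them:

```python
# Build the two SQL statements for the nightly import: a LOAD DATA INFILE
# into a staging table, then an INSERT ... SELECT into the final table.
# Table and column names here are hypothetical placeholders.

def build_import_sql(csv_path, temp_table, final_table, columns):
    load_stmt = (
        f"LOAD DATA INFILE '{csv_path}' "
        f"INTO TABLE {temp_table} "
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\r\\n'"
    )
    col_list = ", ".join(columns)
    append_stmt = (
        f"INSERT INTO {final_table} ({col_list}) "
        f"SELECT {col_list} FROM {temp_table}"
    )
    return load_stmt, append_stmt

load_stmt, append_stmt = build_import_sql(
    "filename.csv", "tempname", "finaltable", ["Col1", "Col2", "Col3"]
)
print(load_stmt)
print(append_stmt)
```

A scheduler (cron on Linux, Task Scheduler on Windows) would then run this script nightly.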

The best solution in my opinion is to create a PHP script that manipulates the CSV data to match the format of the tables in your database. After that, you can set up a cron job (Linux) or scheduled task (Windows) to run the script automatically at your desired time and interval. Hope this helps you.
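For the cron half of that, a crontab entry along these lines would run an import script nightly at, say, 2:00 AM (the script path and log location are hypothetical examples):

```
# m h dom mon dow  command
0 2 * * * /usr/bin/php /path/to/csv_import.php >> /var/log/csv_import.log 2>&1
```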

Related

Load Data InFile SQL statement is not allowed to be executed when creating MySQL Events

I know that the LOAD DATA INFILE SQL statement is not allowed to be executed when creating MySQL events.
I am trying to find an alternative solution to this, but so far I can't.
https://dev.mysql.com/doc/mysql-reslimits-excerpt/5.6/en/stored-program-restrictions.html
Does anyone have any idea of another way to load an external file into tables on a scheduled basis?
Any reason you can't just LOAD DATA into a temporary table of the expected input format? Add a few extra columns, such as pre-processing flags, if that might help. Then the load could be a single step without any other interactions with triggers and such. Then you could have a stored procedure run query / fetch / process / insert, etc. based on the already-loaded records in that table. If you skip some, so be it; the others process as normal. Just something I have done in the past that might be an option for you.
Although I did not quite understand your question: if you are on a Linux-based system, you could create a simple bash script to load any external data into MySQL. Either dump it locally and load it, or pull it from the external source and load it the same way you would load any file.
For example, as below, I am trying to import data from a CSV file into my customers table, and I want to schedule the import to run every 5 minutes.
Hence I include the LOAD DATA INFILE inside a MySQL event. Code as below.
However, it returns an error: ERROR 1314: LOAD DATA is not allowed in stored procedures.
Is there an alternative way to import CSV file data into a table and execute it on a schedule of every 5 minutes?
USE sales;
CREATE EVENT import_file_event
ON SCHEDULE EVERY 5 MINUTE
DO
LOAD DATA INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/Importfile.csv'
INTO TABLE customers
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(@first_name, @last_name, @gender, @email_address, @number_of_complaints)
SET first_name = NULLIF(@first_name,''),
last_name = NULLIF(@last_name,''),
gender = NULLIF(@gender,''),
email_address = NULLIF(@email_address,''),
number_of_complaints = NULLIF(@number_of_complaints,'');
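Since LOAD DATA cannot run inside an event or stored procedure, one workaround is to move the load out of MySQL entirely and run it from an externally scheduled script (cron every 5 minutes, for example). A minimal Python sketch, using the same columns as the example above; executing the INSERTs through a driver such as mysql-connector-python is assumed and left commented out:

```python
import csv
import io

# Parse the CSV and convert empty strings to None, mirroring the
# NULLIF(@col, '') handling in the LOAD DATA version above.
def read_customer_rows(csv_text):
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip the header row (equivalent of IGNORE 1 ROWS)
    rows = []
    for rec in reader:
        rows.append(tuple(field if field != "" else None for field in rec))
    return rows

sample = (
    "first_name,last_name,gender,email_address,number_of_complaints\n"
    "Alice,Smith,F,alice@example.com,0\n"
    "Bob,,M,,2\n"
)
rows = read_customer_rows(sample)
# With a real connection you would then run, e.g.:
# cursor.executemany(
#     "INSERT INTO customers (first_name, last_name, gender, "
#     "email_address, number_of_complaints) VALUES (%s, %s, %s, %s, %s)",
#     rows,
# )
print(rows)
```

This avoids the stored-program restriction entirely, at the cost of doing the parsing in the script instead of in the server.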

Scheduling Mysql procedure/macro to load CSV data

As I'm a beginner to MySQL, I'm asking this question. Please help me.
I have a .csv file and I'm loading its data into a MySQL table using the following command:
load data infile 'D:/xampp/htdocs/test/test.csv' into table offers fields terminated by ',' enclosed by '"' lines terminated by '\n' ignore 1 rows;
It inserts the data into the table successfully.
Now my question is as follows:
The test.csv file (it has a huge volume of data) is updated every 24 hours, so I want a stored procedure/macro (whatever it may be) that loads the updated data into the offers table and can be called every 24 hours, so that the table data stays in sync with the .csv file.
Steps to remember:
I want to truncate the offers table data before inserting into the table,
then load the data using the above command,
and create a success-log status entry in another log table (optional).
I heard that LOAD DATA is not going to work in a stored procedure (I don't know exactly). Please give me any answers/suggestions.
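What you heard is correct: LOAD DATA is disallowed inside stored programs, so the 24-hour refresh is better driven from an externally scheduled script. A minimal Python sketch of the three steps (truncate, load, log); the log table name import_log is a hypothetical example, and executing the statements through a MySQL driver is assumed:

```python
def build_refresh_statements(csv_path, table, log_table):
    # Statements to run in order, once every 24 hours, from cron or
    # Task Scheduler. LOAD DATA must come from outside a stored program.
    return [
        f"TRUNCATE TABLE {table}",
        (
            f"LOAD DATA INFILE '{csv_path}' INTO TABLE {table} "
            "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
            "LINES TERMINATED BY '\\n' IGNORE 1 ROWS"
        ),
        f"INSERT INTO {log_table} (loaded_at, status) VALUES (NOW(), 'success')",
    ]

stmts = build_refresh_statements(
    "D:/xampp/htdocs/test/test.csv", "offers", "import_log"
)
for s in stmts:
    print(s)
```

Wrapping the three statements in a try/except in the real script would let you log a failure status as well as a success.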

How to generate a MySQL database tables script using an MVC application with Entity Framework

As part of automation we want to create a MySQL DB script from an existing DB. We can't use EF Code First migrations because there are more than 1000 tables, so a timeout error occurs. So the alternative is to generate the script from the existing DB using an MVC application.
If you can export your table data to .csv (comma-separated values) files, you can do an easy migration with LOAD DATA INFILE. My experience is not from DB to DB, just from Excel to a MySQL DB. I exported Excel sheets as somedata.csv files, then created the table I needed and used this script to import into the MySQL database. I think you can use it.
LOAD DATA INFILE 'somedata.csv' INTO TABLE some_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
So, first you need to create your tables' structure again.
Second, you need to export your data to CSV files.
Third, you can import the data with LOAD DATA INFILE.
It is not the greatest way, but it can be fast enough: a table with more than 1 million records imports in about a minute. You can see the
LOAD DATA INFILE documentation here

Replace contents of MySQL table with contents of csv file on a remote server

I'm a newbie here trying to import some data into my WordPress database (MySQL), and I wonder if any of you SQL experts out there can help?
Database type: MySQL
Table name: wp_loans
I would like to completely replace the data in table wp_loans with the contents of file xyz.csv located on a remote server, for example https://www.mystagingserver.com/xyz.csv
All existing data in the table should be replaced with the contents of the CSV file.
The 1st row of the CSV file is the table headings so can be ignored.
I'd also like to automate the script to run daily at say 01:00 in the morning if possible.
UPDATE
Here is the SQL I'm using to try to replace the table contents:
LOAD DATA INFILE 'https://www.mystagingserver.com/xyz.csv'
REPLACE
INTO TABLE wp_loans
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
IGNORE 1 LINES
I would recommend a cron job to automate the process. With SQL Server you would use BCP (bulk copy) to insert the data into a table, but since you are using MySQL, use LOAD DATA INFILE instead - https://mariadb.com/kb/en/load-data-infile/. Note that LOAD DATA INFILE reads a file on the database server's filesystem (or the client's, with the LOCAL keyword), not a URL, so you would first need to download xyz.csv to the server, e.g. with curl or wget.
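Sketching that answer in Python: download the remote CSV first (LOAD DATA INFILE cannot read a URL), then truncate and reload the table. The URL and table name come from the question; the landing path is a hypothetical example, and the actual fetch and execution are left commented out so the sketch stands alone:

```python
# Download the CSV to a local file, then truncate-and-load the table.
# LOAD DATA INFILE only reads files visible to the MySQL server (or the
# client, with LOCAL), so the remote file must be fetched first.
from urllib.request import urlretrieve  # used by the commented-out fetch below

CSV_URL = "https://www.mystagingserver.com/xyz.csv"
LOCAL_PATH = "/tmp/xyz.csv"  # hypothetical landing path on the DB server

def build_replace_statements(local_path, table):
    return [
        f"TRUNCATE TABLE {table}",
        (
            f"LOAD DATA INFILE '{local_path}' INTO TABLE {table} "
            "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
            "IGNORE 1 LINES"
        ),
    ]

# urlretrieve(CSV_URL, LOCAL_PATH)  # fetch step, run from a daily 01:00 cron job
stmts = build_replace_statements(LOCAL_PATH, "wp_loans")
print(stmts)
```

TRUNCATE followed by a plain load is the simplest way to get "completely replace the data"; LOAD DATA ... REPLACE only overwrites rows with matching unique keys and leaves stale rows behind.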

MySQL LOAD DATA LOCAL INFILE vs. SQL file

Every day we load around 6GB of CSV files into MySQL using:
LOAD DATA LOCAL INFILE 'file$i.csv' INTO TABLE tableName FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n';
We have 6 files that go through this process, so it takes some time. Since we generate these files ourselves, we're in control of the format they are output in.
Originally we chose CSV because this was a smaller process and we needed the data to be moved around and easily read by a non-developer. Now, however, readability is not so much of an issue, since the loading time has become so dramatic; we're now talking hours.
Is it quicker to output each row as an INSERT query into a single file and execute that or is CSV still quicker?
We're using the InnoDB storage engine.
If you use MyISAM tables, try ALTER TABLE table_name DISABLE KEYS; before loading the data and ALTER TABLE table_name ENABLE KEYS; after the data import is done. This will greatly reduce the time taken for huge data sets.
LOAD DATA is faster than a separate INSERT statement for each row.
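For InnoDB, where DISABLE KEYS has no effect, a common speedup is to relax session-level checks around the load and commit once at the end. A sketch of the statement bracketing (run through whatever client performs the LOAD DATA; whether it helps depends on your schema and data):

```python
# Session-level settings commonly used to speed up bulk loads into InnoDB
# (DISABLE KEYS only affects MyISAM). Re-enable the checks after the load.
bulk_load_prologue = [
    "SET unique_checks = 0",
    "SET foreign_key_checks = 0",
    "SET autocommit = 0",  # commit once at the end instead of per statement
]
bulk_load_epilogue = [
    "COMMIT",
    "SET unique_checks = 1",
    "SET foreign_key_checks = 1",
    "SET autocommit = 1",
]
print(bulk_load_prologue + bulk_load_epilogue)
```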