As I'm a beginner to MySQL, I'm asking this question. Please help me.
I have a .csv file and I'm loading its data into a MySQL table using the following command:
"load data infile 'D:/xampp/htdocs/test/test.csv' into table offers fields terminated by ',' enclosed by '"' lines terminated by '\n' ignore 1 rows; "
It inserts the data into the table successfully.
Now my question is as follows:
The test.csv file (which has a huge volume of data) is updated every 24 hours, so I want a stored procedure/macro (whatever it may be) that loads the updated data into the offers table and can be called every 24 hours, so that the table data stays in sync with the .csv file.
Steps to remember
Truncate the offers table data before inserting into the table
Load the data using the above command
Create a success status record in a separate log table (optional)
I have heard that LOAD DATA does not work inside a stored procedure (I don't know exactly). Please give me any answers/suggestions.
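One common workaround, since LOAD DATA is not permitted inside stored programs (procedures, functions, triggers, or events), is to keep the statements in a plain .sql script and have the operating system's scheduler (cron on Linux, Task Scheduler on Windows) run it with the mysql command-line client every 24 hours. A minimal sketch is below; the script name and the offers_load_log table are assumptions, not something that already exists in your database.

-- daily_offers_load.sql: run every 24 hours via cron / Windows Task Scheduler
-- using the mysql client. offers_load_log is a hypothetical status table.
TRUNCATE TABLE offers;

LOAD DATA INFILE 'D:/xampp/htdocs/test/test.csv'
INTO TABLE offers
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;

INSERT INTO offers_load_log (run_at, rows_loaded)
SELECT NOW(), COUNT(*) FROM offers;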
Related
I know that the LOAD DATA INFILE SQL statement is not allowed to be executed when creating MySQL events.
I am trying to find an alternative solution to this, but so far I can't.
https://dev.mysql.com/doc/mysql-reslimits-excerpt/5.6/en/stored-program-restrictions.html
Does anyone have any idea if there is another way we can set this up to load an external file into the tables on a scheduled basis?
Any reason you can't just do the LOAD DATA into a temporary/staging table of the expected input format? Add a few extra columns, such as pre-processing flags, if that might help. Then that could be a single load without any other interactions with triggers and such. Then you could have a stored procedure query / fetch / process / insert, etc. based on the records already loaded into that table (see the sketch below). If you skip some, so be it; the others process as normal. Just something I have done in the past that might be an option for you.
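A rough sketch of that staging-table idea, assuming made-up names (incoming_staging, process_incoming) and a two-column CSV; only the LOAD DATA has to run outside a stored program, while the processing procedure can be called from an event or a scheduled job:

-- Hypothetical staging table shaped like the CSV, plus a pre-processing flag column
CREATE TABLE incoming_staging (
    col_a VARCHAR(255),
    col_b VARCHAR(255),
    needs_review TINYINT NOT NULL DEFAULT 0
);

-- One plain load into the staging table, with no triggers or stored-program restrictions involved
LOAD DATA INFILE '/path/to/incoming.csv'
INTO TABLE incoming_staging
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col_a, col_b);

-- A stored procedure then queries / processes / inserts based on the already-loaded rows
CALL process_incoming();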
Although I did not quite understand your question: if you are on a Linux-based system, you could create a simple bash script to load any external data into MySQL. Either dump it locally and load it, or pull it from the external source and load it the same way you would load any file.
For example, as below, I am trying to import data from a CSV file into my customers table, and I want to schedule it to run every 5 minutes.
Hence I put the LOAD DATA INFILE inside a MySQL event. Code as below:
However, it returns an error: ERROR Code 1314: LOAD DATA is not allowed in stored procedures.
Is there any alternative way I can import CSV file data into a table and execute it on a schedule of every 5 minutes?
USE sales;
CREATE EVENT import_file_event
ON SCHEDULE EVERY 5 MINUTE
DO
LOAD DATA INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/Importfile.csv'
INTO TABLE customers
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(@first_name,@last_name,@gender,@email_address,@number_of_complaints)
SET first_name = NULLIF(@first_name,''),
last_name = NULLIF(@last_name,''),
gender = NULLIF(@gender,''),
email_address = NULLIF(@email_address,'');
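Error 1314 means LOAD DATA cannot appear in any stored program, and events are stored programs, so there is no way to keep that exact statement inside the event. One workaround (an assumption about your setup, not a fix to the event itself) is to let an external scheduler load the file into a staging table with the mysql client, and keep only the post-processing inside the event, which is allowed. A sketch, with customers_staging as a hypothetical staging table:

DELIMITER //

CREATE EVENT process_staged_customers
ON SCHEDULE EVERY 5 MINUTE
DO
BEGIN
    -- customers_staging is assumed to be filled every 5 minutes by an external
    -- job (Task Scheduler + mysql client) running the LOAD DATA INFILE statement.
    INSERT INTO customers (first_name, last_name, gender, email_address)
    SELECT NULLIF(first_name, ''), NULLIF(last_name, ''), NULLIF(gender, ''), NULLIF(email_address, '')
    FROM customers_staging;

    DELETE FROM customers_staging;
END //

DELIMITER ;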
I am working on a program that will regularly read the data from a .csv file and import it into my database. The CSV is a copy from a database on another server, so the table structure will be different when I upload it to the new one.
What I am unsure of is the best method to do this on a nightly basis, and hopefully automate the process. Any suggestions?
The database is MySQL on an Apache server.
Consider using a LOAD DATA INFILE query in a timed script with PHP, Python, or another language to upload into a temp table:
LOAD DATA INFILE 'filename.csv'
INTO TABLE tempname
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
Then run an append query migrating the differently structured temp table data into the final table:
INSERT INTO finaltable (Col1, Col2, Col3, ...)
SELECT Col1, Col2, Col3, ... FROM tempname;
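Since the two structures differ, the SELECT side can rename, reorder, and convert columns while it copies. A purely illustrative example with invented column names (CONCAT, STR_TO_DATE, and CAST are standard MySQL functions):

INSERT INTO finaltable (customer_name, signup_date, amount)
SELECT CONCAT(first_name, ' ', last_name),
       STR_TO_DATE(created_on, '%m/%d/%Y'),
       CAST(total AS DECIMAL(10, 2))
FROM tempname;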
The best solution in my opinion is to create a PHP script that manipulates the CSV data and matches the format of the file to the tables in your database. After that, you can set up a cron job (Linux) or scheduled task (Windows) to run the script automatically at your desired time and recurrence. Hope this helps you.
I'm a newbie here trying to import some data into my WordPress database (MySQL), and I wonder if any of you SQL experts out there can help.
Database type: MySQL
Table name: wp_loans
I would like to completely replace the data in table wp_loans with the contents of file xyz.csv located on a remote server, for example https://www.mystagingserver.com/xyz.csv
All existing data in the table should be replaced with the contents of the CSV file.
The 1st row of the CSV file contains the table headings, so it can be ignored.
I'd also like to automate the script to run daily at say 01:00 in the morning if possible.
UPDATE
Here is the SQL I'm using to try and replace the table contents:
LOAD DATA INFILE 'https://www.mystagingserver.com/xyz.csv'
REPLACE
INTO TABLE wp_loans
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
IGNORE 1 LINES;
I would recommend a cron job to automate the process, and would normally suggest BCP (bulk copy) to insert the data into a table... but seeing as you are using MySQL, instead of BCP, try LOAD DATA INFILE - https://mariadb.com/kb/en/load-data-infile/
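One caveat: LOAD DATA INFILE reads from the database server's filesystem (or the client machine's, with LOCAL); it cannot fetch an HTTP URL directly. So the cron job would first download xyz.csv to a local path (with curl or wget, for example) and then run SQL along these lines; the /tmp path is an assumption, and the TRUNCATE is there because all existing rows are meant to be replaced:

TRUNCATE TABLE wp_loans;

LOAD DATA LOCAL INFILE '/tmp/xyz.csv'
INTO TABLE wp_loans
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;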
Every day we load around 6GB of CSV files into MySQL using:
LOAD DATA LOCAL INFILE 'file$i.csv' INTO TABLE tableName FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n';
We have 6 files that go through this process, so it takes some time. Since we generate these files ourselves, we're in control of what format they get outputted as.
Originally we chose CSV because this was a smaller process and we needed the data to be moved around and easily read by a non-developer. Now, however, that's not so much of an issue, because the loading time has become so dramatic; we're now talking hours.
Is it quicker to output each row as an INSERT query into a single file and execute that or is CSV still quicker?
We're using the InnoDB storage engine.
If you use MyISAM tables, try ALTER TABLE table_name DISABLE KEYS; before loading the data and ALTER TABLE table_name ENABLE KEYS; after the import is done. This will greatly reduce the time taken for huge data sets.
LOAD DATA is faster than a separate INSERT statement for each row.
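Since the tables here are InnoDB, DISABLE KEYS won't help (it only affects MyISAM's non-unique indexes). What the MySQL documentation suggests for large InnoDB loads is wrapping the load in a single transaction and relaxing checks for the session, roughly like this sketch (only safe if you trust the file, because the checks really are skipped while they are off):

SET autocommit = 0;
SET unique_checks = 0;
SET foreign_key_checks = 0;

-- one of the six daily files, shown with a literal name for illustration
LOAD DATA LOCAL INFILE 'file1.csv' INTO TABLE tableName
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';

COMMIT;
SET foreign_key_checks = 1;
SET unique_checks = 1;
SET autocommit = 1;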
I am importing a csv file into a database table using 'LOAD DATA LOCAL INFILE'. The complete query for curiosity is:
LOAD DATA LOCAL INFILE '".addslashes($current_file)."' REPLACE INTO TABLE $current_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '" . '"' . "' ESCAPED BY " . "'\\\\'" . " LINES TERMINATED BY '\n' IGNORE 1 LINES ($toVars) SET $setCols
Now, what that is doing in a nutshell is loading CSV rows into table_A. What I want to do is create a MEMORY table that will log every single row that is ever entered into table_A. I am going to achieve this using MySQL triggers. Now, I am familiar with using triggers on inserts that are performed. But my question is: how do I make an AFTER trigger for a LOAD DATA LOCAL INFILE command?
The trigger should fire only when the LOAD DATA LOCAL INFILE query has completed everything. It shouldn't fire on each incremental insert or whatever the case may be.
Do you have a table that tracks these bulk uploads, like who ran them, how long they took, etc?
If so, you can use it as a sentinel and put the trigger there to do the bulk operations on the freshly loaded data in the other table (see the sketch below).
Make sense?
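A rough sketch of that sentinel idea; every name here (bulk_loads, after_bulk_load, table_A_log, the id column on table_A) is an assumption for illustration. The loading script inserts one row into bulk_loads after the LOAD DATA LOCAL INFILE finishes, and that single insert fires the trigger once, rather than once per loaded row:

-- Hypothetical tracking table written to once per completed bulk load
CREATE TABLE bulk_loads (
    id INT AUTO_INCREMENT PRIMARY KEY,
    loaded_by VARCHAR(64),
    loaded_at DATETIME DEFAULT CURRENT_TIMESTAMP
);

DELIMITER //

CREATE TRIGGER after_bulk_load
AFTER INSERT ON bulk_loads
FOR EACH ROW
BEGIN
    -- Copies rows from table_A into the log table; selecting only the freshly
    -- loaded rows would need a marker column on table_A, which this sketch omits.
    INSERT INTO table_A_log (a_id, logged_at)
    SELECT id, NOW() FROM table_A;
END //

DELIMITER ;

-- Run by the loading script right after the LOAD DATA LOCAL INFILE completes:
INSERT INTO bulk_loads (loaded_by) VALUES (CURRENT_USER());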