Is there a way to automate importing data from an Excel/CSV file into a MySQL database using SQLyog, so that it updates automatically every time?
Use Database -> Import -> Import External Data.
Automatically each time what happens, though? It may be easier to run the LOAD DATA MySQL command from a cronjob.
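A minimal sketch of that cron approach, assuming a hypothetical table my_table and a server-side file /data/import.csv (adjust paths, delimiters, and credentials to your setup):

    -- import.sql: load rows from the CSV into my_table
    LOAD DATA INFILE '/data/import.csv'
    INTO TABLE my_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;  -- skip the header row

A crontab entry such as 0 * * * * mysql -u myuser -pmypassword mydb < /path/to/import.sql would then run it hourly; all names here are placeholders.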
For an "update" by CSV check out
http://www.softwareprojects.com/resources/programming/t-how-to-use-mysql-fast-load-data-for-updates-1753.html
Step 1: Use LOAD DATA to import the entire input file into a temporary table, and update the primary key ids for any existing records we already have
Step 2: Use SELECT INTO OUTFILE to write all records we couldn't find an id for (new records) into a temporary text file
Step 3: Use LOAD DATA to add all new records (from step 2) into the target table
Step 4: Use a single UPDATE query, to update all records that match an existing primary key
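A rough SQL sketch of those four steps, assuming a hypothetical target table target(id, email, name) where email is the natural key used to match existing rows (all names and paths are placeholders):

    -- Step 1: load the input file into a temporary table, then fill in
    -- the primary key id for rows that already exist in target
    CREATE TEMPORARY TABLE tmp_import (
        id    INT NULL,
        email VARCHAR(100),
        name  VARCHAR(100)
    );
    LOAD DATA INFILE '/tmp/input.csv'
    INTO TABLE tmp_import
    FIELDS TERMINATED BY ','
    (email, name);

    UPDATE tmp_import t
    JOIN target g ON g.email = t.email
    SET t.id = g.id;

    -- Step 2: write the rows we found no id for (new records) to a file
    SELECT email, name
    INTO OUTFILE '/tmp/new_records.csv'
    FIELDS TERMINATED BY ','
    FROM tmp_import
    WHERE id IS NULL;

    -- Step 3: load the new records into the target table
    LOAD DATA INFILE '/tmp/new_records.csv'
    INTO TABLE target
    FIELDS TERMINATED BY ','
    (email, name);

    -- Step 4: update every record that matched an existing primary key
    UPDATE target g
    JOIN tmp_import t ON t.id = g.id
    SET g.name = t.name;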
SQLyog is a standalone application and can't automate the importing process by itself.
You can create a cronjob that runs a LOAD DATA INFILE command against your server.
I have to load several CSVs into some tables with MySQL LOAD DATA INFILE, and I want to save the discarded records that could not be loaded (because of failed FKs, duplicates, etc.) in a discard file, as Oracle SQL*Loader does.
Any suggestion?
Thanks!
You can import data easily using MySQL Workbench's import wizard for CSV files. It has the option to import all the data and create a new table. From there you can alter the tables later on, add any indexes you need, or change data types on the go.
Another option is to use LOAD DATA commands as usual. Import the data into newly created tables without the foreign keys. You can map CSV columns to specific table columns as well; see https://dev.mysql.com/doc/refman/8.0/en/load-data.html
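For instance, a hedged sketch with a hypothetical products table, where the second CSV column is discarded and the rest are mapped explicitly:

    LOAD DATA INFILE '/tmp/products.csv'
    INTO TABLE products
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES                 -- skip the header row
    (sku, @discard, name, price);  -- @discard drops the unwanted CSV column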
I cannot find a function that reverse engineers a MySQL create AND insert script and places the inserts into the tables created inside the model.
When I go to Import -> Reverse Engineer MySQL Create Script, it does not place the insert statements into the table inserts.
The only similar option I found was importing records from an external file, but it defaults to CSV, and if I change the file type to All Files, nothing happens when I open a .sql file with multiple inserts inside it.
Is there any way that I can upload an xlsx file to my MySQL database automatically every 12 hours?
I have an xlsx file with around 600 rows. The target table already exists.
I would like to perform the following steps:
1. Delete the content of the existing table.
2. Insert the data from the xlsx file.
This should be performed every 12 hours. Is there a way to do this without using PHP?
Thanks in advance.
Yes. You can use LOAD DATA LOCAL INFILE, provided that the file is in CSV format; otherwise, convert the file to CSV first.
Delete the content of the existing table.
Before you do so, take a backup of the table. You can create an intermediary backup table and insert the data there.
Insert the data from the xlsx-file.
Use LOAD DATA INFILE to import the data.
This should be performed every 12 hours.
You can create a SQL script with all these steps, then create a scheduled task (Windows) which runs every 12 hours.
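A sketch of such a script, assuming the xlsx file has already been converted and saved as C:/data/export.csv and the target table is called target_table (both placeholders):

    -- keep a copy of the current contents in an intermediary backup table
    DROP TABLE IF EXISTS target_table_backup;
    CREATE TABLE target_table_backup AS SELECT * FROM target_table;

    -- 1. Delete the content of the existing table.
    TRUNCATE TABLE target_table;

    -- 2. Insert the data from the converted file.
    LOAD DATA LOCAL INFILE 'C:/data/export.csv'
    INTO TABLE target_table
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES;

The scheduled task would then run something like mysql --local-infile=1 -u myuser -pmypassword mydb < C:\scripts\reload.sql every 12 hours (again, all names are placeholders).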
You can do it using the Data Import tool in dbForge Studio for MySQL (command-line mode).
How to:
Create a data-import template file: open the Data Import wizard, select the target table, check Repopulate mode (delete all + insert), and save the template file.
Use the created template to import your file in command-line mode. Use Windows Scheduled Tasks to run it periodically.
I have a MySQL Server which has one database called "Backup".
It only has one table with the name "storage".
In the Backup db, the storage table contains about 5 million data rows.
Now I wanted to append new rows to the table by using the "source" command in the MySQL command line.
What happened is that source loaded all the new rows into the table, but it overwrote the existing entries (it seems that it first deleted all the data).
What I have to say is that the SQL file that I want to import comes from another server, where the table has the same name and structure as "storage".
What I want is to append the new entries that are in the SQL file to the ones in my database. I do not want to overwrite them.
The structure of the two tables is exactly the same. As the name says, I use the Backup database for backup purposes, so that from time to time I can back up my data.
Has anyone an idea how to solve this?
Look in the .sql file you're reading with the SOURCE command, and remove the DROP TABLE and CREATE TABLE statements that appear there. They are the cause of your table being overwritten; what's actually happening is that the table is being replaced.
You could also look into using SELECT ... INTO OUTFILE and LOAD DATA INFILE as a faster and less potentially destructive way to get data from one server to the other in a file.
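A minimal sketch of that OUTFILE/INFILE approach, with placeholder paths; the IGNORE keyword makes LOAD DATA skip rows whose key already exists instead of replacing them:

    -- on the source server: export the table to a file
    SELECT * INTO OUTFILE '/tmp/storage_dump.csv'
    FIELDS TERMINATED BY ','
    FROM storage;

    -- on the backup server: append, skipping duplicate-key rows
    LOAD DATA INFILE '/tmp/storage_dump.csv'
    IGNORE INTO TABLE storage
    FIELDS TERMINATED BY ',';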
I am trying to update one of my SQL tables with new columns from my source CSV file. The records in this CSV file are already in the SQL table, but the table is lacking some of the new columns from the file.
I already added the new columns to my SQL table structure via ALTER TABLE. But now I just need to import the data from this CSV file into the new columns. How can I do this? I am trying to use SSIS and SQL Server to accomplish this, but am pretty new to Excel.
This is probably too late to solve salvationishere's problem; though I'm posting this for future readers!
You could just generate the SQL INSERT/UPDATE/etc. commands by parsing the CSV file (a simple Python script will do).
You could alternatively use this online parser:
http://www.convertcsv.com/csv-to-sql.htm
(Hoping that it'd still be available when you click!)
to generate your SQL commands. The interface is extremely straightforward and it does the entire job in an awesome way.
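As an illustration, for a hypothetical CSV with columns id,city,zip feeding two new columns, the generated statements might look like:

    -- one generated statement per CSV row (table and columns are made up)
    UPDATE my_table SET city = 'Boston', zip = '02101' WHERE id = 1;
    UPDATE my_table SET city = 'Denver', zip = '80201' WHERE id = 2;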
You have several options:
If you are loading the data into a non-production system where you can edit the target tables, you could load the data into a new table, rename the old table to obsolete, and rename the new table to the old table name.
You can load the data into a staging table and then write a SQL statement to update the target table from the staging table (see the sketch after this list).
You can open the CSV file in Excel and write a formula to generate an update script, drag the formula down across all rows so that you get a separate update statement for each row, and then run the statements in Management Studio.
You can truncate the target table and update your existing SSIS package that imports the file to use the new columns, provided your CSV file contains the full history.
There are more options, but any of the above would probably be more than adequate solutions.
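Here is a hedged T-SQL sketch of the staging-table option mentioned above, with hypothetical names; it assumes the CSV has already been loaded into dbo.staging (e.g., by the SSIS import wizard) and that both tables share an id key:

    -- copy the new columns from staging into the existing table, by key
    UPDATE t
    SET t.new_col1 = s.new_col1,
        t.new_col2 = s.new_col2
    FROM dbo.target_table AS t
    JOIN dbo.staging AS s
        ON s.id = t.id;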