I have three or so weekly generated .csv files that are automatically placed in a particular directory on my MySQL box.
What I would like to do is run some sort of automated script that opens the MySQL command line and executes a query to delete all records in the corresponding table, then LOAD DATA INFILE from the CSVs.
Since I'm on Windows I can't use a cron job, but I was thinking I could write some sort of batch script and run it as a Scheduled Task.
Any idea how I would go about doing this?
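A minimal sketch of that batch-plus-Scheduled-Task idea. The table name, CSV path, and credentials below are all assumptions, and the .sql file only covers one of the three weekly files:

```shell
# Write the reload script that the scheduled task will execute
# (hypothetical table and path names).
cat > reload_weekly.sql <<'SQL'
-- wipe the table, then bulk-load this week's file
DELETE FROM weekly_table1;
LOAD DATA INFILE 'C:/csv_drop/weekly1.csv'
  INTO TABLE weekly_table1
  FIELDS TERMINATED BY ',' ENCLOSED BY '"'
  LINES TERMINATED BY '\r\n'
  IGNORE 1 LINES;  -- drop this line if the CSVs have no header row
SQL
# The Scheduled Task then only needs to run one line (put it in a .bat file):
#   mysql.exe --user=youruser --password=yourpass yourdb < reload_weekly.sql
```

If the goal is simply a full wipe before each load, TRUNCATE TABLE is usually faster than DELETE.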
Depending on your MySQL version, you could use CREATE EVENT to schedule database statements; check:
http://dev.mysql.com/doc/refman/5.1/en/create-event.html
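A sketch of such an event (the table name and schedule are assumptions). Two caveats: the event scheduler must be switched on, and LOAD DATA INFILE is one of the statements MySQL does not permit inside stored programs such as events, so the event can cover the DELETE part while the file load itself stays in an external script or Scheduled Task:

```shell
# Save the event definition and feed it to the mysql client
# (hypothetical table name and schedule).
cat > weekly_event.sql <<'SQL'
SET GLOBAL event_scheduler = ON;
CREATE EVENT weekly_cleanup
  ON SCHEDULE EVERY 1 WEEK
  DO DELETE FROM weekly_table1;
SQL
# Then run it once against the server:
#   mysql -u youruser -p yourdb < weekly_event.sql
```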
I am trying to automate the upload of data to a MySQL database. I am using MySQL Workbench on a Windows PC to remotely access my database on AWS. I have a SQL file that I use to load a CSV file into the DB using LOAD DATA LOCAL INFILE. My CSV file is created daily by a scheduled task, and I would like to load it into the DB using a batch-type file and a scheduled task.
Is it possible?
On Windows you can use PHP from WampServer; it's a very straightforward installation. You don't need a MySQL server on your local PC to update the remote AWS database with the data, only a scripting language.
I would suggest installing MySQL on your local PC so you can first check, against that local MySQL, that the update does what you expect. Once it meets your expectations, you just change the MySQL connection parameters to those of AWS and you're set.
In MySQL Workbench you can add an additional MySQL server connection for localhost to inspect the local database and all the changes applied to it.
Perhaps this example link can help you take the first steps in writing a PHP script to update a database.
PHP scripts can be executed from the command line as well, so once you have written the script that updates the database you should be able to run it from the Windows CMD console this way:
php -f path-to-your-script.php
but if so, you need to edit the PHP script so that it already knows where the CSV file is and reads its contents, perhaps with file_get_contents(); or you can try fgetcsv(), a function dedicated to CSV files, which is even more suitable because it reads your CSV file line by line, so with a loop you can process even very big CSV files without running out of memory.
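To illustrate the line-by-line idea, here is the same pattern sketched in shell (in PHP the loop body would be a fgetcsv() call instead); the sample file and column names are made up:

```shell
# Create a tiny sample CSV (stand-in for the real daily file).
cat > sample.csv <<'EOF'
id,name
1,alpha
2,beta
EOF
# Skip the header, then read one row at a time; memory use stays
# constant no matter how large the file grows.
tail -n +2 sample.csv | while IFS=, read -r id name; do
  # In the real script this line would become an INSERT into MySQL.
  echo "row: id=$id name=$name"
done
```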
I am trying to create a shell script as part of my cron job for my MySQL daily backups. I am using Webmin as the GUI interface, and I was told to insert a small shell script into the Command to run after backup option in Webmin, per noisemarine's response in this post: https://www.virtualmin.com/node/54190.
Currently, I have to manually create a folder named after the current date (e.g. 6/6/2018) and move my four MySQL database backup files into this newly created folder. If I don't, the backup files will be overwritten by the next day's backup.
All my MySQL backups are stored in /etc/mysql/MySQL Backups/. Since I am still new to this, I need help knowing which commands/script to use for the following:
Automatically create a new folder each day, named after the current date (e.g. 6/6/2018), in the /etc/mysql/MySQL Backups/ directory.
Automatically move all four .sql backup files into that newly created folder after each automatic MySQL backup (e.g. the cron job automatically backs up MySQL, and the command in the script then moves the backup files into the 6/6/2018 folder).
I tried using the following commands separated by a ; as noted by noisemarine, but to no avail:
BDIR=/etc/mysql/MySQL Backups/ ; DATE=`date +%F ; mkdir -p $BDIR/$DATE ;mv $BDIR/*sql $BDIR/$DATE/
I don't know what's wrong with this picture.
Again, it looks like all I need to do is insert a list of commands in the box titled Command to run after backup option in Webmin (see attached image for visual representation).
Any help would be greatly appreciated!
Thank you!
Webmin Configuration screenshot
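For reference, a corrected version of that one-liner: the path contains a space, so $BDIR must be quoted, and the backtick after date +%F was never closed. Note that %F produces dates like 2018-06-06, which conveniently avoids the / characters a folder named 6/6/2018 would require. Demonstrated here in a scratch directory so it can be run safely; in Webmin you would set BDIR="/etc/mysql/MySQL Backups" instead:

```shell
# Scratch-directory demo of the corrected post-backup command.
BDIR="$(mktemp -d)/MySQL Backups"
mkdir -p "$BDIR"
# Stand-ins for the four daily dump files.
touch "$BDIR/db1.sql" "$BDIR/db2.sql" "$BDIR/db3.sql" "$BDIR/db4.sql"
# The corrected one-liner: quoted paths, $(...) instead of an unclosed backtick.
DATE=$(date +%F); mkdir -p "$BDIR/$DATE"; mv "$BDIR"/*.sql "$BDIR/$DATE"/
# Show that the dumps now live in the dated folder.
ls "$BDIR/$DATE"
```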
For my database systems assignment, after we create some things in MySQL, we need to record our executions in a script file to print off to show we actually did it.
I know how to create a script file in Linux, but I cannot see how to do one in Windows.
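If the Linux method in question is the script(1) command, the closest equivalent that works on Windows is the mysql client's own tee command, which logs the whole session (statements and their output) to a file. The file name below is just an example:

```text
Option 1: start the client with logging already on:
  mysql -u student -p --tee=assignment1_log.txt

Option 2: toggle logging inside an open session:
  mysql> tee assignment1_log.txt
  ...run your statements...
  mysql> notee
```

The resulting text file can then be opened in Notepad and printed.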
In a MySQL database, how can I populate a specific table with the data from an Excel file every day?
I want this job to run every morning at 8:00 AM.
How can I do this?
You cannot do that with MySQL Workbench. You need some external timer service (a cron job on Linux/macOS, the Task Scheduler on Windows) to run a batch file that performs the import.
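A sketch of such a batch import, assuming the Excel sheet is first saved or exported as CSV (the MySQL server cannot read .xlsx files directly); every name and path here is an assumption:

```shell
# Import script the scheduler will run every morning
# (hypothetical table, path, and credentials).
cat > daily_import.sql <<'SQL'
TRUNCATE TABLE daily_table;
-- LOCAL reads the file from the client machine; requires local_infile
-- to be enabled on both server and client.
LOAD DATA LOCAL INFILE 'C:/data/daily_export.csv'
INTO TABLE daily_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
SQL
# On Windows, register it with Task Scheduler for 08:00 daily, e.g.:
#   schtasks /Create /SC DAILY /ST 08:00 /TN "mysql-daily-import" ^
#     /TR "cmd /c mysql -u youruser -pyourpass yourdb < C:\data\daily_import.sql"
```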
I need to insert data into a MySQL table. The data usually comes from an Oracle database, via a SQL file containing a query that fetches data from an Oracle table.
So I need to automate this process on a daily basis using a shell script: it will run the (Oracle) SQL file to get the data, and the data then has to be moved into a specific MySQL table.
So the operation is:
Connecting to the Oracle server.
Executing the SQL on that server.
Moving the collected data to the MySQL table.
I want to run that operation as a daily cron job.
I am working in a LAMPP environment.
Here are my questions:
Are there any standard tools available to do this?
Can we achieve this using a shell script? If so, please suggest the steps.
Or it would be great if you could suggest your own optimized way.
Thanks,
Raja.
You can achieve this using a cron job:
Write a SQL file which exports the data from the Oracle DB, i.e. a SQL file containing an export command with the appropriate WHERE clause.
Write a SQL file which imports the data into the MySQL DB, i.e. a similar import command for MySQL.
Write a shell script which runs both of these SQL files and verifies the data in the MySQL table.
Schedule a cron job to run this shell script daily at a specific hour.
Please check the import/export file formats on Oracle and MySQL, which can be different since one is free and the other is commercial. If there are differences you will need some data/file modification; otherwise this should be enough.
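The steps above can be sketched as a single shell script. All connection strings, table names, and paths are assumptions; SET MARKUP CSV ON needs SQL*Plus 12.2 or later (older versions would build the CSV with SET COLSEP instead):

```shell
# Generate the daily transfer script (hypothetical names throughout).
cat > oracle_to_mysql.sh <<'EOF'
#!/bin/sh
set -e
# 1. Export from Oracle: spool the query result as CSV.
sqlplus -s scott/tiger@ORCL <<'SQL'
SET MARKUP CSV ON
SET FEEDBACK OFF
SPOOL /tmp/daily_extract.csv
SELECT id, name, updated_at FROM source_table
WHERE updated_at >= TRUNC(SYSDATE) - 1;
SPOOL OFF
SQL
# 2. Import into MySQL (IGNORE 1 LINES skips the CSV header sqlplus wrote).
mysql -u appuser -psecret appdb <<'SQL'
LOAD DATA LOCAL INFILE '/tmp/daily_extract.csv'
INTO TABLE target_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
IGNORE 1 LINES;
SQL
# 3. Quick verification: print the resulting row count.
mysql -u appuser -psecret -N -e 'SELECT COUNT(*) FROM appdb.target_table'
EOF
chmod +x oracle_to_mysql.sh
# Schedule it daily, e.g. at 02:00, with a crontab entry:
#   0 2 * * * /path/to/oracle_to_mysql.sh
```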