I am currently working on a project that requires an automated export of a CSV file from MySQL. I am using cPanel with phpMyAdmin.
This is not an export of the entire database, so I am unable to simply set a cron task to do a mysqldump. I have a procedure in place that links together the tables I need and a way to run that procedure as a scheduled task, but I now need a way to actually export a CSV with the data that procedure creates and save the CSV file on the server.
Any ideas how to do this please?
I think you will need to write a script to do the custom export.
If you are familiar with PHP, you can use the MySQLi extension to connect to the database: http://php.net/manual/en/book.mysqli.php
Once you have the rows to export, you can write them to a CSV file using: http://php.net/manual/en/function.fputcsv.php
Finally, set up a cron job to run the script as frequently as needed.
For example:
* * * * * /usr/bin/php my_export_script.php
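Putting those pieces together, a minimal sketch of such a script might look like this (the credentials, stored procedure name, and output path are placeholders, not details from the question):
<?php
// Minimal sketch: export the rows produced by a query/procedure to a CSV file on the server.
// Host, credentials, procedure name, and output path below are placeholders.
$mysqli = new mysqli('localhost', 'db_user', 'db_password', 'db_name');
if ($mysqli->connect_errno) {
    die('Connection failed: ' . $mysqli->connect_error);
}

// Run the query (or call the stored procedure) that produces the rows to export.
$result = $mysqli->query('CALL my_export_procedure()');

$fp = fopen('/home/user/exports/export.csv', 'w');

// Optional header row built from the result set's column names.
fputcsv($fp, array_map(function ($field) { return $field->name; }, $result->fetch_fields()));

// Write each data row as one CSV line.
while ($row = $result->fetch_assoc()) {
    fputcsv($fp, $row);
}

fclose($fp);
$mysqli->close();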
I'm trying to import a pretty big .sql file with DataGrip. It's called file.sql and I need it in the schemas section. But it only allows me to upload it under a schema file. Will it still work?
Yes, it'll be fine.
Also, there are several ways to run "big" files:
Attach the file to the project and execute Run
Run it from the Database tool window via Run SQL Script, or use the mysql command-line client (see the example below)
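For the mysql command-line route, a typical invocation looks like this (the user name and database name here are placeholders):
mysql -u your_user -p your_database < file.sql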
I have a data dump file (.sql) containing certain data. I would like to import it into my database (currently SQLite3, but I'm thinking about changing to MySQL) in order to use it in my test website.
Also, I need to know if it's possible to add the models automatically. I presume they need to be added manually, but anyhow, if there is any way to solve this, please suggest it.
There is a way to generate the Django models automatically, given that you have an existing database (see the command below).
However, this is a shortcut. You may then fine-tune your models and add them to your apps as needed.
Structuring your models across apps might force you to use the model's db_table Meta option.
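The management command Django provides for this is inspectdb; a typical run (the output file name is just an example) looks like:
python manage.py inspectdb > models.py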
If at some point you would like to switch databases (SQLite3 -> MySQL), you can export (dump) your current data to JSON. Then you can import (load) the data into the new database (after creating the database tables with the migrate command). To do this you can use Django's management commands (examples below):
Dump data
Load data
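As a sketch, the two commands look like this (the fixture file name data.json is just an example):
python manage.py dumpdata > data.json
python manage.py loaddata data.json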
I was able to find an alternative answer after researching a bit.
Since I have a PostgreSQL data dump file with the '.sql' extension, I was able to run a single command that imported the whole dump into my local PostgreSQL database. I'm using pgAdmin 4 as my database management system; psql was installed along with pgAdmin 4, and I added it to my command prompt's PATH so it was accessible.
In order to import the data dump, I used the command below:
psql -U <username> -d <database_name> < <file.sql>
The '<' after <database_name> is necessary, so be sure to include it.
Here <username> is the username of the configured account, <database_name> is the database to which the data dump should be added, and <file.sql> is the file containing the data dump.
I am trying to automate the upload of data to a MySQL database. I am using MySQL Workbench on a Windows PC to remotely access my database on AWS. I have a SQL file that I use to load a CSV file into the db using LOAD DATA LOCAL INFILE. My CSV file is created daily by a scheduled task, and I would like to load it into the db using a batch-type file and a scheduled task.
Is it possible?
On Windows you can use PHP from WampServer; it's a very straightforward installation. You don't need MySQL Server on your local PC to update the remote AWS database with the data, only a scripting language.
I would still suggest installing MySQL on your local PC so you can first check, on that local MySQL instance, whether the update does what you expect it to do. Once it meets your expectations, you just change the MySQL connection parameters to those of AWS and you're set.
In MySQL Workbench you can add an additional connection to the local MySQL server to check the local database and all the changes applied to it.
Perhaps this linked example can help you take the first steps in writing a PHP script to update the database.
PHP scripts can be executed from the command line as well, so once you have written the script that updates the database, you should be able to run it from the Windows CMD console this way:
php -f path-to-your-script.php
If you go this way you need to write the PHP script so that it already knows where the CSV file is and reads its contents, perhaps with file_get_contents(), or you can also give fgetcsv() a try, a function dedicated to CSV files that is even more suitable because it reads your CSV file line by line, so with a loop you can process even very big CSV files without running out of memory. A sketch of that approach follows below.
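A minimal sketch of that approach, assuming placeholder connection details, a local CSV path, and a two-column target table (all of these names are made up for the example):
<?php
// Minimal sketch: read a CSV file line by line and insert each row into a remote MySQL table.
// Host, credentials, file path, table and column names below are placeholders.
$mysqli = new mysqli('your-aws-host', 'db_user', 'db_password', 'db_name');
if ($mysqli->connect_errno) {
    die('Connection failed: ' . $mysqli->connect_error);
}

$stmt = $mysqli->prepare('INSERT INTO my_table (col_a, col_b) VALUES (?, ?)');

$fp = fopen('C:\\data\\daily_export.csv', 'r');

// Read one CSV row at a time so large files do not exhaust memory.
while (($row = fgetcsv($fp)) !== false) {
    $stmt->bind_param('ss', $row[0], $row[1]);
    $stmt->execute();
}

fclose($fp);
$stmt->close();
$mysqli->close();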
I need to insert data into a MySQL table. The data usually comes from an Oracle database via a SQL file which has a query to get the data from an Oracle table.
So I need to automate this process on a daily basis using a shell script. It will run the SQL file against Oracle and get the data, and the data then has to be moved to a specific MySQL table.
So the operation looks like:
Connecting to Oracle server.
Executing SQL in that server.
Moving the collected data to the MySQL table.
I want to run that operation as a cron job on a daily basis.
I am working in a LAMPP environment.
Here are my questions:
Are there any standard tools available to do that?
Can we achieve this using a shell script? If so, please suggest the steps.
Or it would be great if you could suggest your own optimized way.
Thanks,
Raja.
You can achieve this using a cron job:
Write a SQL file which exports the data from the Oracle DB, i.e. a SQL file containing an export table command with an appropriate WHERE clause
Write a SQL file which imports the data into the MySQL DB, i.e. a similar import file command in MySQL
Write a shell script which runs both these SQL files and verifies the data in the MySQL table
Schedule a cron job to run this shell script daily at a specific hour.
Please check the import/export file formats on Oracle and MySQL, which can differ since one is freeware and the other is commercial. If there are differences then you will need some data/file modification; otherwise this should be enough. A rough sketch of such a script is shown below.
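Sketch only, assuming sqlplus and the mysql client are installed, the Oracle script spools MySQL-compatible INSERT statements, and all credentials, connection strings, and file names below are placeholders:
#!/bin/sh
# Sketch: run the Oracle export script, then feed its output to MySQL.
# Credentials, the ORCL connection string, and the file names are placeholders.
sqlplus -S oracle_user/oracle_password@ORCL @export_from_oracle.sql > /tmp/oracle_data.sql
mysql -u mysql_user -pmysql_password my_database < /tmp/oracle_data.sql

A matching cron entry to run the wrapper daily at 02:00 (path is a placeholder):
0 2 * * * /path/to/oracle_to_mysql.sh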
Is there any command to export an Access database table to MySQL using the command line?
I know we can do it manually, but I want to do it automatically on a specific schedule, so if there is such a command we can run it from the command line.
Once that is done we can create a batch file and schedule it.
Can anyone suggest which approach is best?
Thank you.
You can use the Data Import feature (MS Access format) in dbForge Studio. The command line is supported for Data Import. Just create an import template file and use it in your scheduled task. Try the trial version.