Automate PuTTY To Do a Daily Task - mysql

I am fairly new to PuTTY and need your help. Almost every day I connect to a Linux server using PuTTY, then connect to a second Linux server from within the first one, as it holds the database, then log in to MySQL and take a database backup.
I want to automate this process by creating a .bat file, as I am on Windows 7. I was able to log in through PuTTY and then into MySQL, but the console disappears after that. The steps are:
1. log in to the first server through PuTTY with username and password
2. SSH to another server and log in with username and password
3. log in to MySQL, then take a DB backup using mysqldump
4. copy the DB backup file to the desktop
Thanks in advance

When your goal is just making the backup, you do not need to automate PuTTY. You can write a Unix shell script that calls mysqldump and writes the backup to a file. Once that script works, you can add it to crontab (the Unix scheduler) so it runs every day.
You will need some extra testing before the crontab job works well: the environment of a cron job is different from that of an interactive session.
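A minimal sketch of such a script, with placeholder credentials, database name, and paths that you would replace with your own:

#!/bin/sh
# backup_db.sh - dump the database into a dated file (names and paths are examples)
DB_NAME=yourdb
BACKUP_DIR=/home/youruser/backups
mysqldump -u youruser -p'yourpassword' "$DB_NAME" > "$BACKUP_DIR/${DB_NAME}_$(date +%F).sql"

A matching crontab entry (edit with crontab -e) to run it every day at 02:30 would look like:

30 2 * * * /home/youruser/backup_db.sh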
Edit: I did not answer the last part:
And 4. copy the DB backup file to the desktop
There are different ways to transport the backup. You can use a mounted drive, a shared directory, or a transfer tool such as rsync or scp. I cannot tell which fits best in your situation.
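Since you are on Windows with PuTTY already installed, one way to get the file onto your desktop is pscp, the command-line copy tool that ships with PuTTY. A sketch with placeholder host, user, and file names; it assumes the server holding the backup is reachable directly from your Windows machine (if it is only reachable through the first server, copy the file to that server with scp first and fetch it from there):

pscp youruser@dbserver.example.com:/home/youruser/backups/yourdb_2016-01-01.sql "%USERPROFILE%\Desktop"

You can put this line in a .bat file and run it whenever you want to fetch the latest dump.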

Related

How to migrate 10GB old table to new table? [duplicate]

I am using phpMyAdmin on my Windows OS. I have a database with one table that has 100M records and a size of about 20GB. I want to export this table and end up with a table.sql file. Whenever I try to do this, the size of the exported file is 0 bytes. When I check the Apache error log, the following shows up:
Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 1066139648 bytes)
Any idea how to solve this problem?!
Thanks :)
I would suggest trying the command line and the mysqldump.exe utility, as suggested here.
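For example, a sketch of what that could look like on Windows, assuming an XAMPP-style install path and placeholder database and table names:

"C:\xampp\mysql\bin\mysqldump.exe" -u root -p your_database your_table > C:\backup\your_table.sql

Because mysqldump streams the rows straight to the file, it does not run into the PHP memory limit that phpMyAdmin hits.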
If you have shared hosting and you are using cPanel, it provides an option to back up your database in the following section:
Files => Backup => Download a MySQL Database Backup.
If you are on shared hosting or you don't have shell access, then use the MySQLDumper script: copy it to your server and start it in your browser under "yourDomain.com/path_to_mysqldumper/".
MySQLDumper is a PHP and Perl based tool for backing up MySQL databases. You can easily dump your data into a backup file and - if needed - restore it. It is especially suited for shared hosting webspaces, where you don't have shell access.
If you have shell access to your host server (not all shared hosting providers give this access), then you can use SSH access as in this tutorial, using PuTTY, which you install and configure, and then import or export your databases as in this third tutorial.
I tried mysqldump for many hours but it didn't work, until I started a superuser console.
First, start a superuser console
sudo su
Then, try the complete command
/opt/lampp/bin/mysqldump -u root -p [DATABASE NAME] > [PATH_FOR_BACKUPFILE]/[FILE_NAME].sql
In my case, it was something like /opt/lampp/bin/mysqldump -u root -p database > /home/user/backup.sql
MySQLDumper worked like a charm for me on my hosted website. I had to copy one database and "paste" it into a new database. In MySQLDumper, it isn't apparent right away how to do this, but the key is to create a new configuration file in MySQLDumper, which will allow you to copy/restore to a different database.
On the home screen in MySQLDumper, click Configuration, then Configuration Files. There is a text box at the top allowing you to create a new Configuration file. In there, put in the information for the second database you need (you created a connection to the first database when you install MySQLDumper). Save it. Then you can click Restore where you can select the dump of the first database and restore it in the second one.
This was a lifesaver. Thanks!
Increase the post_max_size variable in the php.ini file. Then you will be able to download it.
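For reference, a sketch of the relevant php.ini settings; the values are only examples, and for an export that dies with a memory error, memory_limit is usually the setting that matters as well (restart Apache after changing them):

; php.ini - example values, not recommendations
memory_limit = 2048M
post_max_size = 256M
upload_max_filesize = 256M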
I had a different issue: when downloading from phpMyAdmin, the download would stop partway through (around 180 MB) with a "network error" message.
So I used an SSH connection, which you can sometimes find in your cPanel as a browser-based terminal; otherwise you have to access it using PuTTY.
In the terminal I went into my public_html folder, where all my files are stored, and ran the following command:
mysqldump -u [username] -p [database-you-want-to-dump] > [path-to-place-data-dump.sql]
This did the job in a few minutes and saved a .sql file in my public_html folder. Then I opened the folder in the file manager and downloaded it from there.
You can also use FTP, or download it directly by accessing its URL.
Make sure you delete it after your download finishes.

Automated download of whole mysql database without mysqldump

On my shared web host, I have very limited SSH access (only via the i-MSCP InstantSSH plugin). I want to set up a script to download my whole MySQL database as an .sql file, but I can't figure it out.
I can't use mysqldump, so I tried the mysql command, which is available, but it isn't working.
I have to specify username, password, host, and database, and my password contains special characters.
Can anyone help me?
If your shared host allows remote MySQL connections, then you can use any MySQL client software to connect to the database and extract whatever information you need. Tools like these are: HeidiSQL, NavicatGui, etc.
Another way would be the one suggested by Akshay Khale.
A third one would be to use phpMyAdmin (most shared web hosts have this installed by default).
A fourth would be to create a simple PHP script that runs mysqldump locally and saves the .sql dump file either locally on your shared hosting or remotely via FTP/SFTP or any other protocol. The same script can also email you that file (inline or as an attachment). This kind of automation can be configured using a cron job.
There are multiple ways to achieve this. It all depends on which one suits you best.
As the author of the InstantSSH plugin for i-MSCP, I can tell you that you should ask your hosting provider to make the mysqldump command available in your chrooted shell. The host administrator can add any command to the chroot by editing the InstantSSH plugin configuration file.
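Once mysqldump is available in the chroot, a daily cron job on the shared host could look roughly like this sketch; the credentials, database name, and path are placeholders, and wrapping the password in single quotes keeps its special characters intact:

#!/bin/sh
# backup.sh - dump the whole database into a dated, compressed file
mysqldump -h localhost -u youruser -p'y0ur+P@ss!' yourdb | gzip > "$HOME/yourdb_$(date +%F).sql.gz"

Crontab entry (crontab -e) to run it every night at 03:00:

0 3 * * * /home/youruser/backup.sh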

Export mysql database from a dead hard drive - xampp

I have a dead hard drive. I've connected it with a SATA-to-USB IDE adapter and can recover files; the filesystem also looks good. How can I get a dump of a database from that hard drive? The laptop I'm using also has MySQL installed, via XAMPP. I've tried the following command:
G:/xampp/mysql/bin/mysqldump -u root -p uma > D:/umaoldbackup.sql
This is not giving a dump of the latest data, and I think it's dumping from my local computer.
Please help.
Correct, mysqldump connects to the running MySQL Server process on your local computer, not the data on your sick hard drive.
MySQL client applications like mysqldump do not read the data files directly. They connect to a MySQL Server process and make requests for the data. Before you can access that data, you need to restore the data files to the data directory of an instance of MySQL Server.
1. Stop the MySQL service.
2. Copy the recovered data files into the data directory of your MySQL service. Move any existing data files somewhere safe first, if you want to bring that data back after exporting your umaoldbackup.
3. Start the MySQL service so it can read the files in that data directory.
You should probably get someone to do this for you if you don't know how to start and stop services on Windows.
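A rough sketch of those steps from the Windows command line, assuming MySQL is registered as a service named "mysql" and that the recovered drive is mounted as G: (with XAMPP you may instead have to stop and start MySQL from the XAMPP control panel, and for InnoDB tables you also need the ibdata and ib_logfile files, not just the database folder):

net stop mysql
:: keep the current data directory safe before replacing anything
move C:\xampp\mysql\data C:\xampp\mysql\data_old
:: copy the recovered data directory from the old drive
xcopy /E /I G:\xampp\mysql\data C:\xampp\mysql\data
net start mysql
C:\xampp\mysql\bin\mysqldump -u root -p uma > D:\umaoldbackup.sql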
Re your comment about where the data directory is...
I'm not a user of Windows or XAMPP.
It's probably actually C:\xampp\mysql\data according to What is the exact location of Mysql database tables in XAMPP folder?
But you can confirm that by connecting to your current MySQL service with a client and running the following query:
SELECT @@datadir;

Update my remote MySQL database with my local MySQL database

I have a local Perl script that does a lot of parsing of web pages and then successfully updates my local MySQL database (WAMP server). I now want to send this local data to my remote server, but remotely connecting to my database isn't allowed with my hosting company. Unfortunately I never thought of that problem.
So, I now need to find an automated way to update my remote server (every 15mins). I mistakenly thought I could just edit my Perl script with the details of the remote server.
I am aware that I could use CGI or PHP to do the parsing on the server, but I really want to keep the parsing local for now.
Summary:
Local MySQL database -> remote MySQL database every 15mins ??
Any ideas what I can do?
Thanks :-)
If replication is not an option but you can still establish an SSH connection from the local box to the remote box, then:
1. run mysqldump to export the data into a file: http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html#option_mysqldump_where
2. scp the file to the remote box
3. import it there: mysql -u username -p database_name < dumpfile.sql (if you pass the password on the command line, it must follow -p with no space, e.g. -ppassword)
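Put together, and wrapped in a script that cron could run every 15 minutes, it could look roughly like this sketch; hostnames, credentials, and paths are placeholders, and key-based SSH authentication is assumed so that no password prompt blocks the cron job (on a WAMP machine you would use the Windows equivalents, e.g. the Task Scheduler and pscp/plink from the PuTTY suite):

#!/bin/sh
# sync_db.sh - export locally, copy to the remote box, import it there
mysqldump -u localuser -p'localpass' localdb > /tmp/localdb.sql
scp /tmp/localdb.sql remoteuser@remote.example.com:/tmp/localdb.sql
ssh remoteuser@remote.example.com "mysql -u remoteuser -p'remotepass' remotedb < /tmp/localdb.sql"

Cron entry (crontab -e) to run it every 15 minutes:

*/15 * * * * /home/localuser/sync_db.sh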
If your server does not accept remote connections to MySQL, you can create an SSH tunnel and then apply the replication solution proposed by matcheek.
Here is a hint: http://realprogrammers.com/how_to/set_up_an_ssh_tunnel_with_putty.html
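The same tunnel can also be opened from the command line; a sketch with placeholder names, which forwards local port 3307 to MySQL on the remote box so that anything on your machine can reach the remote database via 127.0.0.1:3307:

# open the tunnel and keep it running (plink.exe from the PuTTY suite accepts the same -L option on Windows)
ssh -N -L 3307:127.0.0.1:3306 remoteuser@remote.example.com

# then connect through it from another terminal
mysql -h 127.0.0.1 -P 3307 -u remoteuser -p remotedb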
Based on the responses I've received, I think the answer to my original question is to stop using a cheap shared hosting company (no remote access to server, no cron jobs, etc) and start using a VPS hosting company. That will give me the freedom to remotely connect to my server, etc.
Thanks again to those who replied.
From how you described the problem, replication seems to be the way to go:
http://dev.mysql.com/doc/refman/4.1/en/replication-howto.html
Using a cron job could be another option: it would read a file from your local machine and import the data into the remote box.
I suggest the following:
1. On every local run, write the SQL statements (sans SELECTs) that you run against your copy of the DB into a file as well.
2. On your WAMP server, create a small PHP script that gives back the oldest file from step 1 (with some auth, of course).
3. On your remote server, run a cron job that fetches this file from your local server, runs the SQL against the DB, and then acknowledges it (see the sketch below).
4. On acknowledgement, your WAMP server drops the file and gives back the next one.
While this seems complicated, it allows for a restart after connectivity loss - something that I consider important.
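A rough sketch of the remote side of that loop, with entirely hypothetical URLs, script names, and credentials standing in for whatever the small PHP script on the WAMP box actually exposes:

#!/bin/sh
# apply_next.sh - fetch the oldest pending SQL file, apply it, then acknowledge it
curl -fsS -u apiuser:apipass "http://your-wamp-box.example.com/queue.php?action=next" -o /tmp/next.sql || exit 0
mysql -u remoteuser -p'remotepass' remotedb < /tmp/next.sql
curl -fsS -u apiuser:apipass "http://your-wamp-box.example.com/queue.php?action=ack"

Run from cron every 15 minutes:

*/15 * * * * /home/remoteuser/apply_next.sh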

Access Joomla Database

I'm trying to move a Joomla site from one person's hosting account to another, so I'm really limited in which tools I have.
The problem I'm having is getting access to the MySQL database. I don't have access to SSH, cPanel, phpMyAdmin, or anything else of that nature; just Joomla admin and FTP access.
However, Joomla gives me some database settings.
But when I try to connect to the given hostname, nothing happens.
I'm trying something like the following, and it just hangs:
mysql -u myusername -h thegivendomain.db.3456321.hostedresource.com jos_giventable name
I even tried digging the hostname and using the IP as the hostname.
Any ideas?
It is reasonable to assume that their firewall simply blocks remote access to the MySQL service, so your connection attempts never reach their destination.
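A quick way to confirm that is to check whether the MySQL port is reachable at all from your machine; a sketch using the hostname from your command (netcat, or telnet, is available on most systems):

# if the firewall filters the port, this will time out instead of connecting
nc -vz thegivendomain.db.3456321.hostedresource.com 3306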
Seeing how you are familiar with the shell in general, you could try a PHP shell wrapper, such as one I made for myself. If the hosting company is not too uptight about their security, such a script will be able to run shell commands, e.g. mysqldump -uusername -ppassword db > backup.sql.
Install the free Akeeba Backup component; it will make a complete copy of the website that you can download. Upload that to the new server and install it with their Kickstart script.
Run Kickstart from any browser on the new host and follow the steps.