export mysql database schema and data using phpmyadmin - mysql

I'm trying to export a database with phpMyAdmin, but the result is a file that can't be read. In phpMyAdmin I leave the options at their defaults and only enable compression. After downloading, I'm unable to open the resulting .sql file (after extracting) in Geany; it complains that the file is in an encoding that's not supported. Every time I run the export, the file comes out at a seemingly random size, as if the process just stops somewhere.
It's a small database with Joomla content that needs to be moved to another site. I would like to use mysqldump, but I don't have SSH access. Other than the phpMyAdmin in cPanel I can't think of another way to access that data, but phpMyAdmin doesn't seem to be up to the job.
Is there any setting I should have a look at, or some other way to get that data exported?

Try disabling compression or using another compression method. There have been a few bugs in phpMyAdmin; maybe you are stumbling over one of them. Also try opening the .sql file in an ordinary text editor and checking the content; maybe you can get some more information about the problem you are experiencing.
If you have access to the command line, you can also try the following command: mysqldump -u <username> -p <database_name> > dumpfilename.sql.
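If the dump keeps coming out truncated, a minimal sketch of a more defensive invocation (the flags are standard mysqldump options; the names in angle brackets are placeholders):
mysqldump -u <username> -p --single-transaction --routines <database_name> | gzip > dumpfilename.sql.gz
--single-transaction takes a consistent snapshot of InnoDB tables without locking them, and piping through gzip compresses on the fly so there is less to transfer.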

Related

How to migrate 10GB old table to new table? [duplicate]

1000mb database file is not uploading completely

We have a Magento site.
We have a large database of 1000MB for that site.
When we try to import the database through phpMyAdmin, it doesn't upload the full database. A lot of tables are missing after uploading.
We are getting this error:
Script timeout passed, if you want to finish import, please resubmit
same file and import will resume
We have a GoDaddy cPanel and phpMyAdmin.
Is there any way to upload a large database completely?
I am planning to change the php.ini settings as mentioned in this question:
How to solve time out in phpmyadmin?
But I was not able to find the php.ini file.
Do I need to edit the default Magento php.ini.sample?
Use sqlsplitter; it will create small chunks of your large .sql file. Then upload all these small files.
Try this in your GoDaddy SSH:
Log on to your server.
At the command line, type su -. This gives you root access.
At the command line, type mysql -u root -p
Enter your MySQL root password.
Select your database to insert the data into by running this command: USE YOURDATABASENAME
Then load the source .sql file with: SOURCE D:\profile20\database.sql
Press Enter to insert the data.
This video link may help you.
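Put together, the whole session looks something like this (a sketch; the database name and dump path are placeholders, and on a Linux host the path would look like /home/user/database.sql rather than a D:\ path):
su -
mysql -u root -p
USE yourdatabasename;
SOURCE /home/user/database.sql;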

Splitting a MySQL database into separate tables

Well, I'm not much of a developer or a database expert, but I have a little understanding of these things. I'm trying to dump a database on a VPS using the "mysqldump" command, which works perfectly. But when I tried to restore it locally after downloading the dump, it gave me a timeout error.
Can anyone advise me how to dump a database by splitting it into separate tables? The database I'm referring to is pretty large (6-7 GB). I actually tried searching, and it confuses me; even this link here confuses me as to where to start.
Any help is highly appreciated.
Are you restoring with phpMyAdmin? If you try to upload the import, it is probably too large.
You can set a directory where the backup files are stored; then you can select the file in phpMyAdmin without uploading it.
For the timeout when importing, you can increase the timeout settings, or use something like "BigDump".
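The directory mentioned above is, as far as I know, phpMyAdmin's UploadDir setting in config.inc.php (the path is a placeholder):
$cfg['UploadDir'] = '/path/to/upload_dir';
Files placed in that directory show up in a drop-down on the Import page, so the dump never has to travel through an HTTP upload at all.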
If you're using mysqldump, I'll assume you're familiar with the command line.
On your local machine, use
mysql -u [root] -p [database_name] < [database_dump.sql] -v
enter password: ********
The empty database needs to be created on your local machine first, before you can import the structure and data into it (as simple as running CREATE DATABASE [database_name];).
The -v flag will run it in 'verbose' mode so you can see the queries as they run. Omitting '-v' will stop it from filling your window with queries, but will also give you that 'is it working or not?' nervous feeling after a few minutes.
This will work on Windows as well as on Linux / Mac / anything else.
Replace my [placeholders] with your own values.
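As a compact sketch of the whole round trip (the bracketed names are placeholders, as above):
mysql -u [root] -p -e 'CREATE DATABASE [database_name];'
mysql -u [root] -p [database_name] < [database_dump.sql] -v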
Thank you so much for all your answers! Well, what I was looking for was a dumping method, or a similar script, to dump the database table by table. Finally I tried dumping to an output file with a .txt extension, which succeeded.
Below is the command I used (I know it's a pretty long process, but I finally got all tables dumped):
mysqldump -u users -p database_name table_name > table_name.txt
I used the current directory for the output file, assuming I'm already in the directory where I need the dump. If you need to write the output file to a specific directory, use /path/to/the/dump/table_name.txt instead of just the table name. And make sure you don't enter the password right after -p; I don't know why, but I left it blank and it prompts for the password. Then when I type the password it dumps to the text file.
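Since repeating that command once per table gets tedious, here is a sketch of a shell loop that dumps every table to its own file (assuming a bash shell; the user and database names are placeholders, and mysql will prompt for the password on each command unless you store credentials):
for t in $(mysql -u users -p -N -e 'SHOW TABLES' database_name); do
  mysqldump -u users -p database_name "$t" > "$t.txt"
done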
I hope this helps!
Once again thank you so much for the users who came in the first place to help me. :)

export large database mysql phpmyadmin

I am using phpMyAdmin on my Windows OS. I have a database with one table which has 100M records with a size of 20GB. I want to export this table and get a table.sql file. Whenever I try to do this, the size of the exported file is 0 bytes. When I check the Apache error log, the following shows up:
Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 1066139648 bytes)
Any idea how to solve this problem?!
Thanks :)
I would suggest trying the command line and the mysqldump.exe utility, as suggested here.
If you have shared hosting and you are using cPanel, then it provides you with an option to back up your database in the following section:
Files => Backup => Download a MySQL Database Backup.
If you are on shared hosting or you don't have shell access, then use the MySQLDumper script; copy it to your server and start it in your browser under "yourDomain.com/path_to_mysqldumper/"
MySQLDumper is a PHP and Perl based tool for backing up MySQL databases. You can easily dump your data into a backup file and - if needed - restore it. It is especially suited for shared hosting webspaces, where you don't have shell access.
If you have shell access to your host server (not all shared hosting providers give this access), then you can use SSH as in this tutorial: install and configure PuTTY, then import or export your databases as in this third tutorial.
I tried mysqldump for many hours but it didn't work, until I started a superuser console.
First, start a superuser console
sudo su
Then, try the complete command
/opt/lampp/bin/mysqldump -u root -p [DATABASE NAME] > [PATH_FOR_BACKUPFILE]/[FILE_NAME].sql
In my case, it was something like /opt/lampp/bin/mysqldump -u root -p database > /home/user/backup.sql
MySQLDumper worked like a charm for me on my hosted website. I had to copy one database and "paste" it into a new database. In MySQLDumper it isn't apparent right away how to do this, but the key is to create a new configuration file in MySQLDumper, which will allow you to copy/restore to different databases.
On the home screen in MySQLDumper, click Configuration, then Configuration Files. There is a text box at the top allowing you to create a new configuration file. In there, put in the information for the second database you need (you created a connection to the first database when you installed MySQLDumper). Save it. Then you can click Restore, where you can select the dump of the first database and restore it into the second one.
This was a lifesaver. Thanks!
Increase the post_max_size variable in the php.ini file. Then you will be able to download it.
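For reference, a sketch of the relevant php.ini directives (the values are only illustrative; pick limits comfortably larger than your dump file, and restart the web server afterwards):
post_max_size = 128M
upload_max_filesize = 128M
memory_limit = 256M
max_execution_time = 600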
I had a different issue: when I was downloading from phpMyAdmin, the download would stop in the middle, at around 180MB, with the message "network error".
So I used an SSH connection, which you can find in your cPanel; sometimes they provide a browser-based terminal, and sometimes you have to access it using PuTTY.
In the terminal I went into my public_html folder, where all my files are stored, followed by this command:
mysqldump -u [username] -p [database-you-want-to-dump] > [path-to-place-data-dump.sql]
This did the job in a few minutes and saved an .sql file in my public_html folder. Then I opened the folder in File Manager and downloaded it from there.
You can also use FTP, or you can download it directly by accessing its URL.
Make sure you delete it after your download finishes.

Import a large database - CPanel & MySQL & PHPMyAdmin

I have a database which is over 100 MB in size. It has the .sql.gz extension, which means it is compressed. When I try to import it using phpMyAdmin I get timeout errors. I even tried partial imports ("Allow the interruption of an import in case the script detects it is close to the PHP timeout limit. (This might be a good way to import large files, however it can break transactions.)"), which did not work for me. Note that I'm limited to cPanel & phpMyAdmin to get the job done.
How can I import this database?
You can use BigDump for this
Adjust the database configuration and charset in this file
Remove the old tables on the target database if your dump doesn't contain "DROP TABLE"
Create the working directory (e.g. dump) on your web server
Upload bigdump.php and your dump files (.sql, .gz) via FTP to the working directory
Run the bigdump.php from your browser via URL like
http://www.yourdomain.com/dump/bigdump.php
BigDump can start the next import session automatically if you enable JavaScript
Wait for the script to finish, do not close the browser window
IMPORTANT: Remove bigdump.php and your dump files from the web server
If timeout errors still occur, you may need to adjust the $linespersession setting in this file. Read more
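For orientation, the settings near the top of bigdump.php look roughly like this (a sketch from memory, so treat the variable names as assumptions to verify against your copy; the values are placeholders):
$db_server = 'localhost';
$db_name = 'database_name';
$db_username = 'user';
$db_password = 'password';
$linespersession = 3000;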
A very simple and effective technique to import large MySQL databases follows.
Using the command line console:
You can access the command line using PuTTY on the server and execute the following command:
mysql -u 'database_user' -p'database_password' database_name < sql_file_name_with_full_path
Using a cron job [if the console is not available]:
Add a cron job in cPanel with the following command:
mysql -u 'database_user' -p'database_password' database_name < sql_file_name_with_full_path
Do not forget to delete the cron job after importing the database. Also delete the .sql file from the server.
The commands are the same for both techniques.
Example
mysql -u 'test_user' -p'123456' test_db < /home1/test/public_html/db.sql
or if you do not have a password:
mysql -u 'test_user' test_db < /home1/test/public_html/db.sql
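If you go the cron route, a sketch of the actual crontab entry (the schedule is a placeholder; this one fires once at 02:00 and reuses the example paths above):
0 2 * * * mysql -u 'test_user' -p'123456' test_db < /home1/test/public_html/db.sql
Appending >> /home1/test/import.log 2>&1 makes it easy to check afterwards whether the import ran cleanly; remember to remove the entry once it has.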
In my opinion, and in my experience with the hepsiawebhosting.com server, it's better and faster to set those parameters under the WHM settings called "Tweak Settings". Under Tweak Settings you can modify the maximum upload and maximum post sizes; for example, change upload to 120 MiB and post to 125 MiB.
Also, in the PHP settings under WHM, you can change the default socket and SQL timeouts. You need to increase those settings.
You can also unzip the database and try to upload it as a simple .sql file.
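For the unzip step, assuming a machine with gzip installed (the filename is a placeholder):
gunzip database.sql.gz
This replaces the .gz archive with an uncompressed database.sql, which can then be uploaded or imported directly.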
I hope this will help you.