Expand mysql memory - mysql

I am trying to import a database that is larger than the 128 MB limit into MySQL on a XAMPP machine. How can I expand the MySQL memory limit in XAMPP?

Assuming you're using phpMyAdmin, you can't - that's a hard limit.
To do it via phpMyAdmin, you need to split your import files up and run them separately.
Alternatively, use a different MySQL client, such as the mysql command-line client that ships with XAMPP, which lets you run scripts directly.
The mysql command lets you connect to the database with the client and then run the script from there, like this:
C:\xampp\mysql\bin>mysql.exe -u root
mysql> source C:/mysqlscripts/import.sql;
Note the forward slashes in the file path.
Or, you can run it from the Windows command line with the file as a parameter, as described in the documentation.
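As a rough sketch (your_database is a placeholder, only needed if the script doesn't contain its own USE statement), that parameter-style invocation looks something like this:
C:\xampp\mysql\bin>mysql.exe -u root your_database < C:\mysqlscripts\import.sql
Because the shell opens the file here rather than the mysql client, a normal Windows backslash path works with this form.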

Related

Docker MySQL server failed [duplicate]

I am trying to import time zones according to this document: http://dev.mysql.com/doc/refman/5.7/en/mysql-tzinfo-to-sql.html.
When I try even the first command in the terminal, i.e.
mysql_tzinfo_to_sql tz_dir
it says
There were fatal errors during processing of zoneinfo directory 'tz_dir'
When I run:
mysql_tzinfo_to_sql /usr/share/zoneinfo | mysql -u root mysql
then it returns
ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock'
Operating System: Ubuntu
Server: XAMPP (with MariaDB)
So, I am not able to import the time zone data into the zone tables of the mysql database.
I also had this issue this morning while trying to populate the timezone tables on our production server (using CentOS).
I worked around this problem by exporting the table creation script from my development computer (populating the tables on Windows simply requires downloading the tables and copying them into the proper directory).
I tweaked the creation script a little bit and it is now working for me:
I cannot post it in my answer since the script is too long.
https://drive.google.com/file/d/0B7iwL5F-hwr_YkItRXk2Z1VZdlE/view?usp=sharing
Here's a version with comments (but it doesn't seem to work, so use the version without comments).
https://drive.google.com/file/d/0B7iwL5F-hwr_dWdjTDREcXNHQmM/view?usp=sharing
The script should take no more than a few seconds to run. You'll probably need to use the root user to be able to run it.
You can use this query in order to validate the installation:
SELECT CONVERT_TZ(CURRENT_DATE(),'UTC','America/Montreal');
If it returns NULL instead of a datetime, it means the script failed.
Good luck!
The
There were fatal errors during processing of zoneinfo directory 'tz_dir'
error message means that the directory cannot be read (insufficient access rights, or the directory does not exist at all).
Knowing that the mysql_tzinfo_to_sql program is just a tool that converts a bunch of timezone files into an SQL script that you can use to install the time zones for mysql, your task is the following:
obtain the timezone files from somewhere
execute mysql_tzinfo_to_sql to create an SQL script from those files
execute that SQL script in your mysql database.
These steps can be performed on different computers if you transfer the files between them. For example, I installed the timezones on a machine where the mysql installation was not complete, that is, mysql_tzinfo_to_sql was not available and I wasn't able to install it either.
In such a case you can combine the following steps:
if mysql_tzinfo_to_sql is not available on the computer where your mysql database resides then find a computer where mysql_tzinfo_to_sql is already installed
make the zoneinfo folder available on that computer. It is just a bunch of files in different folders, so you can transport them in a gzip file from one computer to the other. In a normal mysql installation this folder should exist, but maybe your installation is not complete, so just get it from anywhere.
execute the mysql_tzinfo_to_sql command to create an SQL script like this:
mysql_tzinfo_to_sql path-to-your-zoneinfo-folder >install_mysql_zoneinfo.sql
move the created SQL script to the computer where your mysql database resides (see the note below)
execute the script like this:
mysql --user=root --password=abc123 mysql <install_mysql_zoneinfo.sql
Adjust the username and password if needed and your script will be executed. This will fill up the timezone-related tables with the appropriate values and you will be able to use them:
SELECT convert_tz(NOW(),'UTC','Australia/Melbourne');
If you can reach the mysql database from the computer where the SQL script was generated, then it's enough to add the -h <hostname> command-line argument to the script-executing command above, and you will not have to copy the SQL script to the target machine.
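For example (db.example.com is a placeholder hostname; substitute your own), the remote variant of that command would be:
mysql -h db.example.com --user=root --password=abc123 mysql <install_mysql_zoneinfo.sql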

Mysqldump on unencrypted Amazon RDS is creating encrypted .db file

I'm trying to create a dump of my RDS data and use it locally.
I've used the command:
mysqldump -h myhostname.rds.amazonaws.com -u my_username -p my_dbname > ~/Downloads/dump.sql
When I try to view this data in a tool like DB Browser for SQLite, I get a prompt saying it's encrypted, asking for a password.
I thought maybe it needed to be converted into SQLite first, so I did this in RazorSQL, but I still get the same issue. Also, when I try to load the DB into Node.js's sqlite module I get:
not able to query Table in SQLite DB Error: SQLITE_NOTADB: file is encrypted or is not a database
I've checked my RDS settings, and it says:
Encryption details:
Encryption enabled: No
So I have no idea what's going on here. Any tips? Does the file extension (.sql, .db etc) make a difference here?
When I try to view this data in a tool like DB Browser for sqlite
SQLite is an entirely different thing from MySQL. There is very little overlap in the tools that can work with both.
You're using a tool that can't be used for the purpose to which you're applying it, so you're getting a confusing error:
file is encrypted or is not a database
In other words, the tool is unable to make sense of the file, so one of two things has happened: the file is encrypted, or it is not a [sqlite] database [at all].
The problem is the latter.
The file is not encrypted. Even if the RDS instance is encrypted, the generated dump file would still not be encrypted, because encryption in RDS is storage-level encryption of the data, at rest, on the disk volume backing the RDS instance. Encryption in RDS is transparent to the user.
The problem is that what you have here is a dump file -- a series of SQL statements that can be used to reconstruct your database on another MySQL server.
Your file is plain text. You can view it with a text editor. What you can't do is use the file as a database -- that's something SQLite can do, because SQLite stores the database inside a single, transportable file. MySQL has a different architecture.
You'll need to have the same version (e.g. 5.7.x) of MySQL Server installed locally, and then load this file onto it.
shell> mysql [options] < my_dump_file.sql
To reload a dump file written by mysqldump that consists of SQL statements, use it as input to the mysql client.
https://dev.mysql.com/doc/refman/5.7/en/reloading-sql-format-dumps.html
You can also use query browser tools for MySQL like Toad or Workbench, but a local MySQL Server is required.
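A minimal sketch of that local reload, reusing the names from the question (adjust credentials and the dump path as needed; my_dbname should not already exist):
mysql -u root -p -e "CREATE DATABASE my_dbname"
mysql -u root -p my_dbname < ~/Downloads/dump.sql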

export large database mysql phpmyadmin

I am using phpMyAdmin on my Windows OS. I have a database with one table which has 100M records and a size of 20 GB. I want to export this table and have the table.sql file. Whenever I try to do this, the size of the exported file is 0 bytes. When I check the Apache error log, the following shows up:
Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 1066139648 bytes)
Any idea how to solve this problem?!
Thanks :)
I would suggest trying the command line and the mysqldump.exe utility, as suggested here.
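As a rough sketch (database, table, and output path are placeholders; the bin directory depends on your installation, e.g. XAMPP or WAMP):
C:\xampp\mysql\bin>mysqldump.exe -u root -p your_database your_table > C:\dumps\table.sql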
If you have shared hosting and you are using cPanel, then it provides an option to back up your database in the following section:
Files => Backup => Download a MySQL Database Backup.
If you are on shared hosting or you don't have shell access, then use the MySQLDumper script; copy it to your server and start it in your browser under "yourDomain.com/path_to_mysqldumper/"
MySQLDumper is a PHP and Perl based tool for backing up MySQL
databases. You can easily dump your data into a backup file and - if
needed - restore it. It is especially suited for shared hosting
webspaces, where you don't have shell access.
If you have shell access to your host server (not all shared hosting providers give this access), then you may use SSH as in this tutorial: install and configure PuTTY, then import or export your databases as in this third tutorial.
I tried mysqldump for many hours but it didn't work, until I started a superuser console.
First, start a superuser console
sudo su
Then, try the complete command
/opt/lampp/bin/mysqldump -u root -p [DATABASE NAME] > [PATH_FOR_BACKUPFILE]/[FILE_NAME].sql
In my case, it was something like /opt/lampp/bin/mysqldump -u root -p database > /home/user/backup.sql
MySQLDumper worked like a charm for me on my hosted website. I had to copy one database and "paste" it into a new database. In MySQLDumper, it isn't immediately apparent how to do this, but the key is to create a new configuration file in MySQLDumper, and that will allow you to copy/restore to different databases.
On the home screen in MySQLDumper, click Configuration, then Configuration Files. There is a text box at the top allowing you to create a new configuration file. In there, put in the information for the second database you need (you created a connection to the first database when you installed MySQLDumper). Save it. Then you can click Restore, where you can select the dump of the first database and restore it in the second one.
This was a lifesaver. Thanks!
Increase the
post_max_size
variable in the php.ini file. Then you will be able to download it.
I had a different issue: when downloading from phpMyAdmin, the download would stop in the middle, at around 180 MB, with a "network error" message.
So I used an SSH connection, which you can find in your cPanel; sometimes they provide a browser-based terminal, and sometimes you have to access it using PuTTY.
In the terminal I went into my public_html folder, where all my files are stored, and ran the following command:
mysqldump -u [username] -p [database-you-want-to-dump] > [path-to-place-data-dump.sql]
This did the job in a few minutes and saved an .sql file in my public_html folder. Then I opened the folder in File Manager and downloaded it from there.
You can also use FTP, or you can download it directly by accessing its URL.
Make sure you delete it after your download finishes.

Importing larger SQL files into MySQL

I have a 400 MB SQL backup file. I'm trying to import that file into a MySQL database using WAMP → import, but the import was unsuccessful for several reasons, such as the upload file size being too large. So, I've gone ahead and made a few changes to the PHP and MySQL configuration under WAMP. The changes made are:
Under WAMP->MySQL->my.ini
max_allowed_packet = 800M
read_buffer_size = 2014K
Under WAMP->PHP->PHP.ini
max_input_time = 20000
memory_limit = 128M
Is this the right way to import such a large SQL file using WAMP → import?
If so, did I make the right changes to the configuration files? What are the maximum values allowed for the above variables?
You can import large files via the command line, like this:
mysql -h yourhostname -u username -p databasename < yoursqlfile.sql
Since you state (in a clarification comment to another person's answer) that you are using MySQL Workbench, you could try using the "sql script" option there. This will avoid the need to use the command line (although I agree with the other answer that it's good to climb up on that horse and learn to ride).
In MySQL Workbench, go to File menu, then select "open script". This is probably going to take a long time and might even crash the app since it's not made to open things that are as big as what you describe.
The better way is to use the command line. Make sure you have the MySQL client installed on your machine. This means the actual MySQL client (not the Workbench GUI or phpMyAdmin or anything like that). Here is a link describing the command-line tool. Once you have it downloaded and installed, open a terminal window on your machine, and you have two choices for slurping data from your file system (such as a backup file) up into your target database: one is to use the 'source' command and the other is to use the < redirection operator.
Option 1:
from the directory where your backup file is:
$mysql -u username -p -h hostname databasename < backupfile.sql
Option 2:
from the directory where your backup file is:
$mysql -u username -p -h hostname
[enter your password]
> use databasename;
> source backupfile.sql
Obviously, both of these options require that you have a backup file that is SQL.
Use this from mysql command window:
mysql> use db_name;
mysql> source backup-file.sql;
You can run the import from any SQL query browser using the following command:
source C:\path\to\file\dump.sql
Run the above code in the query window ... and wait a while :)
This is more or less equivalent to the command-line version stated previously. The main difference is that it is executed from within MySQL. For those using non-standard encodings, this is also safer than the command-line version because there is less risk of encoding failures.
We have experienced the same issue when moving the sql server in-house.
A good solution that we ended up using is splitting the SQL file into chunks (see the sketch below). There are several ways to do that:
http://www.ozerov.de/bigdump/ seems good (but I have never used it)
http://www.rusiczki.net/2007/01/24/sql-dump-file-splitter/ I used it, and it was very useful to get structure out of the mess; you can take it from there.
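If you are on Linux/macOS and every statement in the dump sits on its own line, a quick-and-dirty alternative is the standard split utility (a rough sketch only; statements or transactions spanning a chunk boundary will break, which is exactly what the tools above handle for you):
split -l 100000 backupfile.sql dump_chunk_
Each resulting dump_chunk_* file can then be imported separately.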
Hope this helps :)
If you are using the source command on Windows, remember to use f:/myfolder/mysubfolder/file.sql and not f:\myfolder\mysubfolder\file.sql.
I believe the easiest way is to upload the file using the MySQL command line.
Use the following command from the terminal to access the MySQL command line and run source:
mysql --host=hostname -uuser -ppassword
source filename.sql
or directly from the terminal prompt:
mysql --host=hostname -uuser -ppassword < filename.sql
Three things you have to do, if you are doing it locally:
In php.ini or php.cfg of your PHP installation
post_max_size=500M
upload_max_filesize=500M
memory_limit=900M
Or set other values. Restart Apache.
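To confirm the new limits actually took effect after restarting Apache, one quick check (a sketch, assuming php.exe is on your PATH; note the CLI can load a different php.ini than Apache, so phpinfo() in the browser is the more reliable confirmation):
php -i | findstr "post_max_size upload_max_filesize memory_limit"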
Or
Use the PHP BigDump tool. It's the best one I have seen. It's free and open source.
Had a similar problem, but in Windows. I was trying to figure out how to open a large MySQL SQL file in Windows, and these are the steps I had to take:
Go to the download website (http://dev.mysql.com/downloads/).
Download the MySQL Community Server and install it (select the developer or full install, so it will install client and server tools).
Open the MySQL command-line client from the Start menu.
Enter your password used in install.
In the prompt, mysql>, enter:
CREATE DATABASE database_name;
USE database_name;
SOURCE myfile.sql
That should import your large file.
The question is a few months old but for other people looking --
A simpler way to import a large file is to make a subdirectory 'upload' in your folder c:/wamp/apps/phpmyadmin3.5.2
and edit this line in the config.inc.php file in the same directory to include the folder name
$cfg['UploadDir'] = 'upload';
Then place the incoming .sql file in the folder /upload.
Working from inside the phpMyAdmin console, go to the new database and import. You will now see an additional option to upload files from that folder. Choose the correct file and be a little patient. It works.
If you still get a timeout error, try adding $cfg['ExecTimeLimit'] = 0; to the same config.inc.php file.
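Putting the two settings together, the relevant lines in config.inc.php look like this ($cfg['ExecTimeLimit'] = 0 disables phpMyAdmin's script execution time limit):
$cfg['UploadDir'] = 'upload';
$cfg['ExecTimeLimit'] = 0;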
I had difficulty importing an .sql file where the user name was root and the password differed from the root password on my new server. I simply took off the password before I exported the .sql file and the import worked smoothly.
In item 1.16 of the phpMyAdmin FAQ they say:
1.16 I cannot upload big dump files (memory, HTTP or timeout problems).
Starting with version 2.7.0, the import engine has been rewritten and these problems should not occur. If possible, upgrade your phpMyAdmin to the latest version to take advantage of the new import features.
The first things to check (or ask your host provider to check) are the values of upload_max_filesize, memory_limit and post_max_size in the php.ini configuration file. All of these three settings limit the maximum size of data that can be submitted and handled by PHP. One user also said that post_max_size and memory_limit need to be larger than upload_max_filesize. There exist several workarounds if your upload is too big or your hosting provider is unwilling to change the settings:
Look at the $cfg['UploadDir'] feature. This allows one to upload a file to the server via scp, ftp, or your favorite file transfer method. phpMyAdmin is then able to import the files from the temporary directory. More information is available in the Configuration section of this document.
Using a utility (such as BigDump) to split the files before uploading. We cannot support this or any third party applications, but are aware of users having success with it.
If you have shell (command line) access, use MySQL to import the files directly. You can do this by issuing the “source” command from within MySQL:
source filename.sql;
The simplest solution is MySQL Workbench. Just copy the .sql file text into the query window of MySQL Workbench and execute it. All the remaining work will be done by it.
If you want to see the import progress, install pv with Homebrew and run this:
brew install pv
pv -p dump.sql | mysql -h localhost -u root -p db_name
my.ini
max_allowed_packet = 800M
read_buffer_size = 2014K
PHP.ini
max_input_time = 20000
memory_limit = 128M
post_max_size=128M

Import a large database - CPanel & MySQL & PHPMyAdmin

I have a database which is over 100 MB in size. It has the .sql.gz extension, which means it is compressed. When I try to import it using phpMyAdmin I get timeout errors. I even tried partial imports ("Allow the interruption of an import in case the script detects it is close to the PHP timeout limit. This might be a good way to import large files, however it can break transactions."), which did not work for me. Note that I'm using cPanel & phpMyAdmin to get the job done.
How can I import this database?
You can use BigDump for this
Adjust the database configuration and charset in this file
Remove the old tables on the target database if your dump doesn't contain "DROP TABLE"
Create the working directory (e.g. dump) on your web server
Upload bigdump.php and your dump files (.sql, .gz) via FTP to the working directory
Run the bigdump.php from your browser via URL like
http://www.yourdomain.com/dump/bigdump.php
BigDump can start the next import session automatically if you enable JavaScript
Wait for the script to finish, do not close the browser window
IMPORTANT: Remove bigdump.php and your dump files from the web server
If timeout errors still occur, you may need to adjust the $linepersession setting in this file. Read more
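For reference, the database configuration block at the top of bigdump.php that the first step refers to looks roughly like this (quoted from memory as an assumption; variable names may differ between BigDump versions, so check the comments in the file itself):
$db_server   = 'localhost';
$db_name     = 'your_database';
$db_username = 'your_user';
$db_password = 'your_password';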
A very simple and effective technique to import large MySQL databases.
Using the command-line console
You can access the command line using PuTTY on the server and execute the following command:
mysql -u 'database_user' -p'database_password' database_name < sql_file_name_with_full_path
Using a cron job [if the console is not available]
Add a cron job in cPanel with the following command:
mysql -u 'database_user' -p'database_password' database_name < sql_file_name_with_full_path
Do not forget to delete the cron job after importing the database. Also delete the sql file from the server.
The commands are the same for both techniques.
Example
mysql -u 'test_user' -p'123456' test_db < /home1/test/public_html/db.sql
or, if you do not have a password:
mysql -u 'test_user' test_db < /home1/test/public_html/db.sql
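If you go the cron route, the cPanel entry might look something like this (the time fields are placeholders for a one-off run; remove the job as soon as the import finishes):
30 2 * * * mysql -u 'test_user' -p'123456' test_db < /home1/test/public_html/db.sql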
In my opinion and experience with the hepsiawebhosting.com server, it's better and faster to set those parameters under the WHM settings called "Tweak Settings". Under Tweak Settings you can modify the maximum upload and maximum post sizes; you could, for example, change upload to 120 MiB and post to 125 MiB.
Also, in the PHP settings under WHM, you can change the default socket and SQL timeout. You need to increase those settings.
You can also unzip the database and try to upload it as a simple .sql file.
I hope this will help you.