I've acquired a backup file of my company database in the form of a self-contained .sql file. I've set up MySQL on my Win7 box and am trying to import the data into a schema with the same name, as guided by the MySQL help files. It does load some data, but not all of it is imported, and I get the following in the Import Progress window of MySQL Workbench:
10:11:15 Restoring C:\Backup\DB798-2016-Feb-07.sql
Running: mysql.exe --defaults-file="c:\temp\tmp4g_coc.cnf" --protocol=tcp --host=localhost --user=root --port=3306 --default-character-set=utf8 --comments --database=DB798 < "C:\\Backup\\DB798-2016-Feb-07.sql"
ERROR 1265 (01000) at line 11251: Data truncated for column 'social_network' at row 1
Operation failed with exitcode 1
10:12:25 Import of C:\Backup\DB798-2016-Feb-07.sql has finished with 1 errors
I don't care about the data in any table labeled 'social_network'; the data I really need comes after that point in the import. Is there any way I can skip it?
It is being truncated because the column in the table is defined as VARCHAR(n).
If n is shorter than the text you are importing, the text gets truncated.
You either need to increase the length of the VARCHAR, or shorten your text before the import.
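If widening the column is an option, a minimal sketch of that fix; the table name members is only a placeholder for whichever table holds the column, and 255 is an arbitrary larger length:
ALTER TABLE members MODIFY COLUMN social_network VARCHAR(255);  -- "members" is a placeholder table name
Run this against the target schema before re-running the import, or make the equivalent change to the CREATE TABLE statement in the dump.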
I'm trying to import an existing database file into an empty MySQL database with the following command:
mysql -u username -p'password' db_name < dbfile.sql
but I get following Error:
ERROR 1146 (42S02) at line 1: Table 'db_name.oc_address' doesn't exist
I know that oc_address is a table name inside the SQL file, but I don't know what to do to import it correctly. I searched the web and Stack Overflow and found nothing on this error.
Download the actual OpenCart zip file:
https://www.opencart.com/index.php?route=cms/download/download&download_id=62
Unzip it and open the folder
\upload\install
then run opencart.sql.
If you have installed extensions that need their own SQL, you have to run their SQL as well.
After that, run your backup file. A sketch of the commands is below.
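A minimal sketch of that order of operations from the command line; db_name is assumed to be the empty target database from the question, and opencart.sql is assumed to have been copied out of \upload\install into the current directory:
mysql -u username -p db_name < opencart.sql
mysql -u username -p db_name < dbfile.sql
The first command should create the OpenCart tables (including oc_address); the second then loads your data into them.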
To export an entire database and then load it into another server, your best bet is to use the mysqldump command line utility. Its export files contain the data definition language (tables, views, all that) for the database as well as the data.
You can also get it to export just the definitions.
mysqldump --no-data -u username -p'password' db_name > opencartddl.sql
Then you can import that file first, then your data file.
Or, you may be able to stand up a new, empty OpenCart instance and use its UI to import your data.
It's probably wise to avoid trying to write replacement DDL yourself if you can get a tool like mysqldump to do it.
When I try to import my exported dump file (specifically my database/schema) on another computer, I get this error:
D:\CAPSTONE SYSTEM\MyDataBaseCAPSTONE\Dump20150922\schm_capstonesystem_routines.sql does not contain schema/table information
16:58:55 Restoring schm_capstonesystem (employee_entry)
Running: mysql.exe --defaults-file="c:\users\kc\appdata\local\temp\tmpnowdu_.cnf" --protocol=tcp --host=localhost --user=root --port=3306 --default-character-set=utf8 --comments --database=schm_capstonesystem < "D:\CAPSTONE SYSTEM\MyDataBaseCAPSTONE\Dump20150922\schm_capstonesystem_employee_entry.sql"
ERROR 1049 (42000): Unknown database 'schm_capstonesystem'
Operation failed with exitcode 1
Please help~! :-(
The error is Unknown database. You have to create the database 'schm_capstonesystem' first.
And then run the import of your dump.
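If you prefer the command line, a minimal sketch of the same fix; the database name comes from the error message, and the dump file name is taken from the question (adjust the path as needed):
mysql -u root -p -e "CREATE DATABASE schm_capstonesystem;"
mysql -u root -p schm_capstonesystem < schm_capstonesystem_employee_entry.sql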
In case you don't know how to create the schema mentioned in the answer above:
Right-click in the left pane under Schemas in MySQL Workbench, click Create Schema, enter a name for the schema (preferably the same as the database you're importing), and click Apply. Then do the import.
I recently backed up a MySQL database in anticipation of a hardware swap-out. After installing MySQL on the new hardware I attempted to import the database via MySQL Workbench. All the tables imported correctly from the complete dump until it hit the Results file, which contains a number of large BLOBs. The preceding Random file, which contains smaller BLOBs, imported correctly. After a number of failures I went to an older dump of individual files, with the following result:
17:35:20 Restoring /media/Week 1/MySQL/Dump20141112/Physio_Results.sql
Running: mysql --defaults-extra-file="/tmp/tmpydQEsK/extraparams.cnf" --host=localhost --user=root --port=3306 --default-character-set=utf8 --comments < "/media/Week 1/MySQL/Dump20141112/Physio_Results.sql"
ERROR 2006 (HY000) at line 51: MySQL server has gone away
Operation failed with exitcode 1
17:35:21 Import of /media/Week 1/MySQL/Dump20141112/Physio_Results.sql has finished with 1 errors
17:36:11 Restoring /media/Week 1/MySQL/Dump20141112/Physio_Session.sql
Running: mysql --defaults-extra-file="/tmp/tmpnAksEb/extraparams.cnf" --host=localhost --user=root --port=3306 --default-character-set=utf8 --comments < "/media/Week 1/MySQL/Dump20141112/Physio_Session.sql"
17:36:12 Import of /media/Week 1/MySQL/Dump20141112/Physio_Session.sql has finished
So obviously the larger BLOBs are causing an issue, but how do I overcome this? The documentation I looked at only says:
Error Code: 2006 MySQL server has gone away
which adds nothing!
After increasing max_allowed_packet and net_buffer_length I got slightly further; see below. I am unsure where to go next, as I seem unable to restore data that I was allowed to export!
16:19:45 Restoring Physio (Results)
Running: mysql --defaults-extra-file="/tmp/tmpyme2U4/extraparams.cnf" --host=localhost --user=root --port=3306 --default-character-set=utf8 --comments < "/home/mjh/Desktop/Dump20141112/Physio_Results.sql"
ERROR 1118 (42000) at line 53: Row size too large (> 8126). Changing some columns to TEXT or BLOB or using ROW_FORMAT=DYNAMIC or ROW_FORMAT=COMPRESSED may help. In current row format, BLOB prefix of 768 bytes is stored inline.
Operation failed with exitcode 1
16:19:54 Import of /home/mjh/Desktop/Dump20141112 has finished with 1 errors
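For reference, the settings change described above and the row-format change that the second error message suggests would look something like this; the values are placeholders, Results is only assumed to be the failing table's name, and older MySQL versions may also need the Barracuda file format for DYNAMIC rows:
SET GLOBAL max_allowed_packet = 1073741824;  -- 1 GB; placeholder value
SET GLOBAL net_buffer_length = 1048576;      -- 1 MB; placeholder value
ALTER TABLE Results ROW_FORMAT=DYNAMIC;      -- table name assumed from the dump file name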
I have a text file with a single word per line. Example:
By
the
shore
of
the
ocean
I want to import them into a SQL database called words.sql.
When I try to do it, I get this error:
mysql> use words;
Database changed
mysql> LOAD DATA LOCAL INFILE '/home/admin/words/words.txt' INTO TABLE words;
ERROR 1148 (42000): The used command is not allowed with this MySQL version
I launch mysql with this command:
mysql -u root -p -D words --local-infile=0
What can I do to import this text file into my database? What am I doing wrong?
I have another text file of famous quotes that I want to put into a database called quotes, but I need to figure out how to tackle this before I go any farther.
Thank you in advance.
(I got these commands from various other websites, in case you were wondering how I came to use them.)
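For what it's worth, the --local-infile flag in the launch command above controls exactly the feature LOAD DATA LOCAL INFILE depends on: 0 disables it and 1 enables it (the server-side local_infile setting must permit it as well). A minimal sketch of the enabled variant, reusing the same paths as above:
mysql -u root -p -D words --local-infile=1
mysql> LOAD DATA LOCAL INFILE '/home/admin/words/words.txt' INTO TABLE words;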
I have an SQL file that is 2.36 GB in size. After importing about 500 MB of data, my system shut down because the power supply died. How can I continue after that first 500 MB of data? Is there any solution?
I used this command:
mysql -uroot -p client_data < C:/Users/uradeshk/Desktop/client_data.sql
As far as I know, you have to restart. Since it's an import of an SQL file, you have to take the entire file.
Clear the table you inserted the 500 MB into, unless the file contains DROP and CREATE statements, in which case it will drop the table before creating a new one.
TRUNCATE client_data;
mysql -uroot -p client_data < C:/Users/uradeshk/Desktop/client_data.sql
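One quick way to check whether the dump already drops and recreates the table (making the TRUNCATE above unnecessary) is to search the file for DROP TABLE statements, e.g. on Windows:
findstr /C:"DROP TABLE" C:\Users\uradeshk\Desktop\client_data.sql
If that prints a DROP TABLE IF EXISTS line for the table in question, you can simply re-run the import as-is.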