How to resolve MySQL error 126? - mysql

I'm new to MySQL. I'm trying to create maintable by combining two existing tables. I used the following command, but it throws the error below.
create table maintable as select * from table1 union select * from table2;
ERROR 126 (HY000): Incorrect key file for table 'c:\temp'; try to repair it
I googled around and increased tmp_table_size to 2G.
My configuration file looks like this:
[client]
port=3306
[mysql]
default-character-set=UTF8
[mysqld]
port=3306
max_allowed_packet=128M
basedir="C:/Program Files/MySQL/MySQL Server 5.5/"
datadir="C:/ProgramData/MySQL/MySQL Server 5.5/Data/"
character-set-server=UTF8
default-storage-engine=INNODB
sql-mode="STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION"
max_connections=100
query_cache_size=0
table_cache=2G
tmp_table_size=2G
max_heap_table_size=2G
thread_cache_size=32
myisam_max_sort_file_size=100G
myisam_sort_buffer_size=126M
read_buffer_size=128K
read_rnd_buffer_size=612K
sort_buffer_size=566K
innodb_additional_mem_pool_size=512M
innodb_flush_log_at_trx_commit=1
innodb_log_buffer_size=50M
innodb_buffer_pool_size=127M
innodb_log_file_size=24M
But nothing seems to resolve the error. Your help is really appreciated. Thank you.

Since you are using UNION, this can only work if both tables have the same number of columns in the same order. Assuming they do, you can use either of the approaches below, both of which will be faster than a plain INSERT:
Through dump (see the sketch after these steps):
Step 1: Take a dump of table1 with structure and data.
Step 2: Take a dump of table2 with data only.
Step 3: Restore table1.
Step 4: Restore the table2 data into table1.
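A minimal command-line sketch of those four steps, assuming a database named mydb and a MySQL user root (both placeholders; adjust to your setup):
# Step 1: dump table1 with structure and data
mysqldump -uroot -p mydb table1 > table1.sql
# Step 2: dump table2 with data only
mysqldump -uroot -p --no-create-info mydb table2 > table2.sql
# Steps 3-4: point table2's INSERTs at table1, then restore both dumps
sed 's/`table2`/`table1`/g' table2.sql > table2_data.sql
mysql -uroot -p mydb < table1.sql
mysql -uroot -p mydb < table2_data.sql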
Through the export/import method:
Step 1: Back up both tables to CSV.
select * INTO OUTFILE 'd:\\backup\\table1.csv' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r\n' FROM table1;
select * INTO OUTFILE 'd:\\backup\\table2.csv' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r\n' FROM table2;
Step 2: Create the target table, then import both CSV files into it:
LOAD DATA LOCAL INFILE 'd:\\backup\\table1.csv' INTO TABLE mytable FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n';
LOAD DATA LOCAL INFILE 'd:\\backup\\table2.csv' INTO TABLE mytable FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n';
(SELECT ... INTO OUTFILE does not write a header row, so there is nothing to skip with IGNORE 1 LINES on import.)
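For the "create the target table" step above, one minimal option (assuming mytable should have the same structure as table1) is:
CREATE TABLE mytable LIKE table1;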

Related

MySQL is not updating when running this query

load data infile "C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/newdata.csv"
into table prodlookupfile
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;
This executes, but the new data is NOT uploaded into the table in MySQL. What am I missing?

MySQL OUTFILE is not working in Rails 3

In one of my Rails actions I want to create a CSV file from a table using MySQL's OUTFILE.
path = "#{Rails.root}/public/outfile.csv"
query_string = "SELECT * INTO OUTFILE '#{path}' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' FROM temp_csv_186;"
ActiveRecord::Base.connection.execute(query_string)
But every time it shows the following error:
Mysql2::Error: Can't create/write to file '/home/user/Projects/Application/public/outfile.csv' (Errcode: 13): SELECT * INTO OUTFILE '/home/user/Projects/Application/public/outfile.csv' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' FROM temp_csv_186;
Does whichever user MySQL runs under have write permission for /home/user/Projects/Application/public/? Errcode 13 is a filesystem "permission denied" error.
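A quick way to check on a typical Linux setup (the exact account name is an assumption; the first command tells you the real one):
ps -o user= -C mysqld                                    # which OS user the MySQL server runs as
ls -ld /home/user/Projects/Application/public/           # current permissions on the target directory
sudo chmod o+w /home/user/Projects/Application/public/   # one blunt way to grant write access
Also remember that INTO OUTFILE writes the file on the database server's filesystem, so the directory must be writable on that machine.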

Efficiently loading a csv file into a MySQL table

The current way I am loading the file is:
load data local infile 'file_name' into table tableA
fields terminated by ',' enclosed by '"' lines terminated by '\n';
Is this the optimal way to load a table on a Unix machine? Does it produce the optimal table size? I want a table that takes up the smallest space.
MyISAM
If the table is MyISAM, you should do the following:
set bulk_insert_buffer_size = 1024 * 1024 * 256;
alter table tableA disable keys;
load data local infile 'file_name' into table tableA
fields terminated by ',' enclosed by '"' lines terminated by '\n';
alter table tableA enable keys;
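The reason for the disable/enable pair: rebuilding the non-unique indexes in one pass after the load is generally much faster than updating them row by row while the data streams in.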
InnoDB
If the table is InnoDB, you should do the following:
set bulk_insert_buffer_size = 1024 * 1024 * 256;
load data local infile 'file_name' into table tableA
fields terminated by ',' enclosed by '"' lines terminated by '\n';
Not only will this take up the least space (loading into an empty table), but the rows will be buffered in a tree-like structure in memory, sized by bulk_insert_buffer_size, to cache the data more quickly during the load.
If you are worried about ibdata1 exploding, you need to convert all InnoDB tables to use innodb_file_per_table. Please see my InnoDB cleanup steps: Howto: Clean a mysql InnoDB storage engine?
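For reference, innodb_file_per_table is a server option set in my.cnf / my.ini:
[mysqld]
innodb_file_per_table=1
With it enabled, any table that is created or rebuilt (for example, ALTER TABLE tableA ENGINE=InnoDB;) gets its own .ibd file instead of growing ibdata1.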

hibernate + mysql + load data in file

Hi, I am trying to load data from a file into a MySQL DB using Hibernate.
Here is the query:
session.createSQLQuery("LOAD DATA INFILE E:/uploaded/NumSerie/NS/NumSerie.txt INTO TABLE prod CHARACTER SET latin1 FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n' IGNORE 1 LINES;").executeUpdate();
But I get the following error:
org.hibernate.QueryException: Space is not allowed after parameter prefix ':' [LOAD DATA INFILE E:/uploaded/NumSerie/NS/NumSerie.txt INTO TABLE prod CHARACTER SET latin1 FIELDS TERMINATED BY ';' LINES TERMINATED BY '
' IGNORE 1 LINES;]
at org.hibernate.engine.query.ParameterParser.parse(ParameterParser.java:92)
at org.hibernate.engine.query.ParamLocationRecognizer.parseLocations(ParamLocationRecognizer.java:75)
How can I rewrite this query so this is executed properly?
Thanks in advance!
Try creating a parameterised query.
I'm no Hibernate guru but this could work:
session.createSQLQuery("LOAD DATA INFILE :file INTO TABLE prod CHARACTER SET latin1 FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n' IGNORE 1 LINES;")
.setString("file", "E:/uploaded/NumSerie/NS/NumSerie.txt")
.executeUpdate();
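Binding the path as a parameter also sidesteps the root cause: Hibernate sees the ':' in the Windows path E:/uploaded/... and tries to parse it as a named-parameter prefix, which is exactly what the exception complains about.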

Dump MySQL view as a table with data

Say I have a view in my database, and I want to send a file to someone to create that view's output as a table in their database.
mysqldump of course only exports the 'create view...' statement (well, okay, it includes the create table, but no data).
What I have done is simply duplicate the view as a real table and dump that. But for a big table it's slow and wasteful:
create table tmptable select * from myview
Short of creating a script that mimics the behaviour of mysqldump and does this, is there a better way?
One option would be to do a query into a CSV file and import that. To select into a CSV file:
From http://www.tech-recipes.com/rx/1475/save-mysql-query-results-into-a-text-or-csv-file/
SELECT order_id,product_name,qty
FROM orders
INTO OUTFILE '/tmp/orders.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
OK, so based on your CSV failure comment, start with Paul's answer. Make the following change to it:
- FIELDS TERMINATED BY ','
+ FIELDS TERMINATED BY ',' ESCAPED BY '\\'
When you're done with that, on the import side you'll do a LOAD DATA INFILE and use the same TERMINATED / ENCLOSED / ESCAPED clauses, as shown below.
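A minimal sketch of that import side, assuming a target table orders_copy with the same columns as the exported query:
LOAD DATA INFILE '/tmp/orders.csv' INTO TABLE orders_copy
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\n';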
I had the same problem: I wanted to export a view definition (84 fields and millions of records) as a "create table" statement, because the view can change over time and I wanted an automatic process. So this is what I did:
Create the table from the view, but with no records:
mysql -uxxxx -pxxxxxx my_db -e "create table if not exists my_view_def as select * from my_view limit 0;"
Export the new table definition. I add a sed command to change the table name my_view_def to match the original view name ("my_view"):
mysqldump -uxxxx -pxxxxxx my_db my_view_def | sed 's/my_view_def/my_view/g' > /tmp/my_view.sql
Drop the temporary table:
mysql -uxxxx -pxxxxxx my_db -e "drop table my_view_def;"
Export the data as a CSV file:
SELECT * from my_view into outfile "/tmp/my_view.csv" fields terminated BY ";" ENCLOSED BY '"' LINES TERMINATED BY '\n';
Then you'll have two files: one with the definition and another with the data in CSV format.
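On the receiving side, the import is the same two files in reverse (target_db is a placeholder for the recipient's database; the table is called my_view thanks to the sed rename above):
mysql -uxxxx -pxxxxxx target_db < /tmp/my_view.sql
mysql -uxxxx -pxxxxxx target_db -e "LOAD DATA INFILE '/tmp/my_view.csv' INTO TABLE my_view FIELDS TERMINATED BY ';' ENCLOSED BY '\"' LINES TERMINATED BY '\n';"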