MySQL: export table to text file with field names

Let's say I have the following table in MySQL:
create table test_tbl
(
col1 varchar(100),
col2 varchar(100),
amount int,
created datetime
)
Data is inserted like this:
Insert into test_tbl values('unu', 'doi', 10, str_to_date('05/01/2015', '%d/%m/%Y'));
Insert into test_tbl values('patru', 'trei', 400, str_to_date('04/01/2015', '%d/%m/%Y'));
I need to export all the data from that table in the following format; the output should be a .txt file.
"col1"="unu","col2"="doi","amount"="10","created"="05/01/2015"
"col1"="patru","col2"="trei","amount"="400","created"="04/01/2015"
So the logic is:
Each column name is paired with its value, and the pairs are separated by commas.
Is it possible to get such a result in MySQL?

To export the table's data:
SELECT CONCAT('"col1"="',col1,'","col2"="',col2,'","amount"="',amount,'","created"="',DATE_FORMAT(created,'%d/%m/%Y'),'"') t
FROM test_tbl
INTO OUTFILE '/tmp/test.txt'
CHARACTER SET latin1
FIELDS ENCLOSED BY ''
LINES TERMINATED BY '\r\n';
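If SELECT ... INTO OUTFILE fails with an error mentioning --secure-file-priv, the server only allows writing output files into one specific directory (or nowhere at all). This is a side note, not part of the original answer; you can check the setting with:
SHOW VARIABLES LIKE 'secure_file_priv';
-- empty value : OUTFILE may write anywhere the mysqld user can write
-- a directory : OUTFILE may only write inside that directory
-- NULL        : INTO OUTFILE is disabled entirely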
To import the data back from the text file:
mysql> CREATE TABLE `test_tbl` (
-> `col1` varchar(100) DEFAULT NULL,
-> `col2` varchar(100) DEFAULT NULL,
-> `amount` int DEFAULT NULL,
-> `created` datetime DEFAULT NULL
-> ) ENGINE=InnoDB DEFAULT CHARSET=latin1
-> ;
Query OK, 0 rows affected (0.44 sec)
mysql> load data local infile 'test.txt' into table test_tbl fields terminated by ',' ENCLOSED BY '"' lines terminated by '\r\n' (@col1, @col2, @col3, @col4)
-> set col1 = substr(@col1,8), col2 = substr(@col2,8), amount = substr(@col3,10), created = str_to_date(substr(@col4,11), '%d/%m/%Y');
Query OK, 2 rows affected (0.09 sec)
Records: 2 Deleted: 0 Skipped: 0 Warnings: 0
mysql> select * from test_tbl;
+-------+------+--------+---------------------+
| col1  | col2 | amount | created             |
+-------+------+--------+---------------------+
| unu   | doi  |     10 | 2015-01-05 00:00:00 |
| patru | trei |    400 | 2015-01-04 00:00:00 |
+-------+------+--------+---------------------+
2 rows in set (0.00 sec)

Maybe this could work.
Use CONCAT to build a string like this
SELECT
CONCAT('"col1"="',col1,'","col2"="',col2,'","amount"="',amount,'","created"="',created,'"') t
FROM test_tbl;
Then you can also dump it to a text file using INTO OUTFILE.
SELECT
CONCAT('"col1"="',col1,'","col2"="',col2,'","amount"="',amount,'","created"="',created,'"') t
FROM test_tbl
INTO OUTFILE 'C:/yourtextfile.txt'
CHARACTER SET latin1
FIELDS ENCLOSED BY ''
LINES TERMINATED BY '\r\n';
Since the SELECT returns each row as a single concatenated string, you don't need to enclose any fields; the quoting is already built into the string, and only a line break is needed to terminate each row. Note that created is a DATETIME, so this version exports it as 2015-01-05 00:00:00; wrap it in DATE_FORMAT(created, '%d/%m/%Y') if you want the dd/mm/yyyy form from the question.
Hope it works!

Related

Timestamp field only on insert in MariaDB, combined with 'LOAD DATA LOCAL INFILE' data load

I want a timestamp field in a MySQL table to be set only on inserts, not on updates. The table is created like this:
CREATE TABLE `test_insert_timestamp` (
`key` integer NOT NULL,
`value` integer NOT NULL,
`insert_timestamp` timestamp DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (`key`)
);
The data is loaded with this statement (it has to be LOAD DATA LOCAL INFILE):
LOAD DATA LOCAL INFILE
"inserts_test_timestamp1.txt"
REPLACE
INTO TABLE
`test_insert_timestamp`
FIELDS TERMINATED BY ';'
Note: I need to use the REPLACE option; the reason doesn't matter here.
The content of inserts_test_timestamp1.txt is:
1;2
3;4
I have another file, inserts_test_timestamp2.txt, containing:
3;4
5;6
What I want is:
if I load inserts_test_timestamp1.txt, then the field insert_timestamp is set (that part works with the code above);
if I then load inserts_test_timestamp2.txt, record (3;4) should keep the insert_timestamp it already has, while record (5;6) should get a new insert_timestamp.
But it doesn't work that way: both records end up timestamped with the same new value, instead of (3;4) keeping the old timestamp.
I'm working on a MariaDB 5.5.52 database on CentOS 7.3. I think the MariaDB version may matter, but I can't change it.
You can divide the process into two steps:
MariaDB [_]> DROP TABLE IF EXISTS
-> `temp_test_insert_timestamp`,
-> `test_insert_timestamp`;
Query OK, 0 rows affected (0.00 sec)
MariaDB [_]> CREATE TABLE IF NOT EXISTS `test_insert_timestamp` (
-> `key` integer NOT NULL,
-> `value` integer NOT NULL,
-> `insert_timestamp` timestamp DEFAULT CURRENT_TIMESTAMP,
-> PRIMARY KEY (`key`)
-> );
Query OK, 0 rows affected (0.00 sec)
MariaDB [_]> CREATE TABLE IF NOT EXISTS `temp_test_insert_timestamp` (
-> `key` integer NOT NULL,
-> `value` integer NOT NULL,
-> `insert_timestamp` timestamp DEFAULT CURRENT_TIMESTAMP,
-> PRIMARY KEY (`key`)
-> );
Query OK, 0 rows affected (0.00 sec)
MariaDB [_]> LOAD DATA LOCAL INFILE '/path/to/file/inserts_test_timestamp1.txt'
-> INTO TABLE `test_insert_timestamp`
-> FIELDS TERMINATED BY ';'
-> (`key`, `value`);
Query OK, 2 rows affected (0.00 sec)
Records: 2 Deleted: 0 Skipped: 0 Warnings: 0
MariaDB [_]> SELECT
-> `key`,
-> `value`,
-> `insert_timestamp`
-> FROM
-> `test_insert_timestamp`;
+-----+-------+---------------------+
| key | value | insert_timestamp    |
+-----+-------+---------------------+
|   1 |     2 | 2018-03-20 00:49:38 |
|   3 |     4 | 2018-03-20 00:49:38 |
+-----+-------+---------------------+
2 rows in set (0.00 sec)
MariaDB [_]> DO SLEEP(5);
Query OK, 0 rows affected (5.00 sec)
MariaDB [_]> LOAD DATA LOCAL INFILE '/path/to/file/inserts_test_timestamp2.txt'
-> INTO TABLE `temp_test_insert_timestamp`
-> FIELDS TERMINATED BY ';'
-> (`key`, `value`);
Query OK, 2 rows affected (0.00 sec)
Records: 2 Deleted: 0 Skipped: 0 Warnings: 0
MariaDB [_]> SELECT
-> `key`,
-> `value`,
-> `insert_timestamp`
-> FROM
-> `temp_test_insert_timestamp`;
+-----+-------+---------------------+
| key | value | insert_timestamp    |
+-----+-------+---------------------+
|   3 |     4 | 2018-03-20 00:49:43 |
|   5 |     6 | 2018-03-20 00:49:43 |
+-----+-------+---------------------+
2 rows in set (0.00 sec)
MariaDB [_]> REPLACE INTO `test_insert_timestamp`
-> SELECT
-> `ttit`.`key`,
-> `ttit`.`value`,
-> `tit`.`insert_timestamp`
-> FROM
-> `temp_test_insert_timestamp` `ttit`
-> LEFT JOIN `test_insert_timestamp` `tit`
-> ON `ttit`.`key` = `tit`.`key`;
Query OK, 2 rows affected (0.01 sec)
Records: 2 Duplicates: 0 Warnings: 0
MariaDB [_]> SELECT
-> `key`,
-> `value`,
-> `insert_timestamp`
-> FROM
-> `test_insert_timestamp`;
+-----+-------+---------------------+
| key | value | insert_timestamp    |
+-----+-------+---------------------+
|   1 |     2 | 2018-03-20 00:49:38 |
|   3 |     4 | 2018-03-20 00:49:38 |
|   5 |     6 | 2018-03-20 00:49:43 |
+-----+-------+---------------------+
3 rows in set (0.00 sec)
MariaDB [_]> TRUNCATE TABLE `temp_test_insert_timestamp`;
Query OK, 0 rows affected (0.00 sec)
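The reason the plain LOAD DATA ... REPLACE resets the timestamp in the first place is that REPLACE behaves like DELETE followed by INSERT: the existing row for key 3 is removed and a brand-new row is inserted, so DEFAULT CURRENT_TIMESTAMP is evaluated again. A minimal illustration outside of LOAD DATA (not from the original post):
-- Re-inserting an existing key via REPLACE discards the old row and its timestamp:
REPLACE INTO `test_insert_timestamp` (`key`, `value`) VALUES (3, 4);
-- insert_timestamp for key 3 now holds the time of the REPLACE, not the original insert.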
I implemented the solution from this post: MySQL LOAD DATA INFILE with ON DUPLICATE KEY UPDATE
This approach not only gives me the insert_timestamp, but also a field with the update_timestamp:
# --- Create temporary table ---
CREATE TEMPORARY TABLE temporary_table LIKE test_insert_timestamp;
# --- Drop indexes on the temporary table to speed up the load ---
DROP INDEX `PRIMARY` ON temporary_table;
DROP INDEX `STAMP_INDEX` ON temporary_table;
# --- Load the data into the temporary table ---
LOAD DATA LOCAL INFILE "./inserts_test_timestamp1.txt"
INTO TABLE temporary_table
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
IGNORE 1 LINES
SET
insert_timestamp = CURRENT_TIMESTAMP(),
update_timestamp = NULL
;
# --- Copy from the temporary table into the real table ---
INSERT INTO test_insert_timestamp
SELECT * FROM temporary_table
ON DUPLICATE KEY UPDATE
update_timestamp = CURRENT_TIMESTAMP();
# --- Drop the temporary table ---
DROP TEMPORARY TABLE temporary_table;
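For this snippet to run, the table presumably also has an update_timestamp column and an index named STAMP_INDEX, neither of which appears earlier in the question. A definition along these lines is assumed (my guess, not from the post):
-- Assumed table layout matching the columns and index names used above:
CREATE TABLE test_insert_timestamp (
`key` INTEGER NOT NULL,
`value` INTEGER NOT NULL,
insert_timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
update_timestamp TIMESTAMP NULL DEFAULT NULL,
PRIMARY KEY (`key`),
KEY `STAMP_INDEX` (insert_timestamp, update_timestamp)
);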
Thanks for the help!

MySQL: change HEX number to decimal in LOAD DATA LOCAL INFILE

Hi all,
I have a test.csv file with Id values as hex numbers, as below:
Id, DateTime,...
66031851, ...
2E337E4E, ...
The table table_test is created in MySQL as below:
CREATE TABLE table_test(
Id BIGINT NOT NULL,
DateTime DATETIME NOT NULL,
OtherId BIGINT NOT NULL,
...,
PRIMARY KEY (Id, DateTime, OtherId)
)ENGINE=InnoDB DEFAULT CHARSET=utf8;
The created table_test is as below:
+---------------------+--------------+------+-----+---------+-------+
| Field               | Type         | Null | Key | Default | Extra |
+---------------------+--------------+------+-----+---------+-------+
| Id                  | bigint(20)   | NO   | PRI | NULL    |       |
| DateTime            | datetime     | NO   | PRI | NULL    |       |
I am using MySQL as below to load the data into the table:
load data local infile 'test.csv' replace into table table_test character set utf8mb4 fields terminated by ',' ENCLOSED BY '\"' lines terminated by '\n' ignore 1 lines SET Id=CONV(Id, 16, 10);
Also tried:
SET Id=cast(CONV(Id, 16, 10) AS UNSIGNED)
and
SET Id=cast(CONV(CONVERT(Id,CHAR), 16, 10) AS UNSIGNED)
But hex numbers containing letters, like "2E337E4E", do not work: they become a very big number, bigger than a BIGINT. But when I try the following in MySQL:
select CONV('2E337E4E', 16, 10);
It works as expected, with the correct result "775126606". So I think I'm missing a step in LOAD DATA to pass Id as a string to CONV(). I've searched for some time but did not find a solution.
Does anyone have an idea or hint?
Thanks very much
Zhihong
The typical solution for this type of problem is to load the value into a user-defined variable, then do the conversion in the SET clause.
Something like this should work for you:
load data local infile 'test.csv'
replace into table table_test
character set utf8mb4
fields terminated by ','
ENCLOSED BY '\"'
lines terminated by '\n'
ignore 1 lines
(@Id, `DateTime`, <explicitly list all other columns>)
SET Id = CONV(@Id, 16, 10);
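A likely explanation for what the question observed (my reading, not stated in the answer): without the user variable, the raw field is coerced to a number before CONV ever sees it, and a value like 2E337E4E is read as the scientific-notation prefix 2E337, which is far beyond BIGINT range. You can see the coercion directly:
-- Illustration only: how MySQL coerces the raw text to a number.
SELECT '2E337E4E' + 0;
-- -> 2e337 ('2E337' is parsed as scientific notation), which overflows BIGINT
SELECT CONV('2E337E4E', 16, 10);
-- -> 775126606, as expected when CONV receives the original string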

Loading CSV into MySQL selecting columns

I am trying to learn how to use MySQL efficiently. Now I want to load a CSV containing an author's bibliography into a MySQL database. This is the code I have for creating the table and trying to load the file:
USE stephenkingbooks;
DROP TABLE IF EXISTS stephenkingbooks;
CREATE TABLE stephenkingbooks
(
`id` int unsigned NOT NULL auto_increment,
`original_title` varchar(255) NOT NULL,
`spanish_title` varchar(255) NOT NULL,
`year` decimal(4) NOT NULL,
`pages` decimal(10) NOT NULL,
`in_collection` enum('Y','N') NOT NULL DEFAULT 'N',
`read` enum('Y','N') NOT NULL DEFAULT 'N',
PRIMARY KEY (id)
);
LOAD DATA LOCAL INFILE '../files/unprocessed_sking.csv'
INTO TABLE stephenkingbooks (column1, column2, column4, column3)
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;
The csv file is format like this:
Carrie,Carrie,Terror,199,19745,"En 1976, el director de cine Brian de Palma hizo la primera película basada en la novela.7 3"
My idea is to load only the first two columns as they are: the original_title and the spanish_title (they are the same in MySQL and in the CSV), and then map the pages column and the year column of the CSV onto the pages and year columns of the table.
In addition, for the year column I only want to take the first 4 digits of the field, because some values carry a reference that is not part of the year. For example, Carrie was released in 1974, but the CSV contains 19745, with a trailing 5 that I do not want to keep.
My problem is that I cannot get what I want without errors in my terminal... any suggestions?
13.2.6 LOAD DATA INFILE Syntax
...
You must also specify a column list if the order of the fields in the
input file differs from the order of the columns in the table.
...
Try:
mysql> LOAD DATA INFILE '../files/unprocessed_sking.csv'
-> INTO TABLE `stephenkingbooks`
-> FIELDS TERMINATED BY ','
-> ENCLOSED BY '"'
-> LINES TERMINATED BY '\r\n'
-> (`original_title`, `spanish_title`, @`genre`, @`pages`, @`year`)
-> SET `year` = LEFT(@`year`, 4), `pages` = @`pages`;
Query OK, 1 row affected (0.00 sec)
Records: 1 Deleted: 0 Skipped: 0 Warnings: 0
mysql> SELECT
-> `id`,
-> `original_title`,
-> `spanish_title`,
-> `year`,
-> `pages`,
-> `in_collection`,
-> `read`
-> FROM `stephenkingbooks`;
+----+----------------+---------------+------+-------+---------------+------+
| id | original_title | spanish_title | year | pages | in_collection | read |
+----+----------------+---------------+------+-------+---------------+------+
|  1 | Carrie         | Carrie        | 1974 |   199 | N             | N    |
+----+----------------+---------------+------+-------+---------------+------+
1 row in set (0.00 sec)
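One difference from the question: the answer uses a server-side LOAD DATA INFILE. If the CSV lives on the client machine, as the original attempt suggests, the same statement with LOCAL should work too (assuming local_infile is enabled on both client and server):
LOAD DATA LOCAL INFILE '../files/unprocessed_sking.csv'
INTO TABLE `stephenkingbooks`
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(`original_title`, `spanish_title`, @`genre`, @`pages`, @`year`)
SET `year` = LEFT(@`year`, 4), `pages` = @`pages`;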

Set timestamp on insert when loading CSV [duplicate]

This question already has answers here:
How can i add date as auto update when import data from csv file?
(2 answers)
Closed 9 years ago.
I have a Timestamp field that is defined to be automatically updated with the CURRENT_TIMESTAMP value.
It works fine when I fire a query, but when I import a CSV (which I'm forced to do since one of the fields is longtext), the update does not work.
I have tried to:
give the timestamp column as the now() function in the CSV
manually enter a timestamp like 2013-08-08 in the CSV
Neither approach works.
From what I gather after your question update, you're actually updating rows using a CSV, and you expect the ON UPDATE clause to refresh your timestamp field.
Sadly, when loading a CSV into a database you're not updating but inserting data, overwriting existing records. At least, that's the behaviour with a LOCAL INFILE: if the INFILE isn't local, duplicates produce an error and the query stops, whereas with a local file those errors become warnings and the operation continues.
If this isn't the case for you, perhaps consider following one of the examples on the doc pages:
LOAD DATA INFILE 'your.csv'
INTO TABLE tbl
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(field_name1, field_name2, field_name3)
SET updated = NOW();
Just in case you can't/won't/forget to add additional information, loading a CSV into a MySQL table is quite easy:
LOAD DATA
LOCAL INFILE '/path/to/file/filename1.csv'
INTO TABLE db.tbl
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(`field_name1`,`field_name2`,`field_name3`)
If you create a table along the lines of:
CREATE TABLE tbl(
id INT AUTO_INCREMENT PRIMARY KEY, -- since your previous question mentioned auto-increment
field_name1 VARCHAR(255) NOT NULL, -- normal fields
field_name2 INTEGER(11) NOT NULL,
field_name3 VARCHAR(255) NOT NULL DEFAULT '',
-- when not specified, this field will receive current_timestamp as value:
inserted TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
-- if a row is updated, this field will hold the timestamp of the update:
updated TIMESTAMP NOT NULL DEFAULT 0
ON UPDATE CURRENT_TIMESTAMP
)ENGINE = INNODB
CHARACTER SET utf8 COLLATE utf8_general_ci;
This query is untested, so please be careful with it; it's just to give a general idea of what you need to do to get the insert timestamp in there.
This example table will work like so:
> INSERT INTO tbl (field_name1, field_name2) VALUES ('foobar', 123);
> SELECT * FROM tbl WHERE field_name1 = 'foobar' AND field_name2 = 123;
This will show:
+----+-------------+-------------+-------------+---------------------+---------------------+
| id | field_name1 | field_name2 | field_name3 | inserted            | updated             |
+----+-------------+-------------+-------------+---------------------+---------------------+
|  1 | foobar      |         123 |             | 2013-08-07 00:00:00 | 0000-00-00 00:00:00 |
+----+-------------+-------------+-------------+---------------------+---------------------+
As you can see, because we didn't explicitly insert values into the last three fields, MySQL used their DEFAULT values. For field_name3 an empty string was used; for inserted the default was CURRENT_TIMESTAMP; for updated the default value was 0, which, because the field type is TIMESTAMP, is represented as 0000-00-00 00:00:00. If you were to run the following query next:
UPDATE tbl
SET field_name3 = 'an update'
WHERE field_name1 = 'foobar'
AND field_name2 = 123
AND id = 1;
The row would look like this:
+----+-------------+-------------+-------------+---------------------+---------------------+
| id | field_name1 | field_name2 | field_name3 | inserted            | updated             |
+----+-------------+-------------+-------------+---------------------+---------------------+
|  1 | foobar      |         123 | an update   | 2013-08-07 00:00:00 | 2013-08-07 00:00:20 |
+----+-------------+-------------+-------------+---------------------+---------------------+
That's all. Some basics can be found on mysqltutorial.org, but it's best to keep the official manual ready; it's not bad once you get used to it.
Perhaps this question might be worth a quick peek, too.

Importing CSV using LOAD DATA INFILE quote problem

I'm trying to load a CSV file that I exported from Excel into my database, and I can't seem to get the formatting correct no matter what I try.
Here is the SQL:
LOAD DATA INFILE 'path/file.csv'
INTO TABLE tbl_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(column1, column2, column3);
This works fine but then I run into trouble when the end of a line (column 3) ends in a quote. For example:
Actual value: These are "quotes"
Value in CSV: "These are ""quotes"""
What happens is that I get an extra quote on that value in the database, and it also swallows the following lines until it reaches another quote in the CSV. Any ideas on how to solve this?
Hmm. I tried to duplicate this problem but can't. Where does my data differ from yours? Can you provide sample data to duplicate this? Here's what I did:
> cat /tmp/data.csv
"aaaa","bbb ""ccc"" ddd",xxx
xxx,yyy,"zzz ""ooo"""
foo,bar,baz
mysql> CREATE TABLE t2 (a varchar(20), b varchar(20), c varchar(20));
Query OK, 0 rows affected (0.01 sec)
mysql> LOAD DATA INFILE '/tmp/data.csv' INTO TABLE t2 FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n' (a, b, c);
Query OK, 3 rows affected (0.00 sec)
Records: 3 Deleted: 0 Skipped: 0 Warnings: 0
mysql> select * from t2;
+------+---------------+-----------+
| a    | b             | c         |
+------+---------------+-----------+
| aaaa | bbb "ccc" ddd | xxx       |
| xxx  | yyy           | zzz "ooo" |
| foo  | bar           | baz       |
+------+---------------+-----------+
3 rows in set (0.00 sec)
Looks ok to me(?)
Also note that if you're working on a Windows platform you might need to use LINES TERMINATED BY '\r\n' instead.
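For example, for a file saved with Windows line endings the same test statement becomes (only the line terminator changes; the path, table and columns are the ones from above):
LOAD DATA INFILE '/tmp/data.csv' INTO TABLE t2
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(a, b, c);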