Export MySQL to CSV, some columns with quotes and some without

I am exporting a MySQL table and I want to export the integer type columns without double quotes but the varchar type columns with double quotes. I need to do this to have the correct formatting for the next step in my work. Can this be done in MySQL? I know I could probably do this in a Python script, but the CSV files are pretty large (>1 GB), so I think it might take a while to do that. Anyway, is this possible using MySQL queries?
Here's my current export script format:
SELECT
'column_name_1',
'column_name_2',
...
'column_name_n'
UNION ALL
SELECT *
FROM table
INTO OUTFILE 'table.csv'
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
If it helps, here is the table (more importantly, the types involved) I am trying to export:
+---------+------------------+------+-----+---------+-------+
| Field   | Type             | Null | Key | Default | Extra |
+---------+------------------+------+-----+---------+-------+
| field_1 | int(10) unsigned | NO   | MUL | 0       |       |
| field_2 | int(10) unsigned | NO   | MUL | NULL    |       |
| field_3 | int(10) unsigned | NO   |     | NULL    |       |
| field_4 | char(1)          | NO   |     | NULL    |       |
| field_5 | int(10) unsigned | NO   |     | NULL    |       |
| field_6 | varchar(4)       | NO   |     |         |       |
| field_7 | char(1)          | NO   |     | Y       |       |
| field_8 | varchar(20)      | NO   |     |         |       |
| field_9 | varchar(200)     | NO   |     |         |       |
+---------+------------------+------+-----+---------+-------+
EDIT 1: I tried OPTIONALLY ENCLOSED BY '"' as suggested in an answer, but when I add that to the script, it double quotes every column, not just the string (or varchar) columns. Any idea why it might do this?

Use the OPTIONALLY ENCLOSED BY clause.
SELECT *
FROM table
INTO OUTFILE 'table.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
The OPTIONALLY modifier makes it do this only for string columns.
You also need to leave out the SELECT that produces the header line. All rows of a UNION must have the same column types, so the numeric columns get converted to strings to match the header row, and then they get quoted as well.
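As a concrete sketch (mine, not from the answer above) applied to the table in the question, the export without the header row would look roughly like this; note that OPTIONALLY ENCLOSED BY also quotes the char(1) columns (field_4, field_7), not only the varchar ones:
-- 'table' and 'table.csv' are the placeholder names used in the question
SELECT field_1, field_2, field_3, field_4, field_5,
       field_6, field_7, field_8, field_9
FROM table
INTO OUTFILE 'table.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';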

Related

Export MySQL json data type field directly to CSV

Consider a MySQL database (8.x) with a JSON field:
mysql> desc users_health;
+------------+--------------+------+-----+---------+----------------+
| Field      | Type         | Null | Key | Default | Extra          |
+------------+--------------+------+-----+---------+----------------+
| id         | int unsigned | NO   | PRI | NULL    | auto_increment |
| user_id    | int          | NO   |     | NULL    |                |
| data       | json         | YES  |     | NULL    |                |
| created_at | timestamp    | YES  |     | NULL    |                |
| updated_at | timestamp    | YES  |     | NULL    |                |
+------------+--------------+------+-----+---------+----------------+
In this answer there is a way to export columns to CSV fields:
How to output MySQL query results in CSV format?
What I would like to achieve is to export ONLY the data in the data column to CSV. This data is well organized and composed of key-value pairs like so:
{"email": "x#example.com", "user_id": 100, "ivr_used": false, "ivr_enabled": true, "callerids_used": true, "call-queue_used": false ...}
I would like that to be exported to CSV file looking like so:
+---------------+---------+----------+-------------+-----+
| email         | user_id | ivr_used | ivr_enabled | ... |
+---------------+---------+----------+-------------+-----+
| x#example.com | 100     | false    | true        | ... |
| y#example.com | 101     | true     | true        | ... |
+---------------+---------+----------+-------------+-----+
....
Is this even possible using MySQL-only solution, or do I have to fetch the data and process it somewhere else?
A partially valid solution for me would be the possibility of exporting the data to a JSON file.
Well, it's a bit like the data being in ordinary columns: you have to extract each value you want from each row's data column.
SELECT JSON_EXTRACT(data, '$.email') AS email,
       JSON_EXTRACT(data, '$.user_id') AS user_id,
       ...
FROM users_health
WHERE foo = 'bar'
INTO OUTFILE '/var/lib/mysql-files/orders.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
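One caveat worth adding (my note, not from the answer): JSON_EXTRACT() returns string values still wrapped in their JSON double quotes, so email would come out as "x#example.com" rather than the bare address. If that is unwanted, the ->> operator (shorthand for JSON_UNQUOTE(JSON_EXTRACT(...)), available in MySQL 5.7.13+ and 8.x) strips the quoting. A minimal sketch; the output file name and the keys beyond email/user_id are placeholders of mine:
-- placeholder keys and file name; adjust to the actual JSON structure
SELECT data->>'$.email'       AS email,
       data->>'$.user_id'     AS user_id,
       data->>'$.ivr_used'    AS ivr_used,
       data->>'$.ivr_enabled' AS ivr_enabled
FROM users_health
INTO OUTFILE '/var/lib/mysql-files/users_health.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';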

Import CSV table and skip first column from mysql table

I have a table like this:
| Field | Type        | Null | Key | Default | Extra          |
+-------+-------------+------+-----+---------+----------------+
| my_id | int(5)      | NO   | PRI | NULL    | auto_increment |
| col1  | varchar(20) | YES  |     | NULL    |                |
| col2  | varchar(20) | YES  |     | NULL    |                |
| col3  | varchar(20) | YES  |     | NULL    |                |
and I want to import a CSV file into this table. The problem is that the CSV doesn't have a my_id column, so I need to insert the data starting from the 2nd column (col1), since the first must be a kind of running count for each row (the auto_increment).
Edit: I just did the basic import and MySQL "removed" my first column from the CSV:
LOAD DATA LOCAL INFILE "/home/bruno/myfield.csv"
INTO TABLE teste
FIELDS TERMINATED BY ','
ESCAPED BY ''
LINES TERMINATED BY '\n';
When I import I got:
| my_id | col2        | col3 | col4            |
+-------+-------------+------+-----------------+
| 1     | lorem ipsum | SER1 | testingmyfield4 |
| 2     | dolor emet  | SER1 | testingmyfield4 |
You can use a SQL query to get the columns and save them in a file, like the query below:
SELECT col1, col2, .. FROM tablename
INTO OUTFILE 'location where you want to save the file';
Or you can try the tools available on the internet, like SQLyog or MySQL Workbench.
Hope you find this answer helpful.
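For the import direction the question actually asks about, a sketch (assuming the CSV fields map to col1, col2, col3 in order, and reusing the file and table names from the edit) is to list the target columns explicitly, so my_id is left to the auto_increment:
LOAD DATA LOCAL INFILE '/home/bruno/myfield.csv'
INTO TABLE teste
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(col1, col2, col3);  -- my_id omitted, so it auto-increments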

mySQL Command Line Import CSV Gives me NULL

I am used to using phpMyAdmin to manage my MySQL databases, but I am starting to use the command line a lot more. I am trying to import a CSV file into a table called source_data that looks like this...
+-----------+-------------+------+-----+---------+----------------+
| Field     | Type        | Null | Key | Default | Extra          |
+-----------+-------------+------+-----+---------+----------------+
| id        | int(11)     | NO   | PRI | NULL    | auto_increment |
| code      | varchar(10) | YES  |     | NULL    |                |
| result    | char(1)     | YES  |     | NULL    |                |
| source    | char(1)     | YES  |     | NULL    |                |
| timestamp | varchar(30) | YES  |     | NULL    |                |
+-----------+-------------+------+-----+---------+----------------+
And my CSV file looks like this...
code,result,source,timestamp
123 ABC,,,
456 DEF,,,
789 GHI,,,
234 JKL,,,
567 MNO,,,
890 PQR,,,
I am using this command..
LOAD DATA INFILE '/home/user1/data.csv' INTO TABLE source_data FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 ROWS;
This inserts the correct number of rows, but each one just says NULL. Where am I going wrong?
Since the CSV file doesn't have all the table columns (it's missing the id column), you need to specify explicitly which columns the fields should be written into.
LOAD DATA INFILE '/home/user1/data.csv'
INTO TABLE source_data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(code, result, source, timestamp);
Well, I hope this isn't still an issue, but you said the table name is source_data, while in the command the name used is data.

Load data from CSV to MySQL

I am trying to insert into a MySQL table from a CSV file using the following command on Linux:
LOAD DATA LOCAL INFILE '\/home\/abc\/mapping.csv' INTO TABLE db1.ackno_rollno_mapping FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
The CSV file is like:
166 D/O-5
208 W/O-8
230
231
236 W/O-9
245 W/O-10
8604 P/O-142
8623 W/O-730
8629 W/O-731
Table structure is:
mysql> desc db1.ackno_rollno_mapping;
+--------+-------------+------+-----+---------+-------+
| Field  | Type        | Null | Key | Default | Extra |
+--------+-------------+------+-----+---------+-------+
| ackNo  | varchar(45) | NO   | PRI | NULL    |       |
| rollNo | varchar(45) | YES  |     | NULL    |       |
+--------+-------------+------+-----+---------+-------+
2 rows in set (0.00 sec)
The records are inserted, but some parts of both ackNo and rollNo are missing.
Like:
| 58 | W/O-728
| 67 | W/O-729
| 04 | P/O-142
| 23 | W/O-730
| 29 | W/O-731
I tried changing the datatype of ackNo from varchar(45) to integer, but still the same issue...
Am I missing something?
Thanks in advance.
--------EDIT---------
My CSV file is like:
13,W/O-1
14,W/O-2
20,P/O-1
60,D/O-1
61,W/O-3
62,W/O-4
I also tried the query below (removing the ENCLOSED BY '"'):
LOAD DATA LOCAL INFILE '\/home\/abc\/ackno_rollno_mapping.csv' INTO TABLE db1.ackno_rollno_mapping FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
But the issue is still the same. I noticed that the first digit of ackNo is removed in the table.
mysql> select * from db1.ackno_rollno_mapping;
+-------+-----------+
| ackNo | rollNo |
+-------+-----------+
| 3 | W/O-1
| 4 | W/O-2
| 0 | P/O-1
| 0 | D/O-1
| 1 | W/O-3
| 2 | W/O-4
Please advise.
Thanks
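No fix is shown above, but the garbled display (rows losing their closing | and the leading digit apparently cut off) is the classic symptom of a stray carriage return left at the end of each line. If the CSV was produced on Windows, a guess (mine, not from the thread) would be to match the actual line terminator:
LOAD DATA LOCAL INFILE '/home/abc/ackno_rollno_mapping.csv'
INTO TABLE db1.ackno_rollno_mapping
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n';  -- assumes Windows-style \r\n line endings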

MySql file import (LOAD DATA LOCAL INFILE)

I have a table called city:
+------------+--------------+------+-----+---------+----------------+
| Field      | Type         | Null | Key | Default | Extra          |
+------------+--------------+------+-----+---------+----------------+
| id         | bigint(20)   | NO   | PRI | NULL    | auto_increment |
| country_id | mediumint(9) | NO   | MUL | NULL    |                |
| region_id  | bigint(20)   | NO   | MUL | NULL    |                |
| city       | varchar(45)  | NO   |     | NULL    |                |
| latitude   | float(18,2)  | NO   |     | NULL    |                |
| longitude  | float(18,2)  | NO   |     | NULL    |                |
| timezone   | varchar(10)  | NO   |     | NULL    |                |
| dma_id     | mediumint(9) | YES  |     | NULL    |                |
| code       | varchar(4)   | YES  |     | NULL    |                |
+------------+--------------+------+-----+---------+----------------+
I have a simple file (just a test file) to import:
"id","country_id","region_id","city","latitude","longitude","timezone","dma_id","code"
42231,1,833,"Herat","34.333","62.2","+04:30",0,"HERA"
5976,1,835,"Kabul","34.517","69.183","+04:50",0,"KABU"
42230,1,852,"Mazar-e Sharif","36.7","67.1","+4:30",0,"MSHA"
42412,2,983,"Korce","40.6162","20.7779","+01:00",0,"KORC"
5977,2,1011,"Tirane","41.333","19.833","+01:00",0,"TIRA"
5978,3,856,"Algiers","36.763","3.051","+01:00",0,"ALGI"
5981,3,858,"Skikda","36.879","6.907","+01:00",0,"SKIK"
5980,3,861,"Oran","35.691","-0.642","+01:00",0,"ORAN"
I run this command:
LOAD DATA LOCAL INFILE 'cities_test.txt' INTO TABLE city FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;
Output:
Query OK, 0 rows affected (0.00 sec)
Records: 0 Deleted: 0 Skipped: 0 Warnings: 0
No records are inserted and I don't know why.
Any ideas?
Thanks!
Jamie
Worked it out. Silly mistake.
Had to change this:
LINES TERMINATED BY '\r\n'
To this:
LINES TERMINATED BY '\n'
:-)
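Putting that change back into the original command, the working statement would presumably be:
LOAD DATA LOCAL INFILE 'cities_test.txt'
INTO TABLE city
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;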
I had the same problem, but try this: erase the first row
`("id","country_id","region_id","city","latitude","longitude",
"timezone","dma_id","code")` in your file to import.
Now when you run the command, write it like this:
mysql> LOAD DATA LOCAL
INFILE 'cities_test.txt'
INTO TABLE city FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
And that is all.
It worked for me :D
I had the same issue on a Mac.
Try this if you are using a Mac:
LOAD DATA INFILE 'sqlScript1.txt' INTO TABLE USER
FIELDS TERMINATED BY ',' LINES STARTING BY '\r';
For me, what worked on a Mac was:
LOAD DATA LOCAL
INFILE 'cities_test.txt'
INTO TABLE city FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r';
Since Macs use a carriage return for their line breaks, you must use '\r'.