Import CSV and skip first column of a MySQL table - mysql

I have a table like this:
+------------+-------------+------+-----+---------+----------------+
| Field      | Type        | Null | Key | Default | Extra          |
+------------+-------------+------+-----+---------+----------------+
| my_id      | int(5)      | NO   | PRI | NULL    | auto_increment |
| col1       | varchar(20) | YES  |     | NULL    |                |
| col2       | varchar(20) | YES  |     | NULL    |                |
| col3       | varchar(20) | YES  |     | NULL    |                |
+------------+-------------+------+-----+---------+----------------+
and I want to import a CSV file into this table. The problem is that the CSV doesn't have a my_id column, so I need to insert the data beginning at the 2nd column (col1), as the first must be a kind of counter for each row.
Edit: I just did the basic import and MySQL "removed" the first column of my CSV:
LOAD DATA LOCAL INFILE "/home/bruno/myfield.csv"
INTO TABLE teste
FIELDS TERMINATED BY ','
ESCAPED BY ''
LINES TERMINATED BY '\n';
When I import I get:
+------------+----------------------+------------------+-----------------+
| my_id      | col1                 | col2             | col3            |
+------------+----------------------+------------------+-----------------+
| 1          | lorem ipsum          | SER1             | testingmyfield4 |
| 2          | dolor emet           | SER1             | testingmyfield4 |
+------------+----------------------+------------------+-----------------+

You can use an SQL query to get the columns and save them to a file, like the query below:
SELECT col1,col2,.. FROM tablename INTO OUTFILE
'location where you want to save the file'
Or you can try one of the tools available on the internet, such as SQLyog or MySQL Workbench.
Hope you find this answer helpful.
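For the import itself, the usual fix is to name the target columns so the auto-increment my_id column is left out of the load entirely. A minimal sketch (untested, reusing the question's file and table names, and assuming the file has exactly three fields per line):
LOAD DATA LOCAL INFILE '/home/bruno/myfield.csv'
INTO TABLE teste
FIELDS TERMINATED BY ','
ESCAPED BY ''
LINES TERMINATED BY '\n'
(col1, col2, col3);  -- my_id is omitted from the list, so it auto-increments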

Related

Export MySQL json data type field directly to CSV

Consider a MySQL database (8.x) with a JSON field:
mysql> desc users_health;
+------------+--------------+------+-----+---------+----------------+
| Field      | Type         | Null | Key | Default | Extra          |
+------------+--------------+------+-----+---------+----------------+
| id         | int unsigned | NO   | PRI | NULL    | auto_increment |
| user_id    | int          | NO   |     | NULL    |                |
| data       | json         | YES  |     | NULL    |                |
| created_at | timestamp    | YES  |     | NULL    |                |
| updated_at | timestamp    | YES  |     | NULL    |                |
+------------+--------------+------+-----+---------+----------------+
In this answer there is a way to export columns to CSV fields:
How to output MySQL query results in CSV format?
What I would like to achieve is to export ONLY the data in the data column to CSV. This data is well organized and composed of key-value pairs like so:
{"email": "x@example.com", "user_id": 100, "ivr_used": false, "ivr_enabled": true, "callerids_used": true, "call-queue_used": false ...}
I would like that to be exported to a CSV file looking like so:
+---------------+---------+----------+-------------+-----+
| email         | user_id | ivr_used | ivr_enabled | ... |
+---------------+---------+----------+-------------+-----+
| x@example.com | 100     | false    | true        | ... |
| y@example.com | 101     | true     | true        | ... |
+---------------+---------+----------+-------------+-----+
...
Is this even possible using a MySQL-only solution, or do I have to fetch the data and process it somewhere else?
A partially valid solution for me would be the possibility of exporting the data to a JSON file.
Well, it's a bit like if the data were in columns: you have to extract each value you want from each row's data column.
SELECT JSON_EXTRACT(data, '$.email') as email,
JSON_EXTRACT(data, '$.user_id') as user_id,
. . .
FROM users_health
WHERE foo = 'bar'
INTO OUTFILE '/var/lib/mysql-files/orders.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
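One caveat: JSON_EXTRACT() returns JSON values, so strings land in the file wrapped in double quotes. JSON_UNQUOTE(), or the ->> shorthand available since MySQL 5.7, strips them. A sketch against the same table (output path assumed):
SELECT data->>'$.email' AS email,
       data->>'$.user_id' AS user_id
FROM users_health
INTO OUTFILE '/var/lib/mysql-files/users_health.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';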

Import CSV using LOAD DATA getting wrong values

I have a big CSV (nearly 100 MB) that I would like to import into a table with the following structure:
+-------------+------------------+------+-----+---------+----------------+
| Field       | Type             | Null | Key | Default | Extra          |
+-------------+------------------+------+-----+---------+----------------+
| id          | int(11) unsigned | NO   | PRI | NULL    | auto_increment |
| cep         | varchar(255)     | YES  | MUL | NULL    |                |
| site        | text             | YES  |     | NULL    |                |
| cidade      | text             | YES  |     | NULL    |                |
| uf          | text             | YES  |     | NULL    |                |
| cepbase     | text             | YES  |     | NULL    |                |
| segmentacao | text             | YES  |     | NULL    |                |
| area        | text             | YES  |     | NULL    |                |
| cepstatus   | int(1)           | YES  |     | NULL    |                |
| score       | int(11)          | NO   |     | NULL    |                |
| fila        | int(11)          | NO   |     | NULL    |                |
+-------------+------------------+------+-----+---------+----------------+
I was about to write some code to do the import, but then I found a MySQL command that does the job for me. So I wrote the following:
LOAD DATA LOCAL INFILE '/Users/user/Downloads/base.csv'
INTO TABLE cep_status_new
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS
(@id,@cep,@site,@cidade,@uf,@cepbase,@segmentacao,@area,@cepstatus,@score,@fila)
SET id=NULL, cep=@col1, site='GOD', cidade=@col6, uf=@col7, cepbase='-', segmentacao=@col9, cepstatus=@col2, area='BING', score=99999, fila=5;
To try out this code, I removed thousands of lines from my CSV and left only 2 lines: the header and one example row:
cep,status,gang,bang,random,mock,awesome,qwert,hero
01019000,0,00387,00388,3550308,SAO PAULO,SP,011,B2
The code runs without problems, but the inserted row is pretty strange:
mysql> select * from cep_status_new;
+----+------+------+--------+---------+---------+-------------+------+-----------+-------+------+
| id | cep  | site | cidade | uf      | cepbase | segmentacao | area | cepstatus | score | fila |
+----+------+------+--------+---------+---------+-------------+------+-----------+-------+------+
|  1 | 1    | GOD  | 24655  | 3554805 | -       | SP          | BING | 0         | 99999 | 5    |
+----+------+------+--------+---------+---------+-------------+------+-----------+-------+------+
1 row in set (0.01 sec)
Why aren't the values from the CSV being filled in correctly?
According to this specification, the column list after IGNORE 1 ROWS decides how the columns of the CSV file are mapped to columns of the table. It can either list the table columns in the order of the file, or it can load the file columns into variables. With the column list
(@id,@cep,@site,@cidade,@uf,@cepbase,@segmentacao,@area,@cepstatus,@score,@fila)
you are loading the columns of the CSV file into variables named "id", "cep", etc. In the SET clause you then need to declare how the columns of the table are constructed from the variables. The given statement refers to variables @col1 etc. that are not defined anywhere and consequently have undefined values.
The corrected statement (that I sadly can't test myself right now) should be:
LOAD DATA LOCAL INFILE '/Users/user/Downloads/base.csv'
INTO TABLE cep_status_new
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS
(@col1,@col2,@col3,@col4,@col5,@col6,@col7,@col8,@col9)
SET id=NULL, cep=@col1, site='GOD', cidade=@col6, uf=@col7, cepbase='-', segmentacao=@col9, cepstatus=@col2, area='BING', score=99999, fila=5;
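As a side note, file columns that need no transformation can be named directly in the column list, keeping variables only for the fields being skipped. A hedged equivalent of the statement above (same assumed file layout; @skip is a throwaway variable):
LOAD DATA LOCAL INFILE '/Users/user/Downloads/base.csv'
INTO TABLE cep_status_new
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS
(cep, cepstatus, @skip, @skip, @skip, cidade, uf, @skip, segmentacao)
SET site='GOD', cepbase='-', area='BING', score=99999, fila=5;
Since id is omitted from the column list, it takes its implicit default and auto-increments.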

Add rows into a mysql table from a .sql file

I have a question on how I can insert a .sql file into a MySQL table which already contains a lot of data.
My .sql file looks like this (1200 rows):
--
-- Descriptif plan comptable SYSCOHADA (utf-8)
--
INSERT INTO llx_accounting_system (rowid, pcg_version, fk_pays, label, active) VALUES (10,'SYSCOHADA', 49, 'Plan comptable Ouest-Africain', 1);
INSERT INTO llx_accounting_account (rowid, fk_pcg_version, pcg_type, pcg_subtype, account_number, account_parent, label, active) VALUES (15000,'SYSCOHADA','CAPITAUX','XXXXXX','1',0,"Capital",'1');
INSERT INTO llx_accounting_account (rowid, fk_pcg_version, pcg_type, pcg_subtype, account_number, account_parent, label, active) VALUES (15001,'SYSCOHADA','CAPITAUX','XXXXXX','101',15000,"Capital social",'1');
INSERT INTO llx_accounting_account (rowid, fk_pcg_version, pcg_type, pcg_subtype, account_number, account_parent, label, active) VALUES (15002,'SYSCOHADA','CAPITAUX','XXXXXX','1011',15001,"Capital souscrit, non appele",'1');
My MySQL table looks like :
mysql> describe llx_accounting_account;
+----------------+--------------+------+-----+-------------------+-----------------------------+
| Field          | Type         | Null | Key | Default           | Extra                       |
+----------------+--------------+------+-----+-------------------+-----------------------------+
| rowid          | int(11)      | NO   | PRI | NULL              | auto_increment              |
| entity         | int(11)      | NO   |     | 1                 |                             |
| datec          | datetime     | YES  |     | NULL              |                             |
| tms            | timestamp    | NO   |     | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
| fk_pcg_version | varchar(32)  | NO   | MUL | NULL              |                             |
| pcg_type       | varchar(20)  | NO   |     | NULL              |                             |
| pcg_subtype    | varchar(20)  | NO   |     | NULL              |                             |
| account_number | varchar(32)  | NO   | MUL | NULL              |                             |
| account_parent | varchar(32)  | YES  |     | NULL              |                             |
| label          | varchar(255) | NO   |     | NULL              |                             |
| fk_user_author | int(11)      | YES  |     | NULL              |                             |
| fk_user_modif  | int(11)      | YES  |     | NULL              |                             |
| active         | tinyint(4)   | NO   |     | 1                 |                             |
+----------------+--------------+------+-----+-------------------+-----------------------------+
13 rows in set (0.00 sec)
My MySQL table is not empty; there is already data in it, and I want to append the rows from my .sql file to the table.
I didn't execute this command because I think it's wrong:
LOAD DATA LOCAL INFILE 'data_3.9.sql'
INTO TABLE llx_accounting_account
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\n'
Do you have the solution ?
Thank you :)
----------------------------------------------------------------------------------
Solution:
With the comments by @RakeshKumar and @PaulF, I found a way to solve my problem:
1) I deleted all rows where fk_pcg_version = 'SYSCOHADA':
delete from llx_accounting_account where fk_pcg_version = 'SYSCOHADA';
2) I imported the .sql file:
mysql -u root -p****** dolibarr < data_3.9.sql
3) I fixed one value, because account_number was 1 instead of 10 where rowid = 15000:
UPDATE llx_accounting_account SET account_number = 10 WHERE rowid=15000;
Seems good :)
Thank you ;)
Use the following command to import the file:
mysql -u username -p'password' dbname < filename.sql
Your import didn't work because the same old SYSCOHADA rows were already in your table.
You can delete all rows where fk_pcg_version = 'SYSCOHADA' and import your corrected file again.
I had the same error importing data from a SQL file that was created by a mysqldump of a table to a file. The reason was that the SQL file had the DROP TABLE IF EXISTS and CREATE TABLE statements at the start.
Once those were removed, it effectively appended the rows to the existing records.
Maybe this will help others.
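If you control the dump itself, mysqldump can also be told to leave that preamble out: --no-create-info suppresses the CREATE TABLE statements, and --skip-add-drop-table the DROP TABLE IF EXISTS lines, so the file contains only INSERTs and appends cleanly. A sketch, reusing the database and table names from this thread:
mysqldump -u root -p --no-create-info --skip-add-drop-table dolibarr llx_accounting_account > data_3.9.sql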

MySQL Command Line Import CSV Gives Me NULL

I am used to using phpMyAdmin to manage my MySQL databases, but I am starting to use the command line a lot more. I am trying to import a CSV file into a table called source_data that looks like this...
+-----------+-------------+------+-----+---------+----------------+
| Field     | Type        | Null | Key | Default | Extra          |
+-----------+-------------+------+-----+---------+----------------+
| id        | int(11)     | NO   | PRI | NULL    | auto_increment |
| code      | varchar(10) | YES  |     | NULL    |                |
| result    | char(1)     | YES  |     | NULL    |                |
| source    | char(1)     | YES  |     | NULL    |                |
| timestamp | varchar(30) | YES  |     | NULL    |                |
+-----------+-------------+------+-----+---------+----------------+
And my CSV file looks like this...
code,result,source,timestamp
123 ABC,,,
456 DEF,,,
789 GHI,,,
234 JKL,,,
567 MNO,,,
890 PQR,,,
I am using this command...
LOAD DATA INFILE '/home/user1/data.csv' INTO TABLE source_data FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 ROWS;
This inserts the correct number of rows, but every value just says NULL. Where am I going wrong?
Since the CSV file doesn't have all the table columns (it's missing the id column), you need to specify explicitly which columns the fields should be written into.
LOAD DATA INFILE '/home/user1/data.csv'
INTO TABLE source_data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(code, result, source, timestamp);
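If values still come out as NULL, a quick diagnostic (standard MySQL, nothing specific to this table) is to list the per-row warnings immediately after the load:
SHOW WARNINGS;  -- e.g. "Row 1 doesn't contain data for all columns"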
Well, I hope this isn't still the case, but you said the table name is source_data and in the command the name is data.

MySQL CSV Import Fails - "Data too long for column 'air_date' at row 1"

I'm trying to import a CSV file into a MySQL table and I'm having all kinds of trouble getting it to work. Here's what I'm trying to do:
I am working on a video database and have an existing table with data already in it called episodes. Here's how it's set up:
+--------------+-----------------------+------+-----+-------------------+-----------------------------+
| Field        | Type                  | Null | Key | Default           | Extra                       |
+--------------+-----------------------+------+-----+-------------------+-----------------------------+
| title        | varchar(40)           | NO   | MUL | NULL              |                             |
| media_id     | varchar(11)           | NO   |     | NULL              |                             |
| ep_info      | varchar(75)           | YES  |     | NULL              |                             |
| air_date     | varchar(20)           | NO   |     | NULL              |                             |
| trt          | varchar(8)            | NO   |     | NULL              |                             |
| times_played | mediumint(9) unsigned | NO   |     | 0                 |                             |
| last_played  | timestamp             | YES  |     | NULL              |                             |
| entered      | timestamp             | NO   |     | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
| id           | int(10) unsigned      | NO   | PRI | NULL              | auto_increment              |
| ep_desc      | varchar(300)          | NO   |     | NULL              |                             |
+--------------+-----------------------+------+-----+-------------------+-----------------------------+
The primary key is the id field, with the title field set as a foreign key to the shows table. The shows table looks like this:
+-------------+-------------+------+-----+------------+-------+
| Field       | Type        | Null | Key | Default    | Extra |
+-------------+-------------+------+-----+------------+-------+
| title       | varchar(50) | NO   | PRI | NULL       |       |
| title_image | varchar(50) | NO   |     | NULL       |       |
| gif_image   | varchar(50) | NO   |     | NULL       |       |
| info_url    | varchar(30) | NO   |     | shows.html |       |
+-------------+-------------+------+-----+------------+-------+
My CSV file is in the following format:
"Big Wolf On Campus","BWOC0102","Season 1 Episode 2: The Bookmobile","April 9, 1999";"21:57",NULL,NULL,NULL,NULL,"Once every 70 years, a window of transference opens that offers Tommy a chance to pass his curse to another person. Merton volunteers but that same day a bookmobile shows up in Pleasantville and people start disappearing."
"Big Wolf On Campus","BWOC0103","Season 1 Episode 3: Butch Comes To Shove","April 16, 1999","21:06",NULL,NULL,NULL,NULL,"When a character from a 1950s educational film gets sick of the rules he decides to leave the movie for Pleasantville. While there Butch decides to find someone to bring back to his black-and-white world - and Stacey is at the top of his list."
During the import, I want the data in the CSV added to the existing data in the table. I also want the last_played field set to NULL (only updated when the show plays), the entered field set with a current timestamp, and the id field auto_incremented with the next value for the table.
Here is my import statement:
LOAD DATA INFILE 'ytv.csv' INTO TABLE episodes
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
The resulting error message:
ERROR 1406 (22001): Data too long for column 'air_date' at row 1
What am I doing wrong here? It seems like the data is getting shifted over one column when it's importing (such that ep_info from the CSV is going into the air_date column) but I can't figure out why. Any insight would be much appreciated for this MySQL novice.
It seems you have some new episodes with no matching entry in the shows table. You can create a new table like episodes, remove any constraints, load the data into the new table, insert all missing show titles into your shows table, and then insert the episodes from the new table into the episodes table; see the sketch below.
Or you can drop the foreign key, load the data, amend your shows table, and then add the foreign key back.
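A hedged sketch of that staging-table approach, reusing the names from the question (note that CREATE TABLE ... LIKE copies columns and indexes but not foreign keys, so the staging copy is already unconstrained):
-- stage the CSV in a constraint-free copy of episodes
CREATE TABLE episodes_stage LIKE episodes;

LOAD DATA INFILE 'ytv.csv' INTO TABLE episodes_stage
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- add any show titles missing from the parent table; title_image and
-- gif_image are NOT NULL with no default, so placeholders go in for now
INSERT INTO shows (title, title_image, gif_image)
SELECT DISTINCT e.title, '', ''
FROM episodes_stage e
LEFT JOIN shows s ON s.title = e.title
WHERE s.title IS NULL;

-- with the parent rows in place, the foreign key on episodes.title is satisfied
INSERT INTO episodes SELECT * FROM episodes_stage;
DROP TABLE episodes_stage;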