phpMyAdmin - manually import csv and skip columns - mysql

In phpMyAdmin, on the Import tab, how do I specify which columns of the CSV must be skipped by the import?
For example, I have this CSV:
col1 col2 col3 col4 col5
a b c x 0
1 2 3 y 1
I need to manually import that CSV, skipping the 4th column.
With the "CSV using LOAD DATA" format I can specify the Column names; in that field I tried the following, but it did not work:
col1, col2, col3, @dummy, col5
Invalid column (@dummy) specified! Ensure that columns names are spelled correctly, separated by commas, and not enclosed in quotes.
and
col1, col2, col3, , col5
SQL query:
LOAD DATA LOCAL INFILE '/tmp/phpBlaBlaBla'
INTO TABLE `tblName` FIELDS TERMINATED BY ';' ENCLOSED BY '"' ESCAPED BY '\' LINES TERMINATED BY '\n' IGNORE 1 LINES
(`col1` , `col2` , `col3` , , `col5`)
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near `col5`)' at line 1
The phpMyAdmin version is 4.0.10.16.
Thanks in advance!
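For reference, outside the Import tab the same skip can be done by running the statement yourself (for example from the SQL tab), reading the 4th CSV column into a user variable that is never assigned to a table column, the technique from the MySQL manual quoted further below. A minimal sketch, with a placeholder file path and the table name and delimiters from the generated query above:
-- Sketch: the 4th CSV column is read into @dummy and never assigned to a table column.
-- The file path is a placeholder.
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE `tblName`
FIELDS TERMINATED BY ';' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(`col1`, `col2`, `col3`, @dummy, `col5`);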

Related

Warning 1261 Row n doesn't contain data for all columns

I have a requirement to load data from a CSV file into a MySQL table on Windows 10.
The command I am using in MySQL Workbench 8.0.20 is:
LOAD DATA INFILE "C:/ProgramData/MySQL/MySQL Server 5.7/Uploads/Sample.txt" IGNORE INTO TABLE TableA
FIELDS OPTIONALLY ENCLOSED BY '"'
TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(col1, col2, col3, col4, col5, col6, col7, col8,
col9, col10, col11, col12, col13, col14, col15,
dummy_1, dummy_2, dummy_3, dummy_4, dummy_5) ;
The original input file was Sample.csv.
When I ran the LOAD command I got the error
Error 1261 Row 1 doesn't contain data for all columns.
and similar errors for rows 2 to 13.
I saved the CSV file as a text file (Sample.txt) using Notepad and got the same errors for it. It only reports errors for the first 13 rows of the file, so I deleted those 13 rows and ran the LOAD again; I still get the same warnings for rows 1 to 13 only.
FYI, each row does not have the same number of fields.
Any help in resolving this would be appreciated.
The error message is pretty clear: some of the lines in your CSV/text file do not have 20 data points. You may try using the following regex pattern to highlight the rows in the file that do not contain exactly 20 comma-separated fields:
^(?!(?:[^,]*,){19}[^,]*$).*$
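As a complement to the regex check, MySQL itself can tell you which rows are short: each offending row raises warning 1261, so running SHOW WARNINGS in the same session right after the LOAD DATA lists the affected row numbers. A sketch:
-- Immediately after the LOAD DATA ... IGNORE statement, in the same session:
SHOW COUNT(*) WARNINGS;   -- how many rows triggered a warning
SHOW WARNINGS LIMIT 20;   -- e.g. "Warning | 1261 | Row 1 doesn't contain data for all columns"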

sqlldr does not recognize empty tab delimited columns

I am seeing a strange problem when loading my data with sqlldr. Here is my table schema:
CREATE TABLE TEST(
"COL1" VARCHAR2 (255 BYTE),
"COL2" VARCHAR2 (255 BYTE),
"COL3" NUMBER,
"COL4" VARCHAR2 (255 BYTE)
and here is just one row of data I am trying to ingest from the tab delimited file test.txt:
COL1 COL2 COL3 COL4
10 17-cc
notice that the first two columns are empty (null). So my row is really:
\t\t10\t17-cc
my loader script:
load data
infile 'test.txt'
append into table TEST
fields terminated by "\t" optionally enclosed by '"'
TRAILING NULLCOLS
(COL1,COL2,COL3,COL4)
This will be loaded into my table as:
COL1 COL2 COL3 COL4
10 17-CC (null) (null)
which is incorrect. It seems that the two leading tabs in the data row were ignored and the value in the COL3 position (10) was assigned to COL1. However, if I try to import the data as a comma-separated file:
COL1,COL2,COL3,COL4
,,10,17-cc
it works as expected. Why does the tab-delimited version fail here?
NOTE - Fixed my original wrong answer.
Your TAB is defined just fine. You need the NULLIF statement:
load data
infile 'test.txt'
append into table TEST
fields terminated by "\t" optionally enclosed by '"'
TRAILING NULLCOLS
(COL1 NULLIF(COL1=BLANKS),
COL2 NULLIF(COL2=BLANKS),
COL3 NULLIF(COL3=BLANKS),
COL4 NULLIF(COL4=BLANKS)
)

Mysql Load File Error loading inaccurate data

Hi, I have a CSV dataset like:
ukwn,34,2018-10-01,"757,271"
ukwn,3,2018-10-01,"7,342"
"hi",23,2018-10-01,"3,483,887"
I want to insert it into the database, so I wrote this code:
LOAD DATA LOCAL INFILE 'data.csv' INTO TABLE app_spend_metric
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col1,col2,col3,col4)
But it fails to insert col4 (the 4th column) because there is a ',' inside the double quotes, as in "7,345".
I then tried:
LOAD DATA LOCAL INFILE 'data.csv' INTO TABLE app_spend_metric
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col1,col2,col3,col4)
But this time it inserts only partial data into col4: out of "7,344" it enters only 7.
If col4 is numeric (e.g., INT), then the problem is as follows:
Parse the line to get the string "7,344"
Strip the enclosing ": 7,344
Store the string into the INT column. This requires converting that string to a number.
Conversion stops at the first non-numeric character, namely the comma.
Result: col4 is set to 7, and ,344 is tossed.
MySQL cannot deal with "thousands-separators" in numbers. But you could strip them:
LOAD ...
(col1, col2, col3, @num)
SET col4 = REPLACE(@num, ',', '')
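Put together with the original statement from the question, the complete command would look roughly like this (a sketch keeping the question's table and column names):
-- Read the 4th field into @num, then strip the thousands separators
-- before storing it in the (numeric) col4.
LOAD DATA LOCAL INFILE 'data.csv' INTO TABLE app_spend_metric
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col1, col2, col3, @num)
SET col4 = REPLACE(@num, ',', '');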

Mysql Load Data: SET IF for all or given subset of columns?

I've got working import code for CSV files into my 8-column table here:
Load Data LOCAL InFile 'file.csv' into table myTable
Fields Terminated by ','
OPTIONALLY Enclosed by '"'
Lines Terminated by '\n'
IGNORE 1 Lines
(col1, col2, @var3, @var4, col5, col6, col7, col8)
Set
col3 = if(@var3 = '', NULL, @var3),
col4 = if(@var4 = '', NULL, @var4)
;
It works fine at changing empty entries to NULL values, but...
Is there any way to shorten the Set part so I don't have to specify a condition for each and every column?
I actually need this for 7 of the 8 columns above, and this particular table is rather small.
Is there any way to shorten the Set part
Yes, MySQL provides a shorthand function NULLIF():
col3 = NULLIF(@var3, '') -- etc
so I don't have to specify a condition for each and every column?
Sadly not, although it should be fairly trivial to generate the desired SQL dynamically in your application code.
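For completeness, the shortened statement would look something like this. A sketch only: the question does not say which seven of the eight columns need the treatment, so col1 is assumed here to be the one that does not:
-- Read the seven nullable columns into user variables and convert '' to NULL via NULLIF().
Load Data LOCAL InFile 'file.csv' into table myTable
Fields Terminated by ','
OPTIONALLY Enclosed by '"'
Lines Terminated by '\n'
IGNORE 1 Lines
(col1, @v2, @v3, @v4, @v5, @v6, @v7, @v8)
Set
col2 = NULLIF(@v2, ''),
col3 = NULLIF(@v3, ''),
col4 = NULLIF(@v4, ''),
col5 = NULLIF(@v5, ''),
col6 = NULLIF(@v6, ''),
col7 = NULLIF(@v7, ''),
col8 = NULLIF(@v8, '')
;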

How to skip columns in CSV file when importing into MySQL table using LOAD DATA INFILE?

I've got a CSV file with 11 columns and I have a MySQL table with 9 columns.
The CSV file looks like:
col1, col2, col3, col4, col5, col6, col7, col8, col9, col10, col11
and the MySQL table looks like:
col1, col2, col3, col4, col5, col6, col7, col8, col9
I need to map the columns 1-8 of CSV file directly to the first 8 columns of the MySQL table. I then need to skip the next two columns in the CSV file and then map column 11 of CSV file to column 9 of MySQL table.
At the moment I am using the following SQL command:
LOAD DATA LOCAL INFILE 'filename.csv' INTO TABLE my_table
FIELDS TERMINATED BY ','
ENCLOSED BY ''
LINES TERMINATED BY '\n'
But the above code maps the first 9 columns of CSV file to the 9 columns in the MySQL table.
From the MySQL docs:
You can also discard an input value by assigning it to a user variable and not assigning the variable to a table column:
LOAD DATA INFILE 'file.txt'
INTO TABLE t1 (column1, @dummy, column2, @dummy, column3);
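Applied to the 11-column file from the question (CSV columns 1 to 8 mapped straight through, CSV columns 9 and 10 discarded, CSV column 11 into the table's 9th column), a sketch reusing the options from the question's original statement:
-- CSV columns 9 and 10 are read into @dummy and never assigned to a table column.
LOAD DATA LOCAL INFILE 'filename.csv' INTO TABLE my_table
FIELDS TERMINATED BY ','
ENCLOSED BY ''
LINES TERMINATED BY '\n'
(col1, col2, col3, col4, col5, col6, col7, col8, @dummy, @dummy, col9);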
Step 1: deal with it using awk.
cat file.txt | awk '{print $1,$2,$5...}' > new_file.txt
Step 2: load it into MySQL.
load data local infile 'new_file.txt' into table t1(...)
The method below is simple, but it is not allowed in lower versions of MySQL.
LOAD DATA INFILE 'file.txt'
INTO TABLE t1 (column1, @dummy, column2, @dummy, column3);
@deemi:
The only way to ignore the @dummy is by setting the field's default to AUTO_INCREMENT.
So you can skip the field and just code it like this:
LOAD DATA INFILE 'file.txt'
INTO TABLE t1 (column2, column3, column4, column5);
-- assumes that the field column1 is set to AUTO_INCREMENT by default.
I think there is one more change needed in the code.
The following SQL command:
LOAD DATA LOCAL INFILE 'filename.csv' INTO TABLE my_table
FIELDS TERMINATED BY ','
ENCLOSED BY ''
LINES TERMINATED BY '\n'
will probably result in a data truncation error.
So it is better to use LINES TERMINATED BY '\r\n' instead of LINES TERMINATED BY '\n'
So the code will be:
LOAD DATA LOCAL INFILE 'filename.csv' INTO TABLE my_table
FIELDS TERMINATED BY ','
ENCLOSED BY ''
LINES TERMINATED BY '\r\n'