Importing CSV files into a MySQL table with one column as the name of the CSV file - mysql

I have 35 CSV files which I want to import into a MySQL table (say 'test'). I want to create one column in the 'test' table (say 'file_name') that will contain the name of the CSV file from which the data was imported. The file names are unique IDs, which is why I want the file name captured in the table.
Suppose I have CSV files like X1.csv, X2.csv, X3.csv ... X35.csv. I want a column in the 'test' table called 'file_name', so that 'test' looks something like:
col1 -> a, b, c, d
col2 -> x, y, w, z
...
...
... ....
file_name -> X1, X1, X2, X3
Note: I tried to search this question on the forum but could not find a suitable solution. Also, I am new to MySQL, so please help even if this is a trivial thing.

I'm not sure this is exactly what you are looking for, but at first sight, you should investigate the LOAD DATA INFILE statement:
LOAD DATA INFILE 'X1.csv' INTO TABLE tbl_name -- Load the content of the CSV file
FIELDS TERMINATED BY ',' ENCLOSED BY '"' -- assuming fields are separated by ',' and enclosed by '"'
LINES TERMINATED BY '\r\n' -- assuming lines end with '\r\n'
IGNORE 1 LINES -- assuming first line is a header and should be ignored
SET file_name = 'X1'; -- force the column `file_name` to be the name of the file
Please note that with such a statement each field goes into its own column of the table, and each line of the CSV data file is loaded as one row in the table. This implies that the result table will contain several rows with the same file name, one row per data line.
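For the full set of 35 files the same statement can simply be repeated, changing only the file path and the literal assigned to file_name. A minimal sketch for the second file, assuming hypothetical data columns col1 and col2 and a one-line header (neither is given in the question):
LOAD DATA INFILE 'X2.csv'
INTO TABLE test
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(col1, col2)          -- list only the columns that actually appear in the CSV
SET file_name = 'X2'; -- tag every row from this load with its source file
Repeating this for X1.csv through X35.csv (by hand or via a small script that generates the statements) fills file_name with the right ID for every row.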

Related

Multiple input file loading to single mysql table

I am kind of new here. I have been searching for 2 days with no luck, so I am posting my question here. Simply put, I need to load data into a table in MySQL. The thing is, the input data for this table will come from two different sources.
For example, below is how the 2 input files look:
Input_file1
Fields: Cust_ID1, Acct_ID1, MODIFIED, Cust_Name, Acct_name, Comp_name1, Add2, Comp_name2, Add4
Sample values: C1001, A1001, XXXXXX, JACK, TIM KFC, SINGAPORE, YUM BRAND, SINGAPORE
Input_file2
Fields: ID, MODIFIEDBY, Ref_id, Sys_id
Sample values: 3001, TONY, 4001, 5001
Sorry, I was not able to copy the data as it looks in Excel, so I improvised: the ',' separates values, 'Fields' lists the column names, and the corresponding values are under 'Sample values'.
And the table that the above data needs to be loaded into is structured as follows:
Sample_table_structure
ID
Cust_ID1
Acct_ID1
Ref_id
Sys_id
MODIFIED
MODIFIEDBY
Cust_Name
Acct_name
Comp_name1
Add2
Comp_name2
Add4
What I need to do is load the data into this table from the input files in one single go. Is this possible? As you can see, the column order does not match either, so I cannot simply append the files and load them, which is the main issue for me.
And no, changing the input sequence is not an option. The data is huge, so that would take too much effort. Any help with this would be appreciated. I would also like to know whether a shell or Perl script could be used to do this.
Thanks in advance for the help & time.
load data local infile 'c:\\temp\\file.csv'
into table table_name fields terminated by ',' LINES TERMINATED BY '\r\n' ignore 1 lines
(@col1,@col2,@col3,@col4,@col5,@col6,@col7,@col8,@col9)
set Cust_ID1 = @col1,
Acct_ID1 = @col2,
MODIFIED = @col3,
Cust_Name = @col4, ...;
load data local infile 'c:\\temp\\file2.csv'
into table table_name fields terminated by ',' LINES TERMINATED BY '\r\n' ignore 1 lines
(@col1,@col2,@col3,@col4) -- number the variables to match the columns of the second file
set ID = @col1,
MODIFIEDBY = @col2,
Ref_id = @col3,
Sys_id = @col4;
The second statement maps ID, MODIFIEDBY, Ref_id and Sys_id from the second CSV file in the same way. This way you can import each file into the table.
Note: save the Excel files in CSV format first and then import them.

Import CSV into MySQL - Offset by 1 Column

I am importing my data from a csv file like this:
LOAD DATA LOCAL INFILE "c:\myfile.csv" INTO TABLE historic FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
This ALMOST works perfectly. My table has one extra column that the CSV does not: the first one, an auto-incremented id column. Otherwise the CSV file and the MySQL table match perfectly. When I run the command above, it puts the first column from the CSV into the id field. I want it to actually go into the second.
Is there a way I can modify the above statement to either specify the columns or just offset it by 1? (skipping the first column on import).
You can optionally name the columns to be populated by LOAD DATA, and simply omit your id column:
LOAD DATA LOCAL INFILE "c:\myfile.csv" INTO TABLE historic
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'
(column2, column3, column4, column5, ...)
You can load your data by specifying the order of the columns that you're going to use in your table:
LOAD DATA LOCAL INFILE '/tmp/test.txt' INTO TABLE historic
(name_field1, name_field2, name_field3...);
Or, if you prefer, you can load it into a temporary table first and use an INSERT ... SELECT statement to load it into your final table (it's slower).
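If you go that staging-table route, a minimal sketch could look like this (the table name historic comes from the question; the column names column2 and column3 are placeholders, not real names):
CREATE TEMPORARY TABLE historic_stage (
  column2 VARCHAR(100),   -- placeholder columns matching the CSV layout
  column3 VARCHAR(100)
);
LOAD DATA LOCAL INFILE 'c:\\myfile.csv' INTO TABLE historic_stage
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
INSERT INTO historic (column2, column3)
SELECT column2, column3 FROM historic_stage;  -- id is filled in by AUTO_INCREMENT
DROP TEMPORARY TABLE historic_stage;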

leading zeros in mysql

I have a form where the user updates a table in the database with serial numbers. The problem is that in my .csv file the serial number has the value 0, but after inserting it the table shows 000000; the same happens for 1, which ends up as 000001. I need the value exactly the way it is in the .csv file. My code for the LOAD is:
LOAD DATA LOCAL INFILE 'path_to_file.csv'
INTO TABLE im_seriennummer CHARACTER SET latin1
FIELDS TERMINATED BY ";"
IGNORE 1 LINES
(sn,description_sn);
In .csv file it is like this:
0
1
And in database
000000
000001
In the database sn is varchar(16).
Is this problem familiar to anyone? Please don't tell me to change the type of the field; I need to keep it as varchar since some serial numbers look like 'MT 002'.
The solution, I think, is to use a temp table to import the CSV into:
CREATE TEMPORARY TABLE tmptab LIKE im_seriennummer;
LOAD DATA LOCAL INFILE 'path_to_file.csv'
INTO TABLE tmptab CHARACTER SET latin1
FIELDS TERMINATED BY ";"
IGNORE 1 LINES
(sn,description_sn);
UPDATE tmptab SET sn = RIGHT(CONCAT('000000', sn), 6); -- adjust the serial number format here before the final insert
INSERT INTO im_seriennummer
SELECT * FROM tmptab;
DROP TEMPORARY TABLE tmptab;

Loading a CSV file into a table using sqlloader

I have a CSV file with two columns, id_a and id_b, but I need to insert 4 more columns, i.e. emp_sal_a, emp_sal_b, emp_dept_a and emp_dept_b, using sqlldr. So my current control file looks like:
load data
infile '/home/.../employee.txt'
into table employee
fields terminated by ","
( id_a, id_b,
emp_sal_a ":id_a+1000", emp_sal_b "id_b+1000", emp_dept_a "10", emp_dept_b "20")
But I am getting the error:
invalid binding variables
From the MySQL LOAD DATA reference (note: search the page for the "(" character; it's the 35th instance of it):
User variables in the SET clause can be used in several ways. The following example uses the first input column directly for the value of t1.column1, and assigns the second input column to a user variable that is subjected to a division operation before being used for the value of t1.column2:
LOAD DATA INFILE 'file.txt'
INTO TABLE t1
(column1, @var1)
SET column2 = @var1/100;
@var1 is the name of a variable you want to run an operation on, and what you're doing is calling SET on column2 so that it equals @var1/100.
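Translated to the original employee example (assuming the MySQL LOAD DATA route the reference describes rather than SQL*Loader, with the table and column names taken from the control file), a sketch might be:
LOAD DATA INFILE '/home/.../employee.txt'
INTO TABLE employee
FIELDS TERMINATED BY ','
(@id_a, @id_b)              -- read the two CSV columns into user variables
SET id_a = @id_a,
    id_b = @id_b,
    emp_sal_a = @id_a + 1000,
    emp_sal_b = @id_b + 1000,
    emp_dept_a = 10,
    emp_dept_b = 20;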

Import CSV to MySQL With Fewer Columns Than Destination Table

Is there a way to import a CSV into MySQL when the number of columns does not match?
I have a MySQL "Employees" table which we update once a month from a CSV file. I have added two additional columns to the "Employees" table that are relevant to our needs. When importing the data through MySQL Workbench with the following SQL statement:
LOAD DATA LOCAL INFILE 'EmployeeDump-March.csv' REPLACE INTO TABLE employees
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
I receive the following error, which is absolutely correct... the column numbers don't match:
Error Code: 1261. Row 1 doesn't contain data for all columns 7.426 sec
The two extra fields have default values.
Is there a way to suppress this error and have MySQL ignore those last two columns, leaving them alone?
You can (and should!) explicitly name the columns you are inserting into using LOAD DATA INFILE, just like a regular INSERT. You can also optionally set default values for the two new columns you added as part of the LOAD DATA INFILE statement.
Something like this. Just fill in the actual column names in the right order between the parens, and set the actual defaults and column names for the new columns if you don't want NULL:
LOAD DATA LOCAL INFILE 'EmployeeDump-March.csv' REPLACE INTO TABLE employees
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(old_column1, old_column2, ...)
set new_column1 = NULL,
new_column2 = NULL;