Good Day
I have created a .bat file to import a text file into my MySQL database, and it looks as follows:
sqlcmd /user root /pass password /db "MyDB" /command "LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE TG_Orders FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'"
My problem is that I cannot get the "Treat consecutive delimiters as one" to work...
How would I add that?
Now that we have actually got to the real crux of the problem, this is not a consecutive delimiter problem - it's a CSV file format problem.
If your CSV file contains fields like B121,535 and they are not enclosed within quote marks of some kind, and your delimiter is the comma, then no amount of SQL jiggery-pokery will sort out your problem. Unquoted fields with commas like this will always be interpreted as two separate fields unless enclosed within quote marks.
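To illustrate with a made-up line (assuming FIELDS TERMINATED BY ',' ENCLOSED BY '"'):
B121,535,Austin     -> three fields: B121 | 535 | Austin
"B121,535",Austin   -> two fields: B121,535 | Austin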
Post a sample line from the CSV file which is causing problems and we can diagnose further. Failing that, export the data from the initial system again, making sure that the formatting is correct (either enclose everything in quote marks, or just the string fields).
Finally, are you sure that your database is MySQL-based and not Microsoft SQL Server? The only references to SQLCMD.EXE I can find all point to Microsoft sites in relation to SQL Server Express, but even then it has a different option structure (-U for user rather than /user). If that is the case, you could have saved a lot of hassle by using the correct information tags. If not, then I would say that SQLCMD.EXE is a custom-written application from somewhere and the problem could stem from that - in which case, if the CSV formatting is correct, we can't help and you're on your own.
Related
For days I have been trying to export from my SQL Server table and import into a MySQL table.
I can't solve the problem with HTML mails in one field of the table, which contains everything HTML code can have, such as \r\n line breaks, quotation marks, maybe even the | pipe sign.
I tried exporting a concatenated string from SQL Server, such as 'Insert Into MYSQL_table (field1, field2, ...)
I tried CSV files with the terminal command:
LOAD DATA LOCAL INFILE 'G:/Test2.csv'
INTO TABLE insectum.tblolnachrichten
CHARACTER SET utf8mb4
FIELDS TERMINATED BY '|##|'
ENCLOSED BY ''
ESCAPED BY '\n'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
I tried Workbench, PHP with CSV files - I think everything.
But everywhere I fail due to some other character occurring in the HTML code in this field.
There are about 5000 rows to be transferred into the MySQL table, more than 100 MB as a CSV file.
I even tried a field separator like |##|.
The content of this one field is wrapped like this:
|##|myHTML-field|##|
That did not work either.
Any idea what I could do to tell MySQL at import time to keep the content of the field intact and not break it anywhere?
Well, as no one had an answer for me, I did it the boring but obviously easiest way:
I linked the SQL Server and MySQL tables into an empty MS Access database and copied from one to the other, about 300 rows per copy.
It worked, and as I only have to do it ONE time, it is OK.
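A technique that sometimes sidesteps delimiter and quoting problems like this entirely is to hex-encode the HTML column on export and decode it during the load via a user variable. This is only a rough sketch - the column names id and htmlbody are placeholders, not the real schema:
-- export the HTML column as hex on the SQL Server side, e.g.
--   CONVERT(VARCHAR(MAX), CAST(htmlbody AS VARBINARY(MAX)), 2)
LOAD DATA LOCAL INFILE 'G:/Test2.csv'
INTO TABLE insectum.tblolnachrichten
CHARACTER SET utf8mb4
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(id, @htmlhex)
SET htmlbody = CONVERT(UNHEX(@htmlhex) USING utf8mb4);
Since the hex text contains only 0-9 and A-F, no delimiter, quote mark, or line break inside the HTML can interfere with the CSV parsing.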
I am having essentially the same problem as described here but the issue was left unresolved in that question.
I am trying to import a series of data files totaling about 100 million records into a MariaDB database. I've run into issues with some lines in the import file that look like:
"GAYATRI INC DBA "WHIPIN"","1950","S I","","AUSTIN","TX","78704","5124425337","B","93"
which I was trying to load with a statement like:
LOAD DATA INFILE 'testline.txt'
INTO TABLE data
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(@name,@housenum,@street,@aptnum,@city,@state,@zip,@phone,@business,@year)
SET name=@name, housenum=@housenum, street=@street, aptnum=@aptnum, city=@city, state=@state, zip=@zip, phone=@phone, business=@business, year=@year;
but am receiving errors because the first field contains unescaped double quotes in the text of the field. That seems to be OK in and of itself, as the database seems smart enough to handle it in most situations. However, because the field ends with a double quote in the text plus a double quote to close the field, it assumes the first double quote is escaping the second (following RFC 4180) and thus does not terminate the field, even though the next character is a comma.
The source files can't be created any differently as they are exports from old software which I do not control. Obviously searching through 100 million records and changing entries like this by hand is not feasible. I'm unsure of whether any fields might contain commas though it's probably safe to assume they do in this quantity of records so programmatically forcing fields to break at commas is probably out too.
Any ideas on how to get them to import correctly?
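One workaround I have seen suggested, which I have not verified against the full data set, is to stop treating the quote as an enclosure character and instead treat the literal "," sequence as the field separator. It assumes every field is quoted and no field contains "," inside its text:
LOAD DATA INFILE 'testline.txt'
INTO TABLE data
FIELDS TERMINATED BY '","' ENCLOSED BY ''
LINES STARTING BY '"' TERMINATED BY '"\r\n'
(@name,@housenum,@street,@aptnum,@city,@state,@zip,@phone,@business,@year)
SET name=@name, housenum=@housenum, street=@street, aptnum=@aptnum, city=@city, state=@state, zip=@zip, phone=@phone, business=@business, year=@year;
Any doubled quotes left inside a field could then be collapsed in the SET clause, e.g. name = REPLACE(@name, '""', '"').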
How can I load 10,000 rows from a test.xls file into a MySQL db table?
When I use the query below, it shows this error.
LOAD DATA INFILE 'd:/test.xls' INTO TABLE karmaasolutions.tbl_candidatedetail (candidate_firstname,candidate_lastname);
My primary key is candidateid and has the properties below.
The test.xls file contains data like below.
I have added rows starting from candidateid 61 because up to 60 there are already candidates in the table.
Please suggest a solution.
Export your Excel spreadsheet to CSV format.
Import the CSV file into MySQL using a command similar to the one you are currently trying:
LOAD DATA INFILE 'd:/test.csv'
INTO TABLE karmaasolutions.tbl_candidatedetail
(candidate_firstname,candidate_lastname);
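If the exported CSV turns out to be comma-separated, quote-enclosed, and to have a header row (check the file in a text editor first - those details are assumptions here), a slightly fuller version would be:
LOAD DATA INFILE 'd:/test.csv'
INTO TABLE karmaasolutions.tbl_candidatedetail
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(candidate_firstname, candidate_lastname);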
Importing data from Excel (or any other program that can produce a text file) is very simple using the LOAD DATA command from the MySQL command prompt.
1. Save your Excel data as a CSV file (in Excel 2007 use Save As).
2. Check the saved file using a text editor such as Notepad to see what it actually looks like, i.e. what delimiter was used etc.
3. Start the MySQL command prompt (I'm lazy so I usually do this from the MySQL Query Browser - Tools - MySQL Command Line Client to avoid having to enter the username and password etc.).
4. Enter this command:
LOAD DATA LOCAL INFILE 'C:\\temp\\yourfile.csv' INTO TABLE database.table
FIELDS TERMINATED BY ';' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n' (field1, field2);
(If you copy and paste this command from a web page, check that the single quotes (') and double quotes (") are plain characters - some sites replace them with similar-looking ones.)
5. Done! Very quick and simple once you know it :)
Some notes from my own import - may not apply to you if you run a different language version, MySQL version, Excel version etc.
TERMINATED BY - this is why I included step 2. I thought a CSV would default to comma-separated, but at least in my case semicolon was the default.
ENCLOSED BY - my data was not enclosed by anything, so I left this as the empty string ''.
LINES TERMINATED BY - at first I tried with only '\n' but had to add the '\r' to get rid of a carriage return character being imported into the database.
Also make sure that if you do not import into the primary key field/column, it has auto increment on; otherwise only the first row will be imported.
Original Author reference
I'm trying to import a large CSV file with 27797 rows into MySQL. Here is my code:
load data local infile 'foo.csv' into table bar fields terminated by ',' enclosed by '"' lines terminated by '\n' ignore 1 lines;
It works fine. However, some rows of this file contain backslashes (\), for example:
"40395383771234304","40393156566585344","84996340","","","2011-02-23 12:59:44 +0000","引力波宇宙广播系统零号控制站","#woiu 太好了"
"40395151830421504","40392270645563392","23063222","","","2011-02-23 12:58:49 +0000","引力波宇宙广播系统零号控制站","#wx0 确切地讲安全电压是\""不高于36V\""而不是\""36V\"", 呵呵. 话说要如何才能测它的电压呢?"
"40391869477158912","40390512645124096","23063222","","","2011-02-23 12:45:46 +0000","引力波宇宙广播系统零号控制站","#wx0 这是别人的测量结果, 我没验证过. 不过麻麻的感觉的确是存在的, 而且用适配器充电时麻感比用电脑的前置USB接口充电高"
"15637769883","15637418359","35192559","","","2010-06-07 15:44:15 +0000","强互作用力宇宙探测器","#Hc95 那就不是DOS程序啦,只是个命令行程序,就像Android里的adb.exe。$ adb push d:\hc95.tar.gz /tmp/ $ adb pull /system/hc95/eyes d:\re\"
After importing, lines with backslashes will be broken.
How could I fix it? Should I use sed or awk to substitute all \ with \\ (across 27797 rows...)? Or can this be fixed by just modifying the SQL query?
This is a bit more of a discussion than a direct answer. Do you need the double quotes in the middle of the values in the final data (in the DB)? The fact that you have a large amount of data to munge doesn't present any problems at all.
The "" thing is what Oracle does for quotes inside strings. I think whatever built that file attempted to escape the quote sequence. This is the string manual for MySQL. Either of these is valid:
select "hel""lo", "\"hello";
I would tend to do the editing separately from the import, so it is easier/faster to see whether things worked. If your text file is less than 10MB, it shouldn't take more than a minute to update it via sed.
sed -e 's/\\//g' foo.csv
From your comments, you can set the escape char to be something other than '\'.
ESCAPED BY 'char'
This means the loader will add the values verbatim. If it gets too complicated, base64-encoding the data before you insert it will stop any tools from breaking the UTF-8 sequences.
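As a minimal sketch based on the statement in the question, setting the escape character to the empty string makes the loader keep the backslashes in the data as-is:
load data local infile 'foo.csv' into table bar
fields terminated by ',' enclosed by '"' escaped by ''
lines terminated by '\n'
ignore 1 lines;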
What I did in a similar situation was to build the LOAD DATA statement as a Java string first in a test application, then compile the test class and fix any errors I found.
For example:
String me = "LOAD DATA LOCAL INFILE 'X:/access.log/' REPLACE INTO TABLE `logrecords`\n" +
            "FIELDS TERMINATED BY '|'\n" +
            "ENCLOSED BY '\"'\n" +
            "ESCAPED BY '\\\\'\n" +
            "LINES TERMINATED BY '\\r\\n' (\n" +
            "`startDate`,\n" +
            "`IP`,\n" +
            "`request`,\n" +
            "`threshold`,\n" +
            "`useragent`\n" +
            ")";
System.out.println(me);
I have an Excel file that I need to get into CSV. I export it fine, but when I go to import it into a MySQL db via phpMyAdmin I get "Invalid field count in CSV input on line 1."
The problem seems to be that the fields are not enclosed by double quotes. I just migrated to MS Excel 2007 and am not sure how to manipulate the CSV save options so that there are double quotes around the fields, so my DB doesn't throw a conniption when I try to import.
Any suggestions? I'm fairly new at going from Excel to CSV but have gotten it to work previously.
Thanks
This worked for me after exporting from Excel as CSV and defining the various options:
load data infile '/tmp/tc_t.csv'
into table new_test_categories
fields terminated by ','
enclosed by '"'
lines terminated by '\n'
ignore 1 lines
(id,category_name,type_id,home_collection,seo_tags,status_id);
I ran this at the mysql prompt.
There should be an MS-DOS format of CSV in your export drop down. Pick that one.
There should be an option in the Save As advanced properties or something, but if not, you could always change the delimiter character to : or ; or | and then write a quick Perl script to convert it to a quote-and-comma file.
Or you could just try a tab-separated values file instead; I think phpMyAdmin will read TSVs as well.
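And if you end up importing the tab-separated file from the MySQL prompt rather than through phpMyAdmin, the statement is just as simple, since tab is the field terminator (file and table names here are placeholders):
LOAD DATA LOCAL INFILE '/path/to/yourfile.tsv'
INTO TABLE your_table
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;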