Migration from SQL Server to MySQL Server DB with HTML code - mysql

For days I have been trying to export from my SQL Server table and import into a MySQL table.
I can't solve the problem with HTML mails in one field of the table, which contains everything HTML code can have, such as \r\n line breaks, quotation marks, maybe even the | pipe sign.
I tried exporting a concatenated string from SQL Server, such as 'Insert Into MYSQL_table (field1, field2, ...)
I tried CSV files with the terminal command:
LOAD DATA LOCAL INFILE 'G:/Test2.csv'
INTO TABLE insectum.tblolnachrichten
CHARACTER SET utf8mb4
FIELDS TERMINATED BY '|##|'
ENCLOSED BY ''
ESCAPED BY '\n'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
I tried Workbench and PHP with CSV files; I think I have tried everything.
But every attempt fails due to some other occurrence of a special character in the HTML code in this field.
There are about 5000 rows to be transferred into the MySQL table, more than 100 MB as a CSV file.
I even tried a field separator like |##|.
The content of this one field is wrapped like this:
|##|myHTML-field|##|
That did not work either.
Any idea how I could tell MySQL at import time to keep the content of a field intact and not break anywhere?
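A direction that usually handles fields like this is to let the export enclose the HTML column in quotes and tell LOAD DATA about the enclosure. A minimal sketch, not from the thread, assuming the SQL Server export writes a standard CSV with the HTML field double-quoted and embedded quotes backslash-escaped (file, table and character set taken from the attempt above):
-- enclosure-based import: quoted fields may contain the delimiter and line breaks
LOAD DATA LOCAL INFILE 'G:/Test2.csv'
INTO TABLE insectum.tblolnachrichten
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    ESCAPED BY '\\'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
With the field enclosed, embedded line breaks, commas and pipe signs should no longer be read as field or record boundaries, so the whole HTML body ends up in one column.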

Well, as no one had an answer for me, I did it the boring but obviously easiest way:
I linked the SQL Server and MySQL tables into an empty MS Access database and copied from one to the other, taking about 300 rows per copy.
It worked, and as I only have to do it ONE time, that is OK.

Related

Has anyone ever encountered additional unknown characters appended to your column entries after importing from a .csv file?

I've filled my instrument table using LOAD DATA INFILE. It fills the rows successfully, but then the output doesn't enclose the final column (status) with a vertical line. I didn't think this was an issue until I ran a query to count the entries where status = "commissioning".
SELECT COUNT(*)
FROM instrument
WHERE status = 'commissioning';
All 60 rows contain "commissioning" so it should return 60, but instead it returns 0?
I retried the query with a wildcard search and got the right result (you can also see the table is not enclosed).
Perhaps something went wrong when I imported from the CSV file, because a LENGTH(status) query returns 14 when "commissioning" is only 13 characters. Has anyone encountered this before, or does anyone know what character could be causing this?
Here's the CSV import code for further clarity; it worked fine with my other tables.
The problem you are having occurs because Windows uses '\r\n' line endings instead of '\n'. As you are telling the import statement that lines end with '\n', you get an extra '\r' character at the end of the last field of every line. You need to change your import statement to:
LOAD DATA INFILE 'instruments.csv'
INTO TABLE instruments
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;
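If the 60 rows are already loaded, a quick way to confirm and then strip the stray carriage return (a sketch using the table and column from the question):
-- a trailing 0D in the hex output is the '\r' left over from the Windows line ending
SELECT status, HEX(status), LENGTH(status) FROM instrument LIMIT 1;
-- remove it from the rows that were already imported
UPDATE instrument SET status = TRIM(TRAILING '\r' FROM status);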

Error loading MySQL/MariaDB Inline Data With Unescaped Double Quotes In Fields

I am having essentially the same problem as described here but the issue was left unresolved in that question.
I am trying to import a series of data files totaling about 100 million records into a MariaDB database. I've run into issues with some lines in the import file that look like:
"GAYATRI INC DBA "WHIPIN"","1950","S I","","AUSTIN","TX","78704","5124425337","B","93"
which I was trying to load with a statement like:
LOAD DATA INFILE 'testline.txt'
INTO TABLE data
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(@name,@housenum,@street,@aptnum,@city,@state,@zip,@phone,@business,@year)
SET name=@name, housenum=@housenum, street=@street, aptnum=@aptnum, city=@city, state=@state, zip=@zip, phone=@phone, business=@business, year=@year;
but am receiving errors because the first field contains unescaped double quotes in the text of the field. That seems to be OK in and of itself, as the database seems smart enough to handle it in most situations. However, because the field ends with a double quote in the text plus a double quote to close the field, the parser assumes the first double quote is escaping the second (per RFC 4180) and therefore does not terminate the field, even though the next character is a comma.
The source files can't be created any differently, as they are exports from old software which I do not control. Obviously, searching through 100 million records and changing entries like this by hand is not feasible. I'm unsure whether any fields might contain commas, though it's probably safe to assume some do in this quantity of records, so programmatically forcing fields to break at commas is probably out too.
Any ideas on how to get them to import correctly?
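One possible direction (a sketch, not an answer from the thread): read each whole line into a user variable and split it on the three-character sequence "," yourself, so a doubled quote inside a field no longer confuses the enclosure handling. This assumes no field content contains the "," sequence or a tab character, and it trims the outer quotes of the first and last fields:
-- read the whole line as one value, then split on the "," separator
LOAD DATA INFILE 'testline.txt'
INTO TABLE data
LINES TERMINATED BY '\r\n'
(@line)
SET name     = TRIM(LEADING '"' FROM SUBSTRING_INDEX(@line, '","', 1)),
    housenum = SUBSTRING_INDEX(SUBSTRING_INDEX(@line, '","', 2), '","', -1),
    street   = SUBSTRING_INDEX(SUBSTRING_INDEX(@line, '","', 3), '","', -1),
    aptnum   = SUBSTRING_INDEX(SUBSTRING_INDEX(@line, '","', 4), '","', -1),
    city     = SUBSTRING_INDEX(SUBSTRING_INDEX(@line, '","', 5), '","', -1),
    state    = SUBSTRING_INDEX(SUBSTRING_INDEX(@line, '","', 6), '","', -1),
    zip      = SUBSTRING_INDEX(SUBSTRING_INDEX(@line, '","', 7), '","', -1),
    phone    = SUBSTRING_INDEX(SUBSTRING_INDEX(@line, '","', 8), '","', -1),
    business = SUBSTRING_INDEX(SUBSTRING_INDEX(@line, '","', 9), '","', -1),
    year     = TRIM(TRAILING '"' FROM SUBSTRING_INDEX(@line, '","', -1));
The trade-off is that every field arrives as raw text, so numeric columns rely on MySQL's implicit conversion, and any stray "," inside a field would still break the split.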

Row does not contain data for all columns

I'm trying to import a text file containing:
http://pastebin.com/qhzrq3M7
into my database using the command:
Load data local infile 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO Table jobs
fields terminated by '\t';
But I keep getting the error "Row 1-13 doesn't contain data for all columns".
Make sure the last field of each row ends with \t. Alternatively, use LINES TERMINATED BY
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt' INTO TABLE jobs COLUMNS TERMINATED BY '\t' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r';
\r is a carriage return character, similar to the newline character (i.e. \n)
I faced the same issue. Here is how I fixed it:
I opened the CSV file in Notepad++ (a text editor) and saw a blank line at the end of the file, so I deleted it.
That resolved my issue.
The URL below can also help you resolve the issue:
http://www.thoughtspot.com/blog/5-magic-fixes-most-common-csv-file-problems
If you're on Windows, make sure to use LINES TERMINATED BY '\r\n', as explained in the MariaDB docs.
Sounds like LOAD DATA LOCAL INFILE expects to see a value for each column.
You can edit the file by hand (to delete those rows; they could be blank lines), or you can create a temp table, load the rows into a single column, and write a MySQL command to split the rows on tab and insert the values into the target table, as sketched below.
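A minimal sketch of that staging approach; the jobs columns (title, location) are invented for illustration, and '@@@' is assumed never to occur in the file:
-- staging table that keeps each raw line in a single column
CREATE TEMPORARY TABLE jobs_raw (line TEXT);

-- a field terminator that never occurs, so tabs stay inside the line
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO TABLE jobs_raw
FIELDS TERMINATED BY '@@@'
LINES TERMINATED BY '\r\n'
(line);

-- split each non-blank line on tab and insert into the real table
INSERT INTO jobs (title, location)
SELECT SUBSTRING_INDEX(line, '\t', 1),
       IF(INSTR(line, '\t') = 0, NULL,
          SUBSTRING_INDEX(SUBSTRING_INDEX(line, '\t', 2), '\t', -1))
FROM jobs_raw
WHERE line <> '';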
Make sure there are no backslashes ("\") at the end of any field. In the CSV viewed as text this would look like "\,", which is obviously a no-no, since that comma will be treated as escaped and ignored, so you won't have enough columns.
(This primarily applies when you don't have field enclosures like quotes around each field.)

How can I load 10,000 rows of test.xls file into mysql db table?

How can I load 10,000 rows of test.xls file into mysql db table?
When I use the query below, it shows this error.
LOAD DATA INFILE 'd:/test.xls' INTO TABLE karmaasolutions.tbl_candidatedetail (candidate_firstname,candidate_lastname);
My primary key is candidateid and has the properties below.
The test.xls contains data like the sample below.
I have added rows starting from candidateid 61 because up to 60 there are already candidates in the table.
Please suggest solutions.
Export your Excel spreadsheet to CSV format.
Import the CSV file into MySQL using a command similar to the one you are currently trying:
LOAD DATA INFILE 'd:/test.csv'
INTO TABLE karmaasolutions.tbl_candidatedetail
(candidate_firstname,candidate_lastname);
Importing data from Excel (or any other program that can produce a text file) is very simple using the LOAD DATA command from the MySQL command prompt.
Save your Excel data as a CSV file (in Excel 2007, using Save As). Check the saved file in a text editor such as Notepad to see what it actually looks like, i.e. what delimiter was used etc. Start the MySQL command line client (I'm lazy, so I usually do this from the MySQL Query Browser via Tools > MySQL Command Line Client to avoid having to enter the username and password etc.). Enter this command:
LOAD DATA LOCAL INFILE 'C:/temp/yourfile.csv'
INTO TABLE database.table
FIELDS TERMINATED BY ';' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(field1, field2);
[Edit: Make sure to check your single quotes (') and double quotes (") if you copy and paste this code; it seems WordPress changes them into similar but different characters.] Done! Very quick and simple once you know it :)
Some notes from my own import (may not apply to you if you run a different language version, MySQL version, Excel version etc.):
TERMINATED BY: this is why I included step 2. I thought a CSV would default to comma separated, but at least in my case semicolon was the default.
ENCLOSED BY: my data was not enclosed by anything, so I left this as the empty string ''.
LINES TERMINATED BY: at first I tried with only '\n' but had to add the '\r' to get rid of a carriage return character being imported into the database.
Also make sure that if you do not import into the primary key field/column, it has auto increment on; otherwise only the first row will be imported.
Original Author reference
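A small sketch of that last point for the table in this question; the INT definition for candidateid is an assumption, so match it to the real column definition:
-- let MySQL fill in candidateid when the import only supplies the name columns
ALTER TABLE karmaasolutions.tbl_candidatedetail
    MODIFY candidateid INT NOT NULL AUTO_INCREMENT;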

Text import error

Good Day
I have created a bat file to import a text file to my MySQL database and it looks as follows:
sqlcmd /user root /pass password /db "MyDB" /command "LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE TG_Orders FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'"
My problem is that I cannot get the "Treat consecutive delimiters as one" to work...
How would I add that?
Now that we have actually got to the real crux of the problem, this is not a consecutive delimiter problem - it's a CSV file format problem.
If your CSV file contains fields like B121,535 and they are not enclosed within quote marks of some kind and your delimiter is a comma, then no amount of SQL jiggery-pokery will sort out your problem. Unquoted fields containing commas like this will always be interpreted as two separate fields unless enclosed within quote marks.
Post a sample line from the CSV file which is causing problems and we can diagnose further. Failing that, export the data from the initial system again, making sure that the formatting is correct (either enclose everything in speech marks, or just the string fields).
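For example (illustrative values, not taken from the question's file):
1,B121,535,widget
1,"B121,535",widget
The first line is read as four fields; the second is read as three, because the quotes keep the comma inside B121,535 and match the ENCLOSED BY '"' clause in the import.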
Finally, are you sure that your database is MySQL and not Microsoft SQL Server? The only references to SQLCMD.EXE I can find all point to Microsoft sites in relation to SQL Server Express, but even then it has a different option structure (-U for user rather than /user). If this is the case, you could have saved a lot of hassle by putting the correct information in the tags. If not, then I would say that SQLCMD.EXE is a custom-written application from somewhere and the problem could stem from that. If that is the case then we can't help: if the CSV formatting is correct, you're on your own.