MySQL is adding characters to my data

I have a table in MySQL and I am adding data to it from a CSV file.
My code is:
LOAD DATA LOCAL INFILE 'C:/myaddress/file.csv'
INTO TABLE db.mytable
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(`Currency`,`field2`,`field3`);
This loads fine except for the first row I add.
I'm adding USD, but when I query the table the first value comes back with extra characters in front of the USD.
This only happens for the first row. Does anybody know why this happens?
Solution:
This is an encoding issue. To solve it, here are two options:
1. Encode the file differently
2. Add a dummy line and use IGNORE 1 LINES (see the sketch at the end of this answer)

I think this is related to your CSV: it has been encoded as UTF-8 with a BOM, but is being read as ISO-8859-1 (Spanish?).
If you are using an editor like Notepad++, open your CSV, select Encoding -> UTF-8 without BOM from the top menu, save, and try again.
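For option 2, a minimal sketch: the only change to the original statement is the IGNORE 1 LINES clause, which skips the dummy first line you added (the line that now carries the BOM bytes).
-- Sketch only: same file, table, and columns as in the question
LOAD DATA LOCAL INFILE 'C:/myaddress/file.csv'
INTO TABLE db.mytable
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(`Currency`,`field2`,`field3`);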

Related

Is there a way to import a CSV comma delimited file into mysql?

Hi there, I am new to web development.
I am trying to import a CSV file into MySQL Workbench using the 'Table Data Import Wizard'. However, I have read that my file needs to be a CSV (MS-DOS), or I get the following error: "Can't analyze file. Please try to change encoding type. If that doesn't help, maybe the file is not: csv, or the file is empty."
I cannot use a CSV (MS-DOS) as my data contains a lot of different special characters including those from Nordic Europe. When I convert my CSV (comma delimited) to CSV (MS-DOS) the special characters are no longer the same.
Is there a way to import a CSV comma delimited file into mysql workbench? Or is there a better solution to getting my data into the table such as keeping the special characters the same in the MS-DOS file somehow?
You can import regular CSVs without an issue; just make sure the encoding matches.
Something like
LOAD DATA
INFILE 'yourfile.csv'
INTO TABLE tablename
FIELDS
TERMINATED BY ','
ENCLOSED BY '"'
LINES
TERMINATED BY '\n'
IGNORE 1 LINES;
should work. If your CSV doesn't have headers, remove the IGNORE 1 LINES line from the code. If your formatting is different, change the enclosing and terminating characters accordingly.
You can look up the exact syntax in the manual.
Your CSV should work fine. You just need LOAD DATA INFILE.
You will likely need to define these settings, though:
LOAD DATA INFILE 'c:/tmp/discounts.csv'
INTO TABLE discounts
-- comma separated? maybe pipes '|'?
FIELDS TERMINATED BY ','
-- what surrounds input and is it optional? then add OPTIONALLY before ENCLOSED
ENCLOSED BY '"'
-- what is at the end of each line?
LINES TERMINATED BY '\n'
-- how many header rows are there, if any?
IGNORE 1 ROWS;

How to correct strange characters when importing a '.csv' file in MySQL Query Browser?

I am facing a big problem correcting strange Chinese characters when importing a '.csv' file in MySQL Query Browser. When I import the CSV data, it shows up like the picture below:
The database has already been changed to UTF-8 format, but it still shows me the strange Chinese characters:
[screenshot of the garbled characters]
My SQL query is like the below:
LOAD DATA LOCAL INFILE
'c:/2019/countries20.csv'
INTO TABLE countries
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id,name,country_code,language);
My CSV file info is like the picture below; I have changed the CSV file encoding to UTF-8:
[screenshot of the CSV file encoding]
Can anyone help me find which part I got wrong? Thanks.
Include in the LOAD DATA: CHARACTER SET utf8mb4.
Declare the target column to be CHARACTER SET utf8mb4.
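Put together, a minimal sketch of both suggestions (the VARCHAR length below is a placeholder; adjust it to your schema):
-- Re-declare the text column with a utf8mb4 character set
ALTER TABLE countries
MODIFY name VARCHAR(255) CHARACTER SET utf8mb4;
-- Tell LOAD DATA what encoding the file uses
LOAD DATA LOCAL INFILE 'c:/2019/countries20.csv'
INTO TABLE countries
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id,name,country_code,language);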

How to remove line break from LOAD DATA LOCAL INFILE query?

I am using this query to import data from a txt file into my table:
LOAD DATA LOCAL INFILE '~/Desktop/data.txt' INTO TABLE codes LINES TERMINATED BY '\n' (code)
This is working fine. But when I take a look at the "code" field, every entry has a line break at its end. Is there a way to get rid of this?
The LOAD DATA INFILE command is not really suitable for data cleansing, but you may get lucky. First of all, determine exactly which characters make up those 'line breaks'.
It is possible that the text file uses Windows-style line breaks (\r\n). In this case use LINES TERMINATED BY '\r\n'. If the line breaks consist of different characters but are consistent across all lines, then include those in the LINES TERMINATED BY clause.
If the line break characters are inconsistent, then you may have to create a stored procedure or use an external programming language to cleanse your data.
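A sketch of the consistent-endings case, plus an alternative the answer above does not mention: load each line into a user variable and trim a stray \r on the way in.
-- If the endings are consistently Windows-style, just match them
LOAD DATA LOCAL INFILE '~/Desktop/data.txt'
INTO TABLE codes
LINES TERMINATED BY '\r\n'
(code);
-- Otherwise, load into a user variable and strip a trailing \r if present
LOAD DATA LOCAL INFILE '~/Desktop/data.txt'
INTO TABLE codes
LINES TERMINATED BY '\n'
(@code)
SET code = TRIM(TRAILING '\r' FROM @code);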

Row does not contain data for all columns

I'm trying to import a text file containing:
http://pastebin.com/qhzrq3M7
Into my database using the command
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO TABLE jobs
FIELDS TERMINATED BY '\t';
But I keep getting the error "Row 1-13 doesn't contain data for all columns".
Make sure the last field of each row ends with \t. Alternatively, use LINES TERMINATED BY
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt' INTO TABLE jobs COLUMNS TERMINATED BY '\t' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r';
\r is a carriage return character, similar to the newline character (i.e. \n)
I faced the same issue. How I fixed it:
I opened the CSV file using Notepad++ (a text editor).
I saw a blank line at the end of my file and deleted it.
That resolved my issue.
The URL below can also help you resolve the issue.
http://www.thoughtspot.com/blog/5-magic-fixes-most-common-csv-file-problems
If you're on Windows, make sure to use LINES TERMINATED BY '\r\n', as explained in the MariaDB docs.
It sounds like LOAD DATA LOCAL INFILE expects to see a value for each column.
You can edit the file by hand (to delete those rows; they could be blank lines), or you can create a temp table, insert the rows into a single column, and write a MySQL statement to split the rows on tabs and insert the values into the target table, as in the sketch below.
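A sketch of that staging-table approach; the staging table and the three target columns (col1, col2, col3) are placeholders for whatever jobs actually contains.
-- Stage each raw line in a single text column
CREATE TEMPORARY TABLE jobs_raw (line TEXT);
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO TABLE jobs_raw
LINES TERMINATED BY '\n'
(line);
-- Split each line on tabs and skip blank lines (assumes three columns)
INSERT INTO jobs (col1, col2, col3)
SELECT
SUBSTRING_INDEX(line, '\t', 1),
SUBSTRING_INDEX(SUBSTRING_INDEX(line, '\t', 2), '\t', -1),
SUBSTRING_INDEX(line, '\t', -1)
FROM jobs_raw
WHERE line <> '';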
Make sure there are no "\"s at the end of any field. In the CSV viewed as text this would look like "\,", which is obviously a no-no: the backslash escapes that comma, so it will be ignored and you won't have enough columns.
(This primarily applies when you don't have field encasings like quotes around each field.)
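An aside the answer does not spell out: if the file legitimately contains backslashes, you can disable backslash escaping so a trailing '\' cannot swallow the delimiter.
-- Sketch only; uses the jobs file from the question and turns off escaping
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO TABLE jobs
FIELDS TERMINATED BY '\t' ESCAPED BY ''
LINES TERMINATED BY '\r\n';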

How to import this file generated by mysql phpmyadmin export back into the table?

I did an SQL export of a MySQL table in phpMyAdmin and was given a text file which looks like this:
"property";"367158";"4012";"5";"sold";"2013-02-06 05:40:27"
"property";"367159";"4013";"5";"sold";"2013-02-06 09:51:52"
Notice the lack of semi-colon at the end of the row.
I have hundreds of these rows like above. How do I easily import these data entries back into the table?
What you have is a delimited file. I believe this format is supported by phpMyAdmin import. Choose the delimited format, set the delimiter to ';', and do the import.
In phpMyAdmin, go to the Import tab; there you will see options like "fields separated by", etc. Mark them according to the file and click OK.
I have done something similar long ago; try something like this in the MySQL command line:
LOAD DATA LOCAL INFILE 'C:/workbench/test-file.csv' INTO TABLE properties
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
If it does not work, you can modify it according to your needs, such as the end-of-line characters, file name, file path, etc.
For more info have a look at:
http://dev.mysql.com/doc/refman/5.5/en/load-data.html