MySql Ñ character when importing from csv file - mysql

I'm trying to import a CSV file into a table; the names in the CSV file contain the character 'Ñ'. Once the import is done, all 'Ñ' characters are replaced by � and I get errors when I apply changes.
I could replace them one by one, but there are 247 of them.
If possible, is there any way I can import the file using a syntax where I can add something like "replace all � with Ñ"?
Edit: I'm just using MySQL Workbench.
Thanks.

Related

Can I upload characters like ö to a MYSQL database via SSH? [duplicate]

When I use the import feature of PHPMyAdmin, it doesn't import non-ASCII characters such as ä, ö, ü, õ and the rest of the word after the characters.
When I open the CSV file with Notepad it displays the non-ASCII characters normally, but when I try to import it, it doesn't work.
Entering those missing characters manually works and MySQL saves them just as it should. Any thoughts?
MySQL will do this when it encounters a character that is invalid under the current character set.
You don't mention which tool you're using to import the data, but you should be able to specify a character set when importing. If that character set matches the database's, everything will be fine. Also, make sure the file is actually encoded in that character set.
If your import tool doesn't offer the option of selecting the character set, you could try phpMyAdmin, which does.
Make sure you know what the encoding of your CSV file is; it should be UTF-8. Then, before you import, set the connection character set to UTF-8 (for example with SET NAMES utf8), and it should work fine.
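For reference, here is a minimal sketch of what specifying the character set looks like when the load is done in plain SQL rather than through an import tool; the table name example_table and the path /tmp/data.csv are placeholders, and utf8mb4 assumes the file really is saved as UTF-8:

-- make the connection itself talk UTF-8 to the server
SET NAMES utf8mb4;

-- LOCAL reads the file from the client machine; drop LOCAL if the file sits on the server
LOAD DATA LOCAL INFILE '/tmp/data.csv'
INTO TABLE example_table
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the header row, if the file has one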

PHPMYADMIN exported UTF-16 Character set contains ? character in the first field

I exported from phpMyAdmin using the UTF-16 character set. (I used this character set because other character sets truncate the text field.) The result looks like this:
"Add","33649","CIPA 49704, 5",,"<ul>
The export was saved as a CSV file. I opened it in another Excel sheet, delimited it with a comma (,), and saved the result as a CSV file. When I opened that CSV file, it looked like this:
?"Add" 33649 CIPA 49704, 5 "<ul>
Why is this ? symbol appearing? I am confused.

Keep special characters when importing into mysql from csv

I have a CSV file which contains special characters. However, when I import it into my table, the special characters do not come through. I have the column that needs to contain special characters set up as utf8_general_ci.
I am losing the special characters upon phpMyAdmin import. For example, the degree symbol is not importing. I viewed my CSV file in a text editor and it does contain the special characters.
Please help.
OK, I figured out how to do this. See my answer below.
You only need to change the charset in MySQL when you try to upload the file.
In my case the charset windows-1252 worked.
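For what it's worth, a rough SQL-level counterpart of that tool setting, with made-up table and file names: MySQL has no character set literally called windows-1252, but its latin1 is in practice the same cp1252 encoding, so it can be named in the load directly:

LOAD DATA LOCAL INFILE 'data.csv'
INTO TABLE example_table
CHARACTER SET latin1  -- MySQL's latin1 corresponds to Windows-1252
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';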
OK, for anyone who has this same issue and wants to keep the exported CSV file in UTF-8 while preserving the special characters, do the following.
Open your file in OpenOffice Calc. When you are ready to save it, save it as a Text CSV file. While exporting, it will ask which character set to use; choose UTF-8, and presto: it does not mangle your special characters the way Excel does.
Hope that helps someone else.
Excel usually outputs CSV as UTF-16, but there are ways to get it to output in UTF-8 as explained here
If you are on Windows you can also use the standard Notepad text editor to convert the file by selecting UTF-8 encoding in the Save As dialog.
Try converting the charset of your CSV file to UTF-8.
For example, with Notepad++:
Encoding -> Convert to UTF-8 without BOM
Maybe you should check the setup of the CSV file; I think your problem begins there. Check it and make sure both encodings (CSV file and SQL table) are the same.
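To verify that the table side really matches, you can inspect the table definition and, if needed, convert it; a minimal sketch, assuming a placeholder table name example_table:

-- shows the table's default charset and each column's charset/collation
SHOW CREATE TABLE example_table;

-- converts existing columns and data to UTF-8 (rewrites the table, so back it up first)
ALTER TABLE example_table CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci;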

MySQL LOAD DATA INFILE with fields terminated by non-ASCII character

I have a lowercase thorn separated file that I need to load into a MySQL database (5.1.54) using the LOAD DATA INFILE ... query.
The file I'm trying to load is located on the same server as the MySQL database, and I'm issuing the query from a Windows machine using SQLYog, which uses the MySQL C client library.
I'm having some major issues. I've tried the FIELDS TERMINATED BY 0x00FE syntax with all the variations of the thorn character I can think of, and I've tried changing the character set of the connection (SET NAMES ...), but I consistently get the warning...
Warning Code : 1638
Non-ASCII separator arguments are not fully supported
...and all the data loads into the first column.
Is there any way around this at all? Or am I resigned to pre-processing the file with sed to replace all the thorns with a more sensible character before loading?
I succeeded in loading this data with the Data Import tool (CSV format) in dbForge Studio for MySQL. I just set 'Þ' as a custom delimiter. Import from the CSV format is fully supported in the free Express Edition.
I decided to fix the file by replacing the non-ASCII character with a character that MySQL's LOAD DATA INFILE ... would understand.
Use od to get the octal byte value of the offending character - od -b file.log - in this case it's 376.
Use grep to make sure the character you want to replace it with doesn't already exist in the file - grep -n '|' file.log.
Use sed and printf to replace the non-ASCII character - sed -i 's/'$(printf '\376')'/|/g' file.log.
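Once the thorns have been rewritten to '|', the load itself is an ordinary single-byte-separator import; a minimal sketch, with example_table standing in for the real table name:

-- file.log now uses '|' as the field separator
LOAD DATA INFILE 'file.log'
INTO TABLE example_table
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n';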