Export unicode data from MySQL

I have some data in one of my MySQL tables stored as utf8. The data is some Japanese text, and I need to export it to Excel. Could you tell me how to do it?
Exporting with SELECT ... INTO OUTFILE returns a plain text file, and I'm not sure how to read it back into Excel so that the Japanese characters show properly.
Thanks
Nayn

Just provide CHARACTER SET charset_name when you do SELECT ... INTO OUTFILE as an export_option (see http://dev.mysql.com/doc/refman/5.0/en/select.html) and you will be fine. If you have trouble opening the file directly in Excel, have Excel import the data instead (use the all-files *.* filter in the Open dialog); the import wizard will then let you select the file encoding.

Actually the text file retains the data as is; it is just that Excel, when opening it directly, misinterprets the file's character encoding. I opened the file in Notepad and re-saved it with the encoding set to UTF-8; the next time I opened it in Excel, the characters showed properly.
Thanks
Nayn
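The Notepad re-save works because it adds a UTF-8 byte-order mark, which is what Excel looks for when it sniffs the encoding. If you post-process the exported file anyway, the same fix can be scripted; a minimal Python sketch (file names are placeholders for your own export):

```python
# Re-encode a MySQL text export as UTF-8 with a BOM ("utf-8-sig"),
# which lets Excel detect the encoding when the file is opened.
# File names below are placeholders.

def add_utf8_bom(src_path, dst_path):
    with open(src_path, encoding="utf-8") as src:
        text = src.read()
    # "utf-8-sig" prepends the EF BB BF byte-order mark on write.
    with open(dst_path, "w", encoding="utf-8-sig", newline="") as dst:
        dst.write(text)

# Demo with a stand-in for the exported file:
with open("export.txt", "w", encoding="utf-8") as f:
    f.write("日本語のテキスト\n")
add_utf8_bom("export.txt", "export_for_excel.txt")
```

Opening export_for_excel.txt in Excel should then show the Japanese text directly, without the manual Notepad step.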

Related

Can I upload characters like ö to a MYSQL database via SSH? [duplicate]

When I use the import feature of phpMyAdmin, it drops non-ASCII characters such as ä, ö, ü and õ, together with the rest of the word that follows them.
When I open the CSV file with Notepad it displays the non-ASCII characters normally, but when I'm trying to import it - it doesn't work.
Entering those missing characters manually works and MySQL saves them just as it should. Any thoughts?
MySQL will do this when it encounters a character that is invalid under the current character set.
You're not mentioning what tool you are using to import the data, but you should be able to specify a character set when importing. If that character set matches the database's, everything will be fine. Also, make sure the file is actually encoded in that character set.
If your import tool doesn't offer the option of selecting the character set, you could try phpMyAdmin which does.
Make sure you know what the encoding of your CSV file is - it should be UTF-8. Then, before you import, set the connection character set with SET NAMES utf8, and it should work fine.
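To rule out the "file is not actually in that character set" case before blaming the import tool, you can validate the file up front. A small Python sketch (the file names are made up for the demo):

```python
# Report whether a file decodes cleanly as UTF-8; if it does not,
# it was probably saved in a legacy encoding such as latin-1.

def is_utf8(path):
    try:
        with open(path, encoding="utf-8") as f:
            f.read()
        return True
    except UnicodeDecodeError:
        return False

# Demo: a file saved as latin-1 is not valid UTF-8.
with open("latin1.csv", "wb") as f:
    f.write("köök,ääri\n".encode("latin-1"))
print(is_utf8("latin1.csv"))  # False
```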

MySQL Exporting Arabic/Persian Characters

I'm new to MySQL and I'm working with it through phpMyAdmin.
My problem is that I have imported some tables from (.sql) files into a database with the utf8_general_ci collation, and they contain some Arabic and Persian characters. However, when I export these data into an Excel file, they appear as follows:
The original value: أحمد الكمالي
The exported value: أحمد  الكمالي
I have searched for this issue and tried to solve it by giving the output and the server connection the same collation, utf8_general_ci. But for some reason phpMyAdmin doesn't let me choose that; it forces me to choose utf8mb4_general_ci.
Anyway, when I export the data I make sure the format is UTF-8, but it still comes out like that.
How can I solve or fix it?
Note: Here are some screenshots if you want to check organized by numbers.
http://www.megafileupload.com/rbt5/Screenshots.rar
I found an easier way to rebuild the Excel file with the correct characters.
Export your data from MySQL normally, in CSV format.
Open a new Excel workbook and go to the Data tab.
Select "From Text" (if you can't find it, it is under "Get External Data").
Select your file.
Change the file origin to Unicode (UTF-8) and select Next ("Delimited" is checked by default).
Select the comma delimiter and press Finish.
You will see your language's characters correctly.
Mojibake. Probably...
The bytes you have in the client are correctly encoded in utf8mb4 (good).
You connected with SET NAMES latin1 (or set_charset('latin1') or ...), probably by default. (It should have been utf8mb4.)
The column in the tables may or may not have been CHARACTER SET utf8mb4, but it should have been that.
(utf8 and utf8mb4 work equally well for Arabic/Persian.)
Please provide more details if this explanation does not suffice.
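The mis-decoding described above is easy to reproduce, and, as long as no bytes were lost, to reverse. A Python sketch of both directions:

```python
# Classic mojibake: UTF-8 bytes interpreted as latin-1.
original = "أحمد الكمالي"

utf8_bytes = original.encode("utf-8")
mojibake = utf8_bytes.decode("latin-1")  # what a latin1 connection displays
print(mojibake)  # latin-1 gibberish (it starts with "Ø£")

# Reversing the damage: re-encode as latin-1, then decode as UTF-8.
repaired = mojibake.encode("latin-1").decode("utf-8")
assert repaired == original
```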

export .csv file for DB with chinese characters

I'm looking for a way to export my .csv, which includes some Chinese characters...
It's for WooCommerce.
I have more than 1000 products in total, half in English, half in Chinese. The problem I have is that when I export my .csv file from Excel, the Chinese characters cannot be read; I get some ???? instead.
I've tried to export as UTF-8 and have the same problem. Does anybody know a solution, with any software? It would save me a couple of days...
Thank you so much!!
Working with both Chinese and Western character sets, I have used LibreOffice to export Excel to .csv without issue. When you import a .csv into LibreOffice you can specify the character set: you get a drop-down list, and the file preview updates dynamically. This will help you find the correct set for your encoding.
It is a large download but free and cross-platform.
http://www.libreoffice.org/download/libreoffice-fresh/
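As background, the ???? are a symptom of their own: at some point the text was converted into a character set with no mapping for Chinese, and each unmappable character was substituted with a literal question mark. Once that substitution has happened the original text is gone; no later encoding change brings it back. A quick Python illustration:

```python
# Forcing text into a charset that cannot represent it replaces
# each unmappable character with '?', destroying the data.
text = "中文产品"
lossy = text.encode("ascii", errors="replace")
print(lossy)  # b'????'
```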

Keep special characters when importing into mysql from csv

I have a CSV file which contains special characters. However, when I import it into my table they do not come through. The column that needs to contain the special characters is set up as utf8_general_ci.
However, I am losing the special characters on phpMyAdmin import. For example, the degree symbol is not imported. I viewed my CSV file in a text editor and it does contain the special characters.
Please help.
Ok, I figured how to do this. See my answer below.
You only need to change the charset in MySQL when you upload the file.
In my case the charset windows-1252 worked.
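Alternatively, instead of changing the import charset, the file itself can be converted from windows-1252 to UTF-8 before the upload. A Python sketch (file names are placeholders; the degree symbol stands in for the special characters):

```python
# Convert a windows-1252 CSV to UTF-8 so it matches a utf8 table.
# File names below are placeholders.

def cp1252_to_utf8(src_path, dst_path):
    with open(src_path, encoding="cp1252") as src:
        text = src.read()
    with open(dst_path, "w", encoding="utf-8") as dst:
        dst.write(text)

# Demo: a legacy-encoded file containing the degree symbol.
with open("legacy.csv", "wb") as f:
    f.write("temp,25°C\n".encode("cp1252"))
cp1252_to_utf8("legacy.csv", "legacy_utf8.csv")
```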
OK, for anyone who has this same issue: to keep the exported CSV file in UTF-8 while keeping the special characters, do the following.
Open your file in OpenOffice Calc. When you are ready to save it, save it as a Text CSV file. While exporting, it will ask what character set to use; choose UTF-8 and presto - it does not mangle your special characters the way Excel does.
Hope that helps someone else.
Excel's Unicode export is UTF-16, but there are ways to get it to output UTF-8.
If you are on Windows you can also use the standard Notepad text editor to convert the file, by selecting UTF-8 encoding in the Save As dialog.
Try converting the charset of your CSV file to UTF-8, for example with Notepad++:
Encoding -> Convert to UTF-8 without BOM
Maybe you should check the setup of the CSV file; I think your problem begins there. Check it and make sure both encodings (the CSV file's and the SQL table's) are the same.

Problems importing excel data into MySQL via CSV

I have 12 excel files, each one with lots of data organized in 2 fields (columns): id and text.
Each Excel file uses a different language for the text field: Spanish, Italian, French, English, German, Arabic, Japanese, Russian, Korean, Chinese, Japanese and Portuguese.
The id field is a combination of letters and numbers.
I need to import every excel into a different MySQL table, so one table per language.
I'm trying to do it the following way:
- Save the excel as a CSV file
- Import that CSV in phpMyAdmin
The problem is that I'm running into all sorts of issues and can't import the files properly, probably because of encoding problems.
For example, with the Arabic one, I set everything to UTF-8 (the database table field and the CSV file), but when I do the import, I get weird characters instead of the normal arabic ones (if I manually copy them, they show fine).
Other problems I'm getting are that some texts contain commas, and since the CSV file also uses commas to separate fields, the imported texts are truncated wherever there's a comma.
Another problem is that, when saving as CSV, the characters get messed up (as with the Chinese file), and I can't find an option to tell Excel what encoding to use for the CSV file.
Is there any "protocol" or "rule" that I can follow to make sure that I do it the right way? Something that works for each different language? I'm trying to pay attention to the character encoding, but even with that I still get weird stuff.
Maybe I should try a different method instead of CSV files?
Any advice would be much appreciated.
OK, how did I solve all my issues? FORGET ABOUT EXCEL!!!
I uploaded the Excel files to Google Docs spreadsheets, downloaded them as CSV, and all the characters were perfect.
Then I just imported them into the corresponding fields of the tables, using the utf8_general_ci collation, and now everything is stored perfectly in the database.
One standard thing to do in a CSV is to enclose fields containing commas in double quotes. So
ABC, johnny can't come out, can he?, newfield
becomes
ABC, "johnny can't come out, can he?", newfield
I believe Excel does this if you choose to save as file type CSV. A problem you'll have is that CSV is ANSI-only. I think you need to use the "Unicode Text" save-as option and live with the tab delimiters or convert them to commas. The Unicode text option also quotes comma-containing values. (checked using Excel 2007)
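Any CSV library applies this quoting rule automatically; Python's csv module, for example, quotes only the fields that need it:

```python
import csv, io

# QUOTE_MINIMAL (the default) wraps a field in double quotes
# only when it contains the delimiter, a quote, or a newline.
buf = io.StringIO()
csv.writer(buf).writerow(["ABC", "johnny can't come out, can he?", "newfield"])
print(buf.getvalue().strip())
# ABC,"johnny can't come out, can he?",newfield
```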
EDIT: Add specific directions
In Excel 2007 (the specifics may be different for other versions of Excel)
Choose "Save As"
In the "Save as type:" field, select "Unicode Text"
You'll get a Unicode file. UCS-2 Little Endian, specifically.
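If you end up with that UTF-16-LE "Unicode Text" file, converting it into a comma-delimited UTF-8 CSV is mechanical. A Python sketch (the file names are placeholders):

```python
import csv

# Convert Excel's "Unicode Text" output (UTF-16 with BOM,
# tab-delimited) into a UTF-8 comma-separated file.

def unicode_text_to_csv(src_path, dst_path):
    # The "utf-16" codec honours the BOM that Excel writes.
    with open(src_path, encoding="utf-16", newline="") as src:
        rows = list(csv.reader(src, delimiter="\t"))
    with open(dst_path, "w", encoding="utf-8", newline="") as dst:
        csv.writer(dst).writerows(rows)

# Demo with a stand-in for the exported sheet:
with open("sheet.txt", "w", encoding="utf-16", newline="") as f:
    f.write("id\ttext\nA1\tこんにちは\n")
unicode_text_to_csv("sheet.txt", "sheet.csv")
```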