I'm looking for a way to export my .csv, which includes some Chinese characters....
It's for WooCommerce.
I have more than 1000 products in total, half in English, half in Chinese. The problem I have is that when I export my .csv file from Excel, the Chinese characters cannot be read; I get some ???? instead.
I've tried exporting in UTF-8 and have the same problem. Does anybody know a solution with any software? It would save me a couple of days . . .
Thank you so much !!
Working with both Chinese and Western character sets, I have used LibreOffice to export Excel files to .csv without issue. When you import a .csv into LibreOffice you can specify the character set: there is a drop-down list, and the file preview updates dynamically, which will help you find the correct set for your encoding.
It is a large download, but it is free and cross-platform.
http://www.libreoffice.org/download/libreoffice-fresh/
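If you would rather check the encoding from a script, here is a minimal Python sketch that previews the first rows of the file under a few candidate encodings, similar to what the LibreOffice preview does. The file name and the candidate list are assumptions for illustration:

```python
# Sketch: preview a CSV under several candidate encodings to find the one
# that renders the Chinese text correctly.
import csv

CANDIDATES = ["utf-8-sig", "utf-8", "gb18030", "big5", "cp1252"]

for enc in CANDIDATES:
    try:
        with open("products.csv", encoding=enc, newline="") as f:
            rows = [row for _, row in zip(range(3), csv.reader(f))]
        print(f"--- {enc} ---")
        for row in rows:
            print(row)
    except (UnicodeDecodeError, LookupError):
        print(f"--- {enc}: cannot decode ---")
```

Whichever candidate prints readable Chinese text is the encoding to use when re-saving or importing the file.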
Related
I have a SAS table that contains hundreds of thousands of rows and several text fields, and I need to import this table into an Access database.
The fields contain names in Hebrew characters and special characters such as commas, colons, brackets, quotes, double quotes and any other character you can think of.
I've tried exporting the table as a CSV file and importing it into my Access database and encountered 2 issues:
Access does not recognize the Hebrew characters
Every time there is a special character that is also defined as a delimiter in the Access import query, the data is read incorrectly.
Any ideas?
I'm using SAS 9.2 and Access 2010 on Windows XP. I'll probably be upgrading to Windows 7 and SAS 9.4 soon, so I can have integrated connectivity between Access and SAS. Does anyone know if that solves these problems?
Thanks.
Okay folks, I found the answer, and it's really simple.
Instead of exporting to a CSV file and then importing it into Access, there is an option to export data directly from SAS to an Access database (somehow I missed it before...).
It seems to work well. It keeps the Hebrew characters and doesn't mess up the data. The SAS table and the Access table are not linked, but that's not an issue in my current application.
Code used:
```sas
PROC EXPORT DATA=lib.table
    OUTTABLE="table1"
    DBMS=ACCESS REPLACE;
    DATABASE="L:\test.accdb";
RUN;
```
I am using phpMyAdmin (version 4.6.4) as a platform to import a CSV encoded as UTF-8 into a database. I am able to import the data, but I have no idea why the first two characters of the first column in the first row go missing, and this happens every time I import a CSV.
raw: A1610011111-001,N,N,N,N,N,N,N,N,N,N,N,N,N,--
This is what the data is supposed to be: the first field starts with A1.
This is the imported data: the A1 went missing.
If the data has more than one row, the result is the same (only the first two characters of the first row go missing).
I am not sure what the problem is or what the solution might be. Please give me a hand with this.
Well, for anyone still searching for an answer, this is what worked for me after numerous tries.
In Excel (365), you can choose between
save as CSV UTF-8 (separated by commas)
save as CSV (separated by delimiter)
As contradictory as it may seem, when I use the first option I lose my first 2 characters, whereas with the second option I lose nothing.
So saving without the UTF-8 option seems to do the trick.
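A likely explanation is that Excel's "CSV UTF-8" option writes a byte order mark (BOM) at the start of the file and the importer reads those bytes as data, corrupting the first field. If you want to keep the file in UTF-8, one workaround is to strip the BOM before importing. A minimal Python sketch, with file names as assumptions:

```python
# Sketch: re-save a CSV without the UTF-8 byte order mark (BOM) before
# importing it with phpMyAdmin. "utf-8-sig" strips a leading BOM if present;
# plain "utf-8" writes none.
with open("export_from_excel.csv", encoding="utf-8-sig", newline="") as src, \
     open("import_ready.csv", "w", encoding="utf-8", newline="") as dst:
    dst.write(src.read())
```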
I have a CSV file which contains special characters. However, when I import them into my table, they do not import. The column that needs to contain the special characters is set up with the utf8_general_ci collation.
However, I am losing the special characters on the phpMyAdmin import. For example, the degree symbol is not imported. I viewed my CSV file in a text editor and it does contain the special characters.
Please help.
Ok, I figured out how to do this. See my answer below.
You only need to change the character set in MySQL when you upload the file.
In my case, the charset windows-1252 worked.
Ok,
for anyone who has this same issue and is trying to keep the exported CSV file in UTF-8 while preserving the special characters, do the following.
Open your file in OpenOffice Calc. When you are ready to save it, save it as a Text CSV file. While exporting, it will ask you what format to save it in. Save it as UTF-8 and presto. It does not mangle your special characters the way Excel does.
Hope that helps someone else.
Excel usually outputs CSV as UTF-16, but there are ways to get it to output in UTF-8 as explained here
If you are on Windows you can also use the standard Notepad text editor to convert the file by selecting UTF-8 encoding in the Save As dialog.
Try converting the charset of your CSV file to UTF-8.
For example, with Notepad++:
Encoding -> Convert to UTF-8 without BOM
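If you would rather script the conversion than use an editor, here is a minimal Python sketch that re-encodes the file to UTF-8 without a BOM. The file names and the source encoding (windows-1252, as in the answer above) are assumptions:

```python
# Sketch: re-encode a CSV from windows-1252 to UTF-8 (no BOM), streaming in
# chunks so a large export is not loaded into memory all at once.
import shutil

with open("data.csv", encoding="cp1252", newline="") as src, \
     open("data_utf8.csv", "w", encoding="utf-8", newline="") as dst:
    shutil.copyfileobj(src, dst)
```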
Maybe you should check the setup of the CSV file itself. I think your problem begins in that file. Check it and make sure that both encodings are the same (the CSV and the SQL table).
I have 12 Excel files, each one with lots of data organized in 2 fields (columns): id and text.
Each Excel file uses a different language for the text field: Spanish, Italian, French, English, German, Arabic, Japanese, Russian, Korean, Chinese, Japanese and Portuguese.
The id field is a combination of letters and numbers.
I need to import each Excel file into a different MySQL table, so one table per language.
I'm trying to do it the following way:
- Save the Excel file as a CSV file
- Import that CSV in phpMyAdmin
The problem is that I'm running into all sorts of issues and I can't get them to import properly, probably because of encoding problems.
For example, with the Arabic one, I set everything to UTF-8 (the database table field and the CSV file), but when I do the import, I get weird characters instead of the normal Arabic ones (if I copy them in manually, they show fine).
Another problem is that some texts contain commas, and since the CSV file also uses commas to separate fields, the imported texts are truncated wherever there is a comma.
Another problem is that, when saving as CSV, the characters get messed up (as with the Chinese one), and I can't find an option to tell Excel what encoding I want to use for the CSV file.
Is there any "protocol" or "rule" that I can follow to make sure that I do it the right way? Something that works for each different language? I'm trying to pay attention to the character encoding, but even with that I still get weird results.
Maybe I should try a different method instead of CSV files?
Any advice would be much appreciated.
OK, how did I solve all my issues? FORGET ABOUT EXCEL!!!
I uploaded the Excel files to Google Docs spreadsheets, downloaded them as CSV, and all the characters came out perfectly.
Then I just imported them into the corresponding fields of the tables, using a utf8_general_ci collation, and now everything is uploaded perfectly in the database.
One standard thing to do in a CSV is to enclose fields containing commas in double quotes. So
ABC, johnny can't come out, can he?, newfield
becomes
ABC, "johnny can't come out, can he?", newfield
I believe Excel does this if you choose to save as file type CSV. A problem you'll have is that CSV is ANSI-only. I think you need to use the "Unicode Text" save-as option and live with the tab delimiters or convert them to commas. The Unicode text option also quotes comma-containing values. (checked using Excel 2007)
EDIT: Add specific directions
In Excel 2007 (the specifics may be different for other versions of Excel)
Choose "Save As"
In the "Save as type:" field, select "Unicode Text"
You'll get a Unicode file. UCS-2 Little Endian, specifically.
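If you then need that tab-delimited Unicode Text file back as a UTF-8, comma-delimited CSV, here is a minimal sketch; the file names are assumptions:

```python
# Sketch: convert Excel's "Unicode Text" output (UTF-16, tab-delimited) to a
# UTF-8, comma-delimited CSV.
import csv

with open("export.txt", encoding="utf-16", newline="") as src, \
     open("export.csv", "w", encoding="utf-8", newline="") as dst:
    reader = csv.reader(src, delimiter="\t")
    writer = csv.writer(dst)  # default comma delimiter, minimal quoting
    writer.writerows(reader)
```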
I have some data in one of my MySQL tables stored as UTF-8. The data is some Japanese text. I need to export it to Excel. Could you tell me how to do it?
Exporting with SELECT ... INTO OUTFILE returns a plain text file. I'm not sure how to read it back into Excel so that the Japanese characters show properly.
Thanks
Nayn
Just provide CHARACTER SET charset_name as an export option when you do SELECT ... INTO OUTFILE (see http://dev.mysql.com/doc/refman/5.0/en/select.html) and you will be fine. If you have trouble opening the file directly in Excel, have Excel import the data using the universal file filter in the Open dialog; it will then let you select the file encoding for the import.
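As a sketch of what that export looks like from Python: the connection details, table and column names, character set and output path are all assumptions, and note that INTO OUTFILE writes the file on the MySQL server host:

```python
# Sketch: run SELECT ... INTO OUTFILE with an explicit character set so the
# exported file is written as UTF-8. Requires the FILE privilege on the server.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="user",
                               password="secret", database="mydb")
cur = conn.cursor()
cur.execute(r"""
    SELECT id, japanese_text
    INTO OUTFILE '/tmp/export_utf8.csv'
    CHARACTER SET utf8
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    FROM my_table
""")
conn.close()
```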
Actually, the text file retains the data as is. It is just that if you open it directly in Excel, Excel misinterprets the character encoding of the file. I opened it in Notepad and saved it with UTF-8 encoding; the next time I opened it in Excel, it showed the characters properly.
Thanks
Nayn