I have a database that is utf8_general_ci; the only problem is that, up until this point, an application coded by a previous developer seems to have been working with the database in latin-1.
I have since changed how the app uses the database and can now store an umlaut as an umlaut instead of as 'ü'. The problem now is that the application reads the previously existing data from the database as (for example) 'SÃ¼ddeutsche' instead of 'Süddeutsche'.
Is there any way to convert the data inside the database from one format to the other?
Regards
Edit:
ALTER TABLE TableName MODIFY ColumnName ColumnType CHARACTER SET latin1;
ALTER TABLE TableName MODIFY ColumnName ColumnType CHARACTER SET binary;
ALTER TABLE TableName MODIFY ColumnName ColumnType CHARACTER SET utf8;
This worked for me.
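To see why a conversion step is needed at all, here is a quick Python sketch (Python used purely for illustration) of how UTF-8 bytes read through a latin-1 connection turn into mojibake, and how reversing the two steps recovers the text:

```python
# A latin1 column/connection holding UTF-8 bytes: the mojibake recipe.
original = "Süddeutsche"
stored_bytes = original.encode("utf-8")      # b'S\xc3\xbcddeutsche'
as_latin1 = stored_bytes.decode("latin-1")   # what the app displays now
print(as_latin1)                             # SÃ¼ddeutsche

# Reversing the two steps recovers the original text.
recovered = as_latin1.encode("latin-1").decode("utf-8")
print(recovered)                             # Süddeutsche
```

The three-step ALTER above does the same thing inside MySQL: going through binary strips the (wrong) latin1 label off the bytes without touching them, so they can be relabelled as utf8.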
You could try SET NAMES to let the database talk latin-1 with your application while storing utf-8, or you will need to convert all the previous data to utf-8 strings.
try
ALTER DATABASE your_db DEFAULT CHARACTER SET = 'utf8' COLLATE 'utf8_unicode_ci';
and
ALTER TABLE a CONVERT TO CHARACTER SET 'utf8' COLLATE 'utf8_unicode_ci';
ALTER TABLE b CONVERT TO CHARACTER SET 'utf8' COLLATE 'utf8_unicode_ci';
ALTER TABLE c CONVERT TO CHARACTER SET 'utf8' COLLATE 'utf8_unicode_ci';
don't forget to replace the 'ß':
UPDATE a SET field_1 = REPLACE(field_1, 'ß', 'ss') WHERE label LIKE '%ß%';
http://blog.hno3.org/2010/04/22/fixing-double-encoded-utf-8-data-in-mysql/
Using what is listed here has fixed all my problems. I used this with my live data and have had no issues!
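If you would rather repair the text in application code instead of SQL, the core of the fix for double-encoded strings is small. A hedged Python sketch (the function name is my own, not from the linked blog):

```python
def fix_double_encoded(text: str) -> str:
    """Reverse one round of latin1 -> utf8 double encoding.

    Falls back to returning the input unchanged if the bytes are
    not valid UTF-8 (i.e. the string was not double-encoded).
    """
    try:
        return text.encode("latin-1").decode("utf-8")
    except (UnicodeEncodeError, UnicodeDecodeError):
        return text

print(fix_double_encoded("SÃ¼ddeutsche"))  # Süddeutsche
print(fix_double_encoded("Süddeutsche"))   # already clean, left alone
```

This assumes each row was double-encoded exactly once; rows that went through the mangle twice need the function applied twice.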
Related
I need to change the character set on two MySQL tables to UTF8. Initially I came up with the following query, but I wanted to know if this is a safe way to change the character set on (live) tables which have foreign keys. In my dev environment I just used FOREIGN_KEY_CHECKS = 0 without the lock, but I wasn't so concerned about integrity at the time; with the live system I am.
LOCK TABLES
tags WRITE,
tags_lost WRITE;
SET FOREIGN_KEY_CHECKS = 0;
ALTER TABLE tags CHARACTER SET utf8 COLLATE utf8_general_ci;
ALTER TABLE tags CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
ALTER TABLE tags_lost CHARACTER SET utf8 COLLATE utf8_general_ci;
ALTER TABLE tags_lost CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
SET FOREIGN_KEY_CHECKS = 1;
UNLOCK TABLES;
The other option is to lock the tables, drop the foreign key, alter the char set, and then recreate the key but I was just hoping this shortcut would work just as well.
Don't change the charset, only do the ALTER ... CONVERT TO .... The latter does both the declaration and the data.
If you have already done both, check the data:
SELECT col, HEX(col) FROM exampleTable WHERE ...
and let's see if the hex is correct. You could have ended up with the dreaded "double encoding".
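One way to know what the HEX() output should look like is to compute the expected values ahead of time, for example in Python (used here just as a calculator):

```python
text = "ü"
# Hex you should see when the column is stored correctly as utf8:
good = text.encode("utf-8").hex().upper()
# Hex you will see if the data was double-encoded (utf8 bytes
# mistaken for latin1 and utf8-encoded a second time):
bad = text.encode("utf-8").decode("latin-1").encode("utf-8").hex().upper()
print(good)  # C3BC
print(bad)   # C383C2BC
```

If SELECT HEX(col) shows the second pattern, you are looking at double encoding rather than a plain charset mismatch.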
I am using Navicat 11.0.8 and I am trying to insert values into a table using a query, but when I try to insert the values with the query it works, except the character encoding is messed up!
As you can see in my code, the table is 'table' and I am inserting an ID, a VNUM and a NAME. The VNUM is '体字' and the NAME is 'Versão'.
INSERT INTO table VALUES ('1', '体字', 'Versão');
Instead of showing '体字' on the VNUM and 'Versão' on the NAME, it shows '体å—' and 'Versão'.
This is very bad for me because I am trying to insert more than 5000 lines with a lot of information.
I have tried to set the character encoding of the table using these commands:
ALTER TABLE table CONVERT TO CHARACTER SET utf8 COLLATE utf8_unicode_ci;
&
ALTER TABLE table CONVERT TO CHARACTER SET big5 COLLATE big5_chinese_ci;
&
ALTER TABLE table CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
I have also tried to delete my table and create a new one with the character encoding already set to utf-8, and then insert the values:
SET FOREIGN_KEY_CHECKS=0;
DROP TABLE IF EXISTS `table`;
CREATE TABLE `table` (
`vnum` int(11) unsigned NOT NULL default '0',
`name` varbinary(200) NOT NULL default 'Noname',
`locale_name` varbinary(24) NOT NULL default 'Noname'
) ENGINE=MyISAM DEFAULT CHARSET=big5 COLLATE=big5_chinese_ci;
INSERT INTO `table` VALUES ('1', '体字', 'Versão');
It still shows '体å—' and 'Versão'.
If I edit the table manually, it shows correctly! But I am not going to edit 5000+ lines...
it shows '体å—' and 'Versão'. -- Sounds like you had SET NAMES latin1. Do
SET NAMES utf8;
after connecting and before INSERTing. That tells mysqld what encoding the client is using.
Verify the data stored by doing SELECT col, HEX(col) ... The utf8 (or utf8mb4) hex for 体字 is E4BD93E5AD97. Interpreting those same bytes as latin1 gives you ä½“å­—.
utf8 can handle those, plus ã; I don't know if big5 can.
Actually, I suggest you use utf8mb4 instead of utf8. This is in case you encounter some of the 4-byte Chinese characters.
If you still have latin1 columns that need changing to utf8mb4, see my blog, which discusses the "2-step ALTER", but using BINARY or BLOB (not big5) as the intermediate.
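The hex value quoted above is easy to double-check outside MySQL, for instance in Python (latin-1 maps every byte to a code point, so the decode below never fails):

```python
text = "体字"
utf8_hex = text.encode("utf-8").hex().upper()
print(utf8_hex)  # E4BD93E5AD97

# What those same bytes look like when misread as latin1
# (some of the six characters are invisible control codes):
mojibake = bytes.fromhex(utf8_hex).decode("latin-1")
print(mojibake)
```

Comparing your own SELECT HEX(col) output against a computation like this tells you whether the bytes on disk are right and only the connection charset is wrong, or whether the stored bytes themselves are mangled.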
I am using Django on Bluehost. I created a form for user-generated input, but unicode input from this form fails to be stored or displayed correctly. So I did an SO and Google search which said I should change the collation and character set of my database. I ran this SQL
ALTER DATABASE learncon_pywithyou CHARACTER SET utf8 COLLATE utf8_unicode_ci;
from python27 manage.py dbshell, which initiated a mysql shell; what shows on screen is
Query OK, 1 row affected (0.00 sec)
So I assumed the problem was solved, but it is not, actually. This SQL has not done anything, as I later found in phpMyAdmin provided by Bluehost: all the varchar fields of all the tables are still in the latin1_swedish_ci collation.
So I assumed that ALTER TABLE should work instead. I ran this in mysql:
alter table mytable character set utf8 collate utf8_unicode_ci;
Although on screen it shows Query OK, 4 rows affected, it actually did nothing either: the collation of those fields in mytable did not change at all.
So I finally changed the fields manually in phpMyAdmin for mytable, and this works; now I am able to insert unicode into this table and it displays correctly. But I have around 20 tables like this, and I don't want to change them one by one manually.
Do we have a simple and effective way of changing the collation of each field so that it stores and displays unicode correctly?
Changing collation at the database level sets the default for new objects - existing collations will not be changed.
Similarly, at the table level, only new columns are affected by this:
alter table mytable character set utf8 collate utf8_unicode_ci;
However, to convert the collation of existing columns, you need to add convert to:
alter table mytable convert to character set utf8 collate utf8_unicode_ci;
In addition to @StuartLC's answer:
To change the charset and collation of all 20 tables, use the query below (here world is the database name):
SELECT
CONCAT("ALTER TABLE ",TABLE_SCHEMA , ".",TABLE_NAME," CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci") AS AlterSQL
FROM information_schema.TABLES
WHERE TABLE_SCHEMA = "world";
The above will prepare all ALTER queries which you need to run.
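If you prefer generating the statements outside MySQL, the same trick is a short script in any language. A hypothetical Python helper (the function name and defaults are my own; the table list would come from information_schema.TABLES):

```python
def build_alter_statements(schema, tables,
                           charset="utf8", collation="utf8_general_ci"):
    # One ALTER ... CONVERT TO statement per table, mirroring the
    # CONCAT() query against information_schema above.
    return [
        f"ALTER TABLE {schema}.{table} "
        f"CONVERT TO CHARACTER SET {charset} COLLATE {collation};"
        for table in tables
    ]

for stmt in build_alter_statements("world", ["city", "country"]):
    print(stmt)
```

Either way, review the generated statements before running them: CONVERT TO rewrites the data, not just the declaration.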
I am asked to set all table's default COLLATE to utf8_bin. How to do this?
How to do this?
One at a time, I'm afraid...
alter table <some_table> convert to character set utf8 collate utf8_bin;
There's no bulk method, unless you feel like using mysqldump to get the whole DB out, edit the resulting dump to add the required collation and then reimporting the whole DB again.
SELECT CONCAT("ALTER TABLE ", TABLE_NAME," COLLATE utf8_bin") AS String
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA="YourDatabaseName"
AND TABLE_TYPE="BASE TABLE"
One query to rule them all: just run the result of this query.
ALTER TABLE <table name> COLLATE utf8_bin;
If you also need to update the existing character encoding (unlikely by the sounds of things), you can use:
ALTER TABLE <table name> CONVERT TO CHARACTER SET utf8 COLLATE utf8_bin;
In /etc/my.cnf the following has been added
character-set-server=utf8
collation-server=utf8_general_ci
But for the databases and tables created before adding the above, how do I convert them to utf8 with the matching collation settings?
Well, the database character set and table character set are just defaults (they don't affect anything directly). You'd need to modify each column to the proper charset. PHPMyAdmin will do this for you (just edit the column, then change the character set). If you want to do raw SQL, you'll need to know the column definition (SHOW CREATE TABLE foo will show you the definition). Then, you can use ALTER TABLE to change the definition.
To change the default charset for a table:
ALTER TABLE `tablename` DEFAULT CHARACTER SET 'utf8' COLLATE 'utf8_general_ci';
To change the charset of a column with the definition `foo VARCHAR(128) CHARACTER SET 'foo' COLLATE 'foo'`:
ALTER TABLE `tablename` MODIFY
`foo` VARCHAR(128) CHARACTER SET 'utf8' COLLATE 'utf8_general_ci';
https://serverfault.com/questions/65043/alter-charset-and-collation-in-all-columns-in-all-tables-in-mysql
And:
http://www.mysqlperformanceblog.com/2009/03/17/converting-character-sets/