MySQL: data being mangled while changing column to UTF8

I am migrating a MySQL database that was created in LATIN1 to UTF8. To do that, I am first changing each column to the corresponding binary type and then to UTF8:
ALTER TABLE clientes CHARACTER SET utf8;
ALTER TABLE clientes change nombre nombre varbinary(255);
ALTER TABLE clientes change nombre nombre varchar(255) character set utf8;
According to all the docs I've found, that's the right way to prevent the data from being mangled.
However, the data still gets mangled anyway. Two examples:
The word "Larrasoaña" gets truncated at the "ñ", with the error: Warning: #1366 Incorrect string value: '\xF1a' for column 'nombre'
The word "Jesús y María" gets truncated at the "ú", with the error Warning: #1366 Incorrect string value: '\xFAs y M...' for column 'nombre'
How did that data get in? Well, the DB is the backend of a PHP web app, which used UTF8 for everything (including connecting to the MySQL server with "SET NAMES UTF8")... except creating the database properly. So I assume all the data that was added was in UTF8.
To summarize: it seems that I had UTF8 text stored in LATIN1 columns, and now that I try to change the columns to UTF8, the text gets truncated.
Why is this happening? What can I do?
EDIT: I forgot to mention, I'm doing all of this from phpMyAdmin, since I don't have command-line access.

F1 and FA are the latin1 encodings of ñ and ú. You need to tell MySQL that the incoming data is latin1; one way is via SET NAMES latin1.
But note that this is independent of the character set of the column you are storing the data into. (These days, utf8mb4 is the preferred character set for text.) MySQL will convert between the column's encoding and the client's encoding, but you must tell it the client's encoding via connection parameters (or SET NAMES).
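The byte-level situation can be sketched outside MySQL; here is a minimal illustration in Python (used only because it makes the bytes easy to inspect):

```python
# 0xF1 and 0xFA are valid latin1 characters...
assert "ñ".encode("latin1") == b"\xf1"
assert "ú".encode("latin1") == b"\xfa"

# ...but the same characters in UTF-8 take two bytes each:
assert "ñ".encode("utf-8") == b"\xc3\xb1"
assert "ú".encode("utf-8") == b"\xc3\xba"

# Feeding the bare latin1 byte to a strict UTF-8 decoder fails -- the
# Python analogue of MySQL's "Incorrect string value: '\xF1a'" warning:
try:
    b"\xf1a".decode("utf-8")
    raised = False
except UnicodeDecodeError:
    raised = True
assert raised
```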
The pair of ALTER TABLEs works for certain situations, not all situations! You probably wanted the first entry in http://mysql.rjweb.org/doc.php/charcoll#fixes_for_various_cases
Table is CHARACTER SET latin1 and correctly encoded in latin1; want
utf8mb4:
ALTER TABLE tbl CONVERT TO CHARACTER SET utf8mb4;
I don't happen to know if your data is irreparably hosed. Please provide one of the lines, together with its hex.
Hex
"Larrasoaña" is encoded as 4C61727261736F61F161, and "Jesús y María" as 4A6573FA732079204D6172ED6120
Those are latin1-encoded (or latin5 or dec8). If the table definition (SHOW CREATE TABLE) says latin1, then you could leave it alone. (latin1 handles Western European languages, but not Asian.)
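That hex dump can be re-derived outside MySQL; a quick Python check (for illustration only):

```python
# Encoding "Larrasoaña" as latin1 reproduces exactly the bytes shown
# in the question's hex dump (note the F1 for ñ near the end):
assert "Larrasoaña".encode("latin1").hex().upper() == "4C61727261736F61F161"

# In UTF-8 the same word would instead end in C3 B1 61 (ñ is two bytes):
assert "Larrasoaña".encode("utf-8").hex().upper().endswith("C3B161")
```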
If you want to convert all the text columns to utf8 or utf8mb4, do an ALTER like the one I presented above. Your 3-Alter approach will not work correctly; it assumes the bytes in the latin1 column are really UTF-8 bytes (which they aren't).
But... You must specify the client's encoding based on what the client wants. And it does not matter whether the client and the table agree since conversion will be provided.
Why the 3-step Alter fails
ALTER TABLE clientes CHARACTER SET utf8; -- This sets the default charset for new columns. It has no effect on the existing column definitions or on any data in those columns.
ALTER TABLE clientes change nombre nombre varbinary(255); -- This says "forget about any text encoding". That is, F1 is now just a bunch of bits, not the latin1 representation of ñ.
ALTER TABLE clientes change nombre nombre varchar(255) character set utf8; -- This takes those varbinary bits and says "let's treat them as utf8". And that gives the error message, because F1 is not a valid byte sequence in utf8.
That procedure is appropriate if the bytes are already utf8 bytes; that is, if ñ were already stored as the 2-byte C3B1. (By the way, that situation usually manifests itself as 'Mojibake': the text displays as Ã± when interpreted as latin1.)
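The Mojibake case can be reproduced in a couple of lines of Python (purely illustrative):

```python
# If a column already held UTF-8 bytes (C3 B1 for ñ), reading those
# bytes back as if they were latin1 shows the classic Mojibake pair:
utf8_bytes = "ñ".encode("utf-8")        # b'\xc3\xb1'
assert utf8_bytes.decode("latin1") == "Ã±"
```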
The 1-Alter procedure...
ALTER TABLE clientes CONVERT TO CHARACTER SET utf8; (to convert the entire table) or ALTER TABLE clientes MODIFY nombre varchar(255) CHARACTER SET utf8; (to convert just one column). Either statement does the following:
For each text (char/varchar/text) column, it reads the data according to its current encoding (latin1: F1), converts it to utf8 (or utf8mb4) (C3B1), and writes it back into the row. Meanwhile, it changes the declaration to CHARACTER SET utf8.
That is, it is the 'right' process for changing the CHARACTER SET without changing the "text". True, the encoding changed (F1 -> C3B1), but that is in keeping with the change to the CHARACTER SET.
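What CONVERT TO does for one value can be simulated outside MySQL; a Python sketch of the decode-then-re-encode step:

```python
# Simulate CONVERT TO CHARACTER SET: read the stored bytes in the
# column's current charset, then re-encode them in the new charset.
stored = b"\xf1"                    # latin1 bytes for ñ
text = stored.decode("latin1")      # interpret per the old charset
converted = text.encode("utf-8")    # re-encode per the new charset
assert converted == b"\xc3\xb1"     # F1 -> C3B1; the text is unchanged
assert converted.decode("utf-8") == "ñ"
```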
Recovery
Your first 2 ALTERs worked, correct? Did the 3rd one succeed, fail, or leave a messed up table?
If it aborted, leaving varbinary in place, then do 2 more alters: First go back to latin1; then go straight to utf8.
If it left you with a messed up column, especially if rows are truncated, then you need to go back to a backup, or otherwise reload the data.

Incorrect string value error for unconventional characters

So I'm using a wrapper to fetch user data from Instagram. I want to select users' display names and store them in a MySQL database. I'm having issues inserting some of the display names; specifically, I get an incorrect string value error:
Now, I've dealt with this issue before with accent marks, letters with umlauts, etc. The solution would be to change the collation to utf8_general_ci under the utf8 charset.
So as you can see, some of the display names I'm pulling have very unusual characters that I'm not sure MySQL can recognize at all, e.g.:
ᛘ𝕰𝖆𝖗𝖙𝖍 𝕾𝖕𝖎𝖗𝖎𝖙𝖚𝖘𐂂®
So I receive:
Error Code: 1366. Incorrect string value: '\xF0\x9D\x99\x87\xF0\x9D...' for column 'dummy' at row 1
Here's my SQL code:
CREATE TABLE test_table(
id INT AUTO_INCREMENT,
dummy VARCHAR(255),
PRIMARY KEY(id)
);
INSERT INTO test_table (dummy)
VALUES ('ᛘ𝕰𝖆𝖗𝖙𝖍 𝕾𝖕𝖎𝖗𝖎𝖙𝖚𝖘𐂂®');
Any thoughts on a proper charset + collation pair that can handle characters like this? Not sure where to look for a solution, so I come here to see if anyone dealt with this.
P.S., I've tried utf8mb4 charset with utf8mb4_unicode_ci and utf8mb4_bin collations as well.
The characters you show require the column use the utf8mb4 encoding. Currently it seems your column is defined with the utf8mb3 encoding.
The way MySQL uses the name "utf8" is complicated, as described in https://dev.mysql.com/doc/refman/8.0/en/charset-unicode-utf8mb3.html:
Note
Historically, MySQL has used utf8 as an alias for utf8mb3;
beginning with MySQL 8.0.28, utf8mb3 is used exclusively in the output
of SHOW statements and in Information Schema tables when this
character set is meant.
At some point in the future utf8 is expected to become a reference to
utf8mb4. To avoid ambiguity about the meaning of utf8, consider
specifying utf8mb4 explicitly for character set references instead of
utf8.
You should also be aware that the utf8mb3 character set is deprecated
and you should expect it to be removed in a future MySQL release.
Please use utf8mb4 instead.
You may have tried to change your table in the following way:
ALTER TABLE test_table CHARSET=utf8mb4;
But that only changes the default character set, to be used if you add new columns to the table subsequently. It does not change any of the current columns. To do that:
ALTER TABLE test_table MODIFY COLUMN dummy VARCHAR(255) CHARACTER SET utf8mb4;
Or to convert all string or TEXT columns in a table in one statement:
ALTER TABLE test_table CONVERT TO CHARACTER SET utf8mb4;
That would be 𝙇 (U+1D647, MATHEMATICAL SANS-SERIF BOLD ITALIC CAPITAL L).
It requires the utf8mb4 character set even to be represented. The "F0" is the clue: it is the first of 4 bytes in a 4-byte UTF-8 character, and such characters cannot be represented in MySQL's "utf8". Collation is (mostly) irrelevant here.
Most, not all, of the characters in ᛘ𝕰𝖆𝖗𝖙𝖍 𝕾𝖕𝖎𝖗𝖎𝖙𝖚𝖘𐂂® also need utf8mb4; they are "MATHEMATICAL BOLD FRAKTUR" letters.
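The "F0 means 4 bytes" point can be checked directly; a small Python illustration:

```python
# U+1D647 (the 𝙇 from the error message) lies outside the BMP, so its
# UTF-8 encoding is 4 bytes long and starts with the 0xF0 lead byte:
ch = "\U0001D647"
encoded = ch.encode("utf-8")
assert encoded == b"\xf0\x9d\x99\x87"   # matches '\xF0\x9D\x99\x87' in the error
assert len(encoded) == 4

# An ordinary accented letter needs only 2 bytes, well within utf8mb3:
assert len("ñ".encode("utf-8")) == 2
```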
(Meanwhile, Bill gives you more of an answer.)

Utf8 and latin1

Half of the tables in a database with a lot of data are set to latin1.
Within these latin1 tables, most rows are set to utf8 if the row is expecting text input (anything that's not an integer).
Everything is in English.
How bad is my situation if I need to convert these latin1 tables to utf8?
First, a clarification: You said "most rows are set to utf8"; I assume you meant "most columns are set to utf8"?
The meaning of the latin1 on the table is just a default. It has no impact on performance, etc.
The only "harm" occurs if you do ALTER TABLE .. ADD COLUMN .. without specifying CHARACTER SET utf8.
You say that all the text is English? Then there is no encoding difference between latin1 and utf8: ASCII characters have identical encodings in both. Problems can occur only when you have accented letters, etc.
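That "no difference for English" claim is easy to verify; a one-assertion Python illustration:

```python
# Plain-ASCII text produces byte-identical encodings in latin1 and
# UTF-8, so converting such columns changes no stored bytes:
s = "Everything is in English"
assert s.encode("latin1") == s.encode("utf-8")
```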
There is one performance issue: If you JOIN two tables together on a VARCHAR column, but the CHARACTER SET or COLLATION differs for that column in the two tables, it will be slower than if those settings are the same. (It does not sound like you have this problem.) Again, note that the table's default is not relevant, only the column settings themselves.
Yes, it would be 'cleaner' to have the table default set to utf8. Here is a way to do it without dumping, by changing the tables one at a time:
ALTER TABLE t CONVERT TO CHARACTER SET utf8;
That will change the table default and any columns that are not already utf8.
Alternatively, the whole database can be converted with a dump-and-reload pipeline:
mysqldump --add-drop-table database_to_correct | replace CHARSET=latin1 CHARSET=utf8 | iconv -f latin1 -t utf8 | mysql database_to_correct

MySQL character encoding change. Is data integrity preserved?

I will have to convert the database encoding from latin1 to utf8.
I'm aware of the fact that converting the database is done via the command of
ALTER DATABASE db_name
[[DEFAULT] CHARACTER SET charset_name]
[[DEFAULT] COLLATE collation_name]
Source and converting an existing table is done via the command of
ALTER TABLE tbl_name
[[DEFAULT] CHARACTER SET charset_name]
[COLLATE collation_name]
Source.
However, the database is already existent and there is sensitive information involved. My question is whether the data I already have will be changed. The purpose of this question is that I have to give an estimate before I do the change.
Every (character string-type) column has its own character set and collation metadata.
If, when the column's data type was specified (i.e. when it was last created or altered), no character set/collation was explicitly given, then the table's default character set and collation would be used for the column.
If, when the table was specified, no default character set/collation was explicitly given, then the database's default character set and collation would be used for the table's default.
The commands that you quote in your question merely alter such default character sets/collations for the database and table respectively. In other words, they will only affect tables and columns that are created thereafter—they will not affect existing columns (or data).
To update existing data, you should first read the Changing the Character Set section of the manual page on ALTER TABLE:
Changing the Character Set
To change the table default character set and all character columns (CHAR, VARCHAR, TEXT) to a new character set, use a statement like this:
ALTER TABLE tbl_name CONVERT TO CHARACTER SET charset_name;
The statement also changes the collation of all character columns. If you specify no COLLATE clause to indicate which collation to use, the statement uses default collation for the character set. If this collation is inappropriate for the intended table use (for example, if it would change from a case-sensitive collation to a case-insensitive collation), specify a collation explicitly.
For a column that has a data type of VARCHAR or one of the TEXT types, CONVERT TO CHARACTER SET changes the data type as necessary to ensure that the new column is long enough to store as many characters as the original column. For example, a TEXT column has two length bytes, which store the byte-length of values in the column, up to a maximum of 65,535. For a latin1 TEXT column, each character requires a single byte, so the column can store up to 65,535 characters. If the column is converted to utf8, each character might require up to three bytes, for a maximum possible length of 3 × 65,535 = 196,605 bytes. That length does not fit in a TEXT column's length bytes, so MySQL converts the data type to MEDIUMTEXT, which is the smallest string type for which the length bytes can record a value of 196,605. Similarly, a VARCHAR column might be converted to MEDIUMTEXT.
To avoid data type changes of the type just described, do not use CONVERT TO CHARACTER SET. Instead, use MODIFY to change individual columns. For example:
ALTER TABLE t MODIFY latin1_text_col TEXT CHARACTER SET utf8;
ALTER TABLE t MODIFY latin1_varchar_col VARCHAR(M) CHARACTER SET utf8;
If you specify CONVERT TO CHARACTER SET binary, the CHAR, VARCHAR, and TEXT columns are converted to their corresponding binary string types (BINARY, VARBINARY, BLOB). This means that the columns no longer will have a character set attribute and a subsequent CONVERT TO operation will not apply to them.
If charset_name is DEFAULT in a CONVERT TO CHARACTER SET operation, the character set named by the character_set_database system variable is used.
 Warning
The CONVERT TO operation converts column values between the original and named character sets. This is not what you want if you have a column in one character set (like latin1) but the stored values actually use some other, incompatible character set (like utf8). In this case, you have to do the following for each such column:
ALTER TABLE t1 CHANGE c1 c1 BLOB;
ALTER TABLE t1 CHANGE c1 c1 TEXT CHARACTER SET utf8;
The reason this works is that there is no conversion when you convert to or from BLOB columns.
To change only the default character set for a table, use this statement:
ALTER TABLE tbl_name DEFAULT CHARACTER SET charset_name;
The word DEFAULT is optional. The default character set is the character set that is used if you do not specify the character set for columns that you add to a table later (for example, with ALTER TABLE ... ADD column).
When the foreign_key_checks system variable is enabled, which is the default setting, character set conversion is not permitted on tables that include a character string column used in a foreign key constraint. The workaround is to disable foreign_key_checks before performing the character set conversion. You must perform the conversion on both tables involved in the foreign key constraint before re-enabling foreign_key_checks. If you re-enable foreign_key_checks after converting only one of the tables, an ON DELETE CASCADE or ON UPDATE CASCADE operation could corrupt data in the referencing table due to implicit conversion that occurs during these operations (Bug #45290, Bug #74816).
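The manual's distinction between CONVERT TO (which transcodes the bytes) and the BLOB round-trip (which relabels them untouched) can be sketched in Python, for illustration only:

```python
# Suppose the column is declared latin1 but the stored bytes are
# really UTF-8 (C3 B1 for ñ) -- the "incompatible" case in the Warning.
stored = "ñ".encode("utf-8")

# CONVERT TO would decode as latin1 and re-encode as utf8, which
# double-encodes the data (Mojibake):
double_encoded = stored.decode("latin1").encode("utf-8")
assert double_encoded == b"\xc3\x83\xc2\xb1"   # Ã± in UTF-8, not ñ

# The BLOB round-trip keeps the bytes and merely relabels them as
# utf8, which is correct here:
assert stored.decode("utf-8") == "ñ"
```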

Are there any downside effects when changing mysql table encoding?

I have a table encoded in latin1 with collation latin1_bin.
In my table there is a column comments of type TEXT; as you know, this column inherits the table's encoding and collation. But now I need to change it to utf8 with utf8_general_ci, because I'm starting to store special characters in comments.
Would it cause any downside effect if I'd use a command like the following?
alter table notebooks modify comments text CHARACTER SET utf8 COLLATE utf8_general_ci;
Thank you for your answer.
Danger: I think that ALTER will destroy the existing text.
Also: your name looks Chinese, so I would guess that you want to store Chinese characters. In that case you should use utf8mb4, not just utf8, because some Chinese characters take 4 bytes (they are not in the Unicode BMP).
I believe you need 2 steps:
ALTER TABLE notebooks MODIFY comments BLOB;
ALTER TABLE notebooks MODIFY comments TEXT
CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_520_ci;
Otherwise the latin1 characters would be "converted" to utf8; but if you really have Chinese in the column, you do not have latin1. The 2-step ALTER above (1) turns off any knowledge of character set, then (2) declares that the bytes are really utf8mb4-encoded.
To be safer, first do
RENAME TABLE notebooks TO old;
CREATE TABLE notebooks LIKE old;
INSERT INTO notebooks SELECT * FROM old;
Then do the two ALTERs and test the result. If there is trouble, you can RENAME to get back the old copy.
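Why utf8mb4 specifically: a short Python illustration (the characters are chosen here only as examples):

```python
# Some Chinese characters live outside the BMP and need 4 UTF-8 bytes,
# which MySQL's 3-byte "utf8" (utf8mb3) cannot store. U+2070E (𠜎) is
# one such character:
rare = "\U0002070E"
assert len(rare.encode("utf-8")) == 4

# A common BMP character such as 中 (U+4E2D) fits in 3 bytes:
assert len("中".encode("utf-8")) == 3
```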
Specifying any collation that does not amount to direct integer comparison of a native character set will slow down your queries. Whether it slows them down noticeably is another issue: looking up the rankings of two characters in a table and comparing the results is still much, much faster than retrieving on-disk data from the database, wouldn't you imagine?

Is it safe to convert the charset of a TEXT column from utf8 to utf8mb4 in mysql?

I am trying to convert the charset of a TEXT column in a huge production database from utf8 to utf8mb4 to support emojis.
I have read that for VARCHAR columns we need to calculate and provide a different size in the ALTER command, but I couldn't find anything about TEXT columns.
TEXT columns are stored off the table, so can I go ahead with the ALTER command, or is there anything else to be considered?
Not a problem.
The "different size" may refer to changing VARCHAR(255) to VARCHAR(191) to fit within the 767 byte limit for indexes. That is not relevant for TEXT.
How were you planning on doing the conversion? I think (but have not tested) that this will work:
ALTER TABLE tbl CONVERT TO CHARACTER SET utf8mb4;