Are there any downside effects when changing mysql table encoding?

I have a table encoded in latin1 and collate latin1_bin.
In my table there is a column comments of type TEXT; as you know, this column inherits the table's encoding and collation. But now I need to change it to utf8 and utf8_general_ci, because I'm starting to store special characters in comments.
Would it cause any downside effects if I used a command like the following?
alter table notebooks modify comments text CHARACTER SET utf8 COLLATE utf8_general_ci;
Thank you for your answer.

Danger: I think that ALTER will destroy the existing text.
Also, ... Your 'name' looks Chinese, so I would guess that you want to store Chinese characters? In that case, you should use utf8mb4, not just utf8. This is because some of the Chinese characters take 4 bytes (and are not in the Unicode BMP).
I believe you need 2 steps:
ALTER TABLE notebooks MODIFY comments BLOB;
ALTER TABLE notebooks MODIFY comments TEXT
CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_520_ci;
Otherwise the latin1 characters would be "converted" to utf8. But if you really have Chinese in the column, you do not have latin1. The 2-step ALTER above (1) turns off any knowledge of character set, and (2) establishes that the bytes are really utf8mb4-encoded.
To be safer, first do
RENAME TABLE notebooks TO old;
CREATE TABLE notebooks LIKE old;
INSERT INTO notebooks SELECT * FROM old;
Then do the two ALTERs and test the result. If there is trouble, you can RENAME to get back the old copy.
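For example, a sketch of that recovery step (assuming the table names used above; 'broken' is just a placeholder name):
-- put the suspect copy aside and restore the original, in one atomic statement
RENAME TABLE notebooks TO broken, old TO notebooks;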

Specifying any collation that does not involve direct integral comparison of a native character set will slow down your query. Whether it slows it down noticeably is another issue. But keep it in perspective: looking up the rank of one character and the rank of another in a table and comparing the results is still much, MUCH faster than retrieving on-disk data from the database.
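You can see a collation's effect on comparisons directly (a sketch; assumes the connection charset is utf8mb4):
SELECT 'a' = 'A' COLLATE utf8mb4_bin;        -- 0: direct code-point comparison
SELECT 'a' = 'A' COLLATE utf8mb4_general_ci; -- 1: uses the collation's ranking rules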

Related

Incorrect string value error for unconventional characters

So I'm using a wrapper to fetch user data from Instagram. I want to select the display names of users and store them in a MySQL database. I'm having issues inserting some of the display names; specifically, I'm getting an incorrect string value error:
Now, I've dealt with this issue before with accent marks, letters with umlauts, etc. The solution would be to change the collation to utf8_general_ci under the utf8 charset.
So as you can see, some of the display names I'm pulling have very unusual characters that I'm not sure MySQL can recognize at all, e.g.:
แ›˜๐•ฐ๐–†๐–—๐–™๐– ๐•พ๐–•๐–Ž๐–—๐–Ž๐–™๐–š๐–˜๐‚‚ยฎ
So I receive:
Error Code: 1366. Incorrect string value: '\xF0\x9D\x99\x87\xF0\x9D...' for column 'dummy' at row 1
Here's my SQL code:
CREATE TABLE test_table(
id INT AUTO_INCREMENT,
dummy VARCHAR(255),
PRIMARY KEY(id)
);
INSERT INTO test_table (dummy)
VALUES ('ᛘ𝕰𝖆𝖗𝖙𝖍 𝕾𝖕𝖎𝖗𝖎𝖙𝖚𝖘𝂂®');
Any thoughts on a proper charset + collation pair that can handle characters like this? Not sure where to look for a solution, so I came here to see if anyone has dealt with this.
P.S., I've tried utf8mb4 charset with utf8mb4_unicode_ci and utf8mb4_bin collations as well.
The characters you show require the column use the utf8mb4 encoding. Currently it seems your column is defined with the utf8mb3 encoding.
The way MySQL uses the name "utf8" is complicated, as described in https://dev.mysql.com/doc/refman/8.0/en/charset-unicode-utf8mb3.html:
Note
Historically, MySQL has used utf8 as an alias for utf8mb3;
beginning with MySQL 8.0.28, utf8mb3 is used exclusively in the output
of SHOW statements and in Information Schema tables when this
character set is meant.
At some point in the future utf8 is expected to become a reference to
utf8mb4. To avoid ambiguity about the meaning of utf8, consider
specifying utf8mb4 explicitly for character set references instead of
utf8.
You should also be aware that the utf8mb3 character set is deprecated
and you should expect it to be removed in a future MySQL release.
Please use utf8mb4 instead.
You may have tried to change your table in the following way:
ALTER TABLE test_table CHARSET=utf8mb4;
But that only changes the default character set, to be used if you add new columns to the table subsequently. It does not change any of the current columns. To do that:
ALTER TABLE test_table MODIFY COLUMN dummy VARCHAR(255) CHARACTER SET utf8mb4;
Or to convert all string or TEXT columns in a table in one statement:
ALTER TABLE test_table CONVERT TO CHARACTER SET utf8mb4;
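Either way, you can verify what each column ended up with (a quick check, using the example table above):
SHOW CREATE TABLE test_table;
-- or:
SELECT COLUMN_NAME, CHARACTER_SET_NAME, COLLATION_NAME
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = DATABASE() AND TABLE_NAME = 'test_table';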
That would be 𝙇 - MATHEMATICAL SANS-SERIF BOLD ITALIC CAPITAL L.
It requires the utf8mb4 Character set to even represent it. "F0" is the clue; it is the first of 4 bytes in a 4-byte UTF-8 character. It cannot be represented in MySQL's "utf8". Collation is (mostly) irrelevant.
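You can confirm this from any utf8mb4 connection (a sketch):
SELECT HEX('𝙇'), LENGTH('𝙇'), CHAR_LENGTH('𝙇');
-- F09D9987: 4 bytes (LENGTH) but 1 character (CHAR_LENGTH); the leading F0 marks a 4-byte sequence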
Most, not all, of the characters in ᛘ𝕰𝖆𝖗𝖙𝖍 𝕾𝖕𝖎𝖗𝖎𝖙𝖚𝖘𝂂® also need utf8mb4. They are "MATHEMATICAL BOLD FRAKTUR" letters.
(Meanwhile, Bill gives you more of an answer.)

MySQL: data being mangled while changing column to UTF8

I am migrating a MySQL database that was created in LATIN1 to UTF8. To do that, I am first changing each column to the corresponding binary type and then to UTF8:
ALTER TABLE clientes CHARACTER SET utf8;
ALTER TABLE clientes change nombre nombre varbinary(255);
ALTER TABLE clientes change nombre nombre varchar(255) character set utf8;
Since, according to all docs, that's the right way to prevent data from being mangled...
...However, the data still gets mangled. I'll just put two examples:
The word "Larrasoaรฑa" gets truncated at the "รฑ", with the error: Warning: #1366 Incorrect string value: '\xF1a' for column 'nombre'
The word "Jesรบs y Marรญa" gets truncated at the "รบ", with the error Warning: #1366 Incorrect string value: '\xFAs y M...' for column 'nombre'
How did that data get in? Well, the DB is the backend to a PHP web app, which used UTF8 for everything (including connecting to the MySQL server with "SET NAMES UTF8")... except creating the database properly. So I assume all the data that was added in was in UTF8.
To summarize: it seems that I had UTF8 text stored in LATIN1 columns, and now that I try to change the columns to UTF8, the text gets truncated.
Why is this happening? What can I do?
EDIT: forgot to mention, I'm doing all of this from PhpMyAdmin, since I don't have command line access.
F1 and FA are latin1 encodings. You need to tell MySQL that the data is latin1. One way is via SET NAMES latin1.
But note... That is independent of the setting for the column you are trying to store the data into. And, these days, utf8mb4 is the preferred setting for text. MySQL will convert between the column's encoding and the client's encoding. But you must tell it the client's encoding via connection parameters (or SET NAMES).
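A minimal sketch of what the client should send right after connecting (or set the equivalent connection parameter):
SET NAMES utf8mb4; -- declares the encoding this client sends and expects back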
The pair of ALTER TABLEs works for certain situations, not all situations! You probably wanted the first entry in http://mysql.rjweb.org/doc.php/charcoll#fixes_for_various_cases
Table is CHARACTER SET latin1 and correctly encoded in latin1; want
utf8mb4:
ALTER TABLE tbl CONVERT TO CHARACTER SET utf8mb4;
I don't happen to know if your data is irreparably hosed. Please provide one of the lines, together with HEX.
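Something like this shows the stored bytes (a sketch, using the table and column from your statements):
SELECT nombre, HEX(nombre) FROM clientes LIMIT 10;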
Hex
"Larrasoaรฑa" is encoded as 4C61727261736F61F161, and "Jesรบs y Marรญa" as 4A6573FA732079204D6172ED6120
Those are latin1-encoded (or latin5 or dec8). If the table definition (SHOW CREATE TABLE) says latin1, then you could leave it alone. (latin1 handles Western European languages, but not Asian.)
If you want to convert all the text columns to utf8 or utf8mb4, do an ALTER like the one I presented above. Your 3-Alter approach will not work correctly; it assumes the bytes in the latin1 column are really UTF-8 bytes (which they aren't).
But... You must specify the client's encoding based on what the client wants. And it does not matter whether the client and the table agree since conversion will be provided.
Why the 3-step Alter fails
ALTER TABLE clientes CHARACTER SET utf8; -- This sets the default charset for new columns. It has no effect on the existing column definitions or the data in those columns.
ALTER TABLE clientes change nombre nombre varbinary(255); -- This says "forget about any text encoding". That is, F1 is now just a bunch of bits, not the latin1 representation of ñ.
ALTER TABLE clientes change nombre nombre varchar(255) character set utf8; -- This takes those varbinary bits and says "let's treat them as utf8". And that gives the error message, because F1 is not a valid utf8 encoding.
That procedure is appropriate if the bytes are already utf8 bytes; that is, if the column already held the 2-byte C3B1 for ñ. (By the way, this usually manifests itself as 'Mojibake', displaying as Ã± when interpreted as latin1.)
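You can reproduce that Mojibake from the client (a sketch; assumes a utf8mb4 connection):
SELECT CONVERT(BINARY CONVERT('ñ' USING utf8mb4) USING latin1);
-- the utf8 bytes C3B1, reinterpreted as latin1, display as 'Ã±'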
The 1-Alter procedure...
ALTER TABLE clientes CONVERT TO CHARACTER SET utf8; (to convert the entire table) or ALTER TABLE clientes MODIFY nombre varchar(255) character set utf8; (to convert just one column). Either one does the following:
For each text (char/varchar/text) column, it reads the data according to its current encoding (latin1, F1), converts it to utf8 (or utf8mb4) (C3B1) and writes back into the row. Meanwhile, it has changed the declaration to be CHARACTER SET utf8.
That is, it is the 'right' process for changing the CHARACTER SET without changing the "text". True, the encoding changed (F1 -> C3B1), but that is in keeping with the change to the CHARACTER SET.
Recovery
Your first 2 ALTERs worked, correct? Did the 3rd one succeed, fail, or leave a messed up table?
If it aborted, leaving varbinary in place, then do 2 more alters: First go back to latin1; then go straight to utf8.
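A sketch of those two ALTERs (assuming nombre was left as varbinary(255)):
ALTER TABLE clientes MODIFY nombre VARCHAR(255) CHARACTER SET latin1; -- reinterpret the bytes as latin1
ALTER TABLE clientes MODIFY nombre VARCHAR(255) CHARACTER SET utf8;   -- now a real latin1 -> utf8 conversion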
If it left you with a messed up column, especially if rows are truncated, then you need to go back to a backup, or otherwise reload the data.

Utf8 and latin1

Half of the tables in a database which has a lot of data are set to Latin1.
Within these Latin1 tables, most rows are set to utf8 if the row is expecting text input (anything not an integer).
Everything is in English.
How bad is my situation if I need to convert these Latin1 tables to utf8?
First, a clarification: You said "most rows are set to utf8"; I assume you meant "most columns are set to utf8"?
The meaning of the latin1 on the table is just a default. It has no impact on performance, etc.
The only "harm" occurs if you do ALTER TABLE .. ADD COLUMN .. without specifying CHARACTER SET utf8.
You say that all the text is English? Then there is no encoding difference between latin1 and utf8. Problems can occur when you have accented letters, etc.
There is one performance issue: If you JOIN two tables together on a VARCHAR column, but the CHARACTER SET or COLLATION differs for that column in the two tables, it will be slower than if those settings are the same. (It does not sound like you have this problem.) Again, note that the table's default is not relevant, only the column settings themselves.
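For example (hypothetical tables t1 and t2; the point is the mismatched column charsets):
-- if t1.name is latin1 and t2.name is utf8mb4, one side must be converted
-- on the fly, which can prevent index use on that side:
SELECT t1.id FROM t1 JOIN t2 ON t1.name = t2.name;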
Yes, it would be 'cleaner' to have the table default set to utf8. Here is a way to do it without dumping, changing the tables one at a time:
ALTER TABLE t CONVERT TO CHARACTER SET utf8;
That will change the table default and any columns that are not already utf8.
Another approach is to fix everything in one pass with a dump-and-reload, rewriting the charset declarations and re-encoding the text:
mysqldump --add-drop-table database_to_correct | replace CHARSET=latin1 CHARSET=utf8 | iconv -f latin1 -t utf8 | mysql database_to_correct

Is it safe to convert the charset of a TEXT column from utf8 to utf8mb4 in mysql?

I am trying to convert the charset of a TEXT column in a huge production database from utf8 to utf8mb4 to support emojis.
I have read that for varchar columns we need to calculate and provide a different size in the alter command. But I couldn't find anything about TEXT columns.
TEXT columns are stored off the table, so can I go ahead with the ALTER command, or is there anything else to be considered?
Not a problem.
The "different size" may refer to changing VARCHAR(255) to VARCHAR(191) to fit within the 767 byte limit for indexes. That is not relevant for TEXT.
How were you planning on doing the conversion? I think (but have not tested) that this will work:
ALTER TABLE tbl CONVERT TO CHARACTER SET utf8mb4;

Is it possible to enable emojis for specific columns of specific tables?

First, I would like to assure you that I have done my "homework" and have read this, this, this and this. Also, one of my former questions is closely related to this one, but there I was dealing with flourishlib's compatibility issues with utf8mb4; this question goes a level deeper.
Let's suppose that I have several tables and I want to modify just a few columns to have utf8mb4 encoding, to preserve some storage space and performance after the change. If I changed the whole database to the utf8mb4 encoding, its size would increase by 33%, which would hurt its performance as well. So we have selected four columns from three different tables to support emojis. These are:
users.bio (tinytext, utf8_general_ci)
questions.question (longtext, utf8_general_ci)
questions.answer (longtext, utf8_general_ci)
comments.comment (tinytext, utf8_general_ci)
As a result, my action plan is as follows:
Create a backup of the database
Run these commands:
alter table comments change comment comment tinytext character set utf8mb4 collate utf8mb4_unicode_ci;
alter table users change bio bio tinytext character set utf8mb4 collate utf8mb4_unicode_ci;
alter table questions change question question longtext character set utf8mb4 collate utf8mb4_unicode_ci;
alter table questions change answer answer longtext character set utf8mb4 collate utf8mb4_unicode_ci;
Expectations:
this should make the specified columns use utf8mb4 instead of utf8
existent data will be correctly converted to utf8mb4, that is, previous texts will be preserved and the users will be able to correctly read their content
other columns will not be changed
queries involving the affected tables will be slower
Are my expectations accurate? Do I need to change the connection? Thanks
You need utf8mb4 in any column that is storing emoji (or Chinese, etc.).
In VARCHAR(...) utf8mb4, each "character" takes 1-4 bytes. No 33% increase.
On the other hand, CHAR(10) utf8mb4 is always allocated 40 bytes.
You do need to establish that your client is talking utf8mb4, not just utf8. That comes in some parameter in the connection or with SET NAMES utf8mb4.
If you need to automate the ALTERs, it is pretty easy to generate them via a SELECT against information_schema.
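A sketch that generates one statement per table in the current schema (review the output before running it):
SELECT CONCAT('ALTER TABLE `', TABLE_NAME, '` CONVERT TO CHARACTER SET utf8mb4;')
FROM information_schema.TABLES
WHERE TABLE_SCHEMA = DATABASE() AND TABLE_TYPE = 'BASE TABLE';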
Addenda
Expectations 1-3: Yes.
Expectation 4 (queries involving the affected tables will be slower) -- processing will be essentially the same speed.