MySQL - Storing unicode characters (emoji) in TEXT column

When trying to insert a unicode emoji character (😎) into a MySQL table, the insert fails with the error:
Incorrect string value: '\xF0\x9F\x98\x8E\xF0\x9F...' for column 'Title' at row 1
From what I've read about this issue, it's apparently caused by the table's default character set, and possibly the column's character set, being set incorrectly. This post suggests using utf8mb4, which I've tried, but the insert still fails.
Here's my table configuration:
CREATE TABLE `TestTable` (
`Id` int(11) NOT NULL AUTO_INCREMENT,
`InsertDate` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP,
`Title` text,
`Description` text,
`Info` varchar(250) CHARACTER SET utf8 DEFAULT NULL,
PRIMARY KEY (`Id`),
KEY `xId_TestTablePK` (`Id`)
) ENGINE=InnoDB AUTO_INCREMENT=2191 DEFAULT CHARSET=utf8mb4;
Note that the Title and Description columns don't have an explicitly stated character set. Initially the table had no default character set, and these two columns were set up with CHARACTER SET utf8mb4. However, when I altered the table's default charset to utf8mb4, the explicit column charsets were removed (presumably because the columns now inherit the character set from the table?)
Can anyone please help me understand how I can store these unicode values in my table?
It's worth noting that I'm on Windows, trying to perform this insert in MySQL Workbench. I have also tried using C# to insert into the database, specifying the character set with CHARSET=utf8mb4; however, this returned the same error.
EDIT
To try and insert this data, I am executing the following;
INSERT INTO TestTable (Title) SELECT '😎😎';
EDIT
Not sure if this is relevant or not, but my database is also set up with the same default character set:
CREATE DATABASE `TestDB` /*!40100 DEFAULT CHARACTER SET utf8mb4 */;

The connection needs to establish that the client is talking utf8mb4, not just utf8. This involves changing the parameters used at connection time, or executing SET NAMES utf8mb4 just after connecting.
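For example, something like this should work from a Workbench session against the TestTable above (same literal as in the question); the HEX() check at the end is just to confirm the stored bytes:
SET NAMES utf8mb4;
INSERT INTO TestTable (Title) SELECT '😎😎';
SELECT Id, Title, HEX(Title) FROM TestTable ORDER BY Id DESC LIMIT 1;
From C#, the equivalent is the character-set option on the connection string (the question's CHARSET=utf8mb4), which likewise must say utf8mb4 rather than utf8.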

Related

MySQL: Can't insert Russian characters into a column of type VARCHAR(45)

I have a problem when trying to execute this line in MySQL (Workbench):
INSERT INTO classification (`Type`, `Subtype`) VALUES ("тип", "подтип");
I have tried setting different charsets for the table classification: cp1251, utf-8, utf8mb4, cp1251_bin.
This is a table with all the charsets in my database that I have found; maybe it will help you:
UPD. I have found a solution. However, I had to change my table, so the table risk is now an edited version of the table classification. The result of SHOW CREATE TABLE risk is:
CREATE TABLE `risk` (
`IdRisk` int(11) NOT NULL AUTO_INCREMENT,
`IdSubtype` int(11) DEFAULT NULL,
`Content` varchar(4000) CHARACTER SET utf8 DEFAULT NULL,
PRIMARY KEY (`IdRisk`),
KEY `FK_subtype_risk_idx` (`IdSubtype`),
CONSTRAINT `FK_subtype_risk` FOREIGN KEY (`IdSubtype`) REFERENCES `subtype` (`IdSubtype`) ON DELETE SET NULL ON UPDATE CASCADE
) ENGINE=InnoDB AUTO_INCREMENT=48 DEFAULT CHARSET=latin1
I can't find the solution to this issue. I hope that someone knows a solution to it.
Thank you!
The CHARACTER SET for the table is the default for columns in the table. Please provide SHOW CREATE TABLE so we can verify what the columns are set to.
What is the encoding of the bytes in the client? cp1251 is different from utf8; utf8mb4 == utf8 for Russian.
In what way are things bad? Based on the symptom, see this for specific tips on what else might be set incorrectly.
Perhaps it was your change to NVARCHAR that forced CHARACTER SET utf8 on the columns?
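Pulling those suggestions together, a sketch of what to run (assuming the original classification table, and a client that is actually sending UTF-8 bytes):
SHOW CREATE TABLE classification;
-- convert the stored data and the per-column charsets in one step
ALTER TABLE classification CONVERT TO CHARACTER SET utf8mb4;
-- tell the server what encoding this client sends
SET NAMES utf8mb4;
INSERT INTO classification (`Type`, `Subtype`) VALUES ('тип', 'подтип');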

How to migrate MySQL database from Latin-1 to UTF-8?

I have a MySQL Database (myDB; ~2GB in size) with 4 Tables (tab1, tab2, tab3, tab4). Currently, the data that is stored in the tables was added using the charset ISO-8859-1 (i.e. Latin-1).
I'd like to convert the data in all tables to UTF-8 and use UTF-8 as default charset of the tables/database/columns.
On https://blogs.harvard.edu/djcp/2010/01/convert-mysql-database-from-latin1-to-utf8-the-right-way/ I found an interesting approach:
mysqldump myDB | sed -e 's/CHARSET=latin1/CHARSET=utf8/g' | iconv -f latin1 -t utf8 | mysql myDB2
I haven't tried it yet, but are there any caveats?
Is there a way to do it directly in the MySQL shell?
[EDIT:]
Result of SHOW CREATE TABLE messages; after running ALTER TABLE messages CONVERT TO CHARACTER SET utf8mb4;
CREATE TABLE `messages` (
`number` int(11) NOT NULL AUTO_INCREMENT,
`status` enum('0','1','2') NOT NULL DEFAULT '1',
`user` varchar(30) NOT NULL DEFAULT '',
`comment` varchar(250) NOT NULL DEFAULT '',
`text` mediumtext NOT NULL,
`date` int(11) NOT NULL DEFAULT '0',
PRIMARY KEY (`number`),
KEY `index_user_status_date` (`user`,`status`,`date`)
) ENGINE=InnoDB AUTO_INCREMENT=3285217 DEFAULT CHARSET=utf8mb4
It is possible to convert the tables. But then you need to convert the application, too.
ALTER TABLE tab1 CONVERT TO CHARACTER SET utf8mb4;
etc.
To check, do SHOW CREATE TABLE tab1; it should show you CHARACTER SET utf8mb4.
Note: There are 3 things going on:
Convert the encoding of the data in any VARCHAR and TEXT columns.
Change the CHARACTER SET for such columns.
Change the DEFAULT CHARACTER SET for the table -- this comes into play if you add any new columns without specifying a charset.
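To make the distinction concrete (using tab1 from above):
-- converts the stored data and the per-column character sets, and updates the table default
ALTER TABLE tab1 CONVERT TO CHARACTER SET utf8mb4;
-- changes only the table default, i.e. what future columns get when no charset is specified
ALTER TABLE tab1 DEFAULT CHARACTER SET utf8mb4;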
The application...
When you connect from a client to MySQL, you need to tell it, in an app-specific way or via SET NAMES, the encoding of the bytes in the client. This does not have to be the same as the column declarations; conversion will occur during INSERT and SELECT, if necessary.
I recommend you take a backup and/or test a copy of one of the tables. Be sure to go all the way through -- insert, select, display, etc.
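As a sketch of that test on a copy (messages_copy is just an illustrative name):
CREATE TABLE messages_copy LIKE messages;
INSERT INTO messages_copy SELECT * FROM messages LIMIT 10000;
ALTER TABLE messages_copy CONVERT TO CHARACTER SET utf8mb4;
SHOW CREATE TABLE messages_copy;
-- spot-check a few rows and their raw bytes after the conversion
SELECT `user`, `text`, HEX(`text`) FROM messages_copy LIMIT 5;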

Understanding character sets in columns and data and how they affect SELECT queries

After reading articles about how to properly store your data inside a MySQL database, I sort of have a good understanding of how character sets work. Yet, there are still some instances that I find hard to understand. Let me give a quick example: say I have a table with a varchar column using the latin1 charset (for example purposes)
CREATE TABLE `foobar` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`foo` varchar(100) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=latin1;
and I force-insert a latin1 character using its hex value (á for instance, which would be E1):
SET NAMES 'latin1';
INSERT INTO foobar SET foo = UNHEX('E1');
Why does a SELECT query against this table under a latin1 connection fail to give me the value I am expecting? I get the replacement character instead. I would have expected it to just give back the value as-is, since the column is latin1, the connection itself is latin1, and the value stored is a valid latin1 character. Is MySQL trying to interpret E1 as a UTF-8 value? Or am I missing something here?
Using a UTF-8 connection (SET NAMES 'utf8') before selecting gives me the expected value of á, which I guess is correct: from what I understand, if the connection charset is different from the column charset being selected, MySQL reads the data as whatever the column charset is (so E1 gets parsed as a latin1 character, i.e. á).
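One way to see what is actually stored, independent of any connection-charset conversion, is to compare the column with its raw bytes (a quick check against the foobar table above):
SET NAMES 'latin1';
SELECT foo, HEX(foo) FROM foobar;
-- HEX() returns the stored bytes as an ASCII hex string, so it is unaffected by the connection charset
SET NAMES 'utf8mb4';
SELECT foo, HEX(foo) FROM foobar;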

MySQL won't properly GROUP BY on emojis

I'm storing single emojis in a CHAR column in a MySQL database. The column's encoding is utf8mb4.
When I run this aggregate query, MySQL won't group by the emoji characters. It instead returns a single row with a single emoji and the count of all the rows in the database.
SELECT emoji, count(emoji) FROM emoji_counts GROUP BY emoji
Here's my table definition:
CREATE TABLE `emoji_counts` (
`id` int(11) unsigned NOT NULL AUTO_INCREMENT,
`emoji` char(1) DEFAULT '',
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
Is there some special Unicode behavior I'll have to account for?
Turns out I needed to specify an expanded collation in the query, namely utf8mb4_unicode_520_ci.
This worked:
SELECT emoji, COUNT(emoji) FROM emoji_counts GROUP BY emoji COLLATE utf8mb4_unicode_520_ci;
EDIT: That collation isn't available on some server configs (including ClearDB's)... utf8mb4_bin also appears to work.
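For reference, the binary-collation fallback mentioned in the edit is the same query with a different collation:
SELECT emoji, COUNT(emoji) FROM emoji_counts GROUP BY emoji COLLATE utf8mb4_bin;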

Can MySQL automatically specify `_utf8` for inserts to UTF-8 columns?

I have a table like this, where one column is latin1, the other is UTF-8:
Create Table: CREATE TABLE `names` (
`name_english` varchar(255) character set latin1 NOT NULL,
`name_chinese` varchar(255) character set utf8 default NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1
When I do an insert, I have to type _utf8 before values being inserted into UTF-8 columns:
insert into names set name_english = "hooey", name_chinese = _utf8 "鬼佬";
However, since MySQL should know that name_chinese is a UTF-8 column, it should be able to apply _utf8 automatically.
Is there any way to tell MySQL to use _utf8 automatically, so when I'm programmatically making prepared statements, I don't have to worry about including it with the right parameters?
Why not use UTF-8 for the whole table?
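Building on the earlier point that MySQL converts between the connection character set and each column's character set on INSERT and SELECT, the introducer shouldn't be needed once the connection itself is UTF-8; a sketch using the names table above:
SET NAMES utf8mb4;
insert into names set name_english = 'hooey', name_chinese = '鬼佬';
-- no _utf8 introducer: the literal is interpreted in the connection charset (utf8mb4) and converted to the column charset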