I have a table like this, where one column is latin1, the other is UTF-8:
Create Table: CREATE TABLE `names` (
`name_english` varchar(255) character set latin1 NOT NULL,
`name_chinese` varchar(255) character set utf8 default NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1
When I do an insert, I have to type _utf8 before values being inserted into UTF-8 columns:
insert into names (name_english, name_chinese) values ('hooey', _utf8 '鬼佬');
However, since MySQL already knows that name_chinese is a UTF-8 column, it should be able to apply _utf8 automatically.
Is there any way to tell MySQL to use _utf8 automatically, so that when I'm programmatically building prepared statements, I don't have to worry about including it with the right parameters?
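One way to make the introducer unnecessary, sketched minimally here (assuming the client really does send UTF-8 bytes): declare the connection character set once, and MySQL converts each literal to the target column's charset on its own.
-- declare the client's encoding once per connection
SET NAMES utf8;
-- no _utf8 introducer needed; the literal is taken as utf8 and
-- converted to each column's character set on insert
insert into names (name_english, name_chinese) values ('hooey', '鬼佬');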
Why not use UTF-8 for the whole table?
When trying to insert a unicode emoji character (😎) into a MySQL table, the insert fails with the error:
Incorrect string value: '\\xF0\\x9F\\x98\\x8E\\xF0\\x9F...' for column 'Title' at row 1
From what I've read about this issue, it's apparently caused by the table's default character set, and possibly the column's character set, being set incorrectly. This post suggests using utf8mb4, which I've tried, but the insert still fails.
Here's my table configuration:
CREATE TABLE `TestTable` (
`Id` int(11) NOT NULL AUTO_INCREMENT,
`InsertDate` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP,
`Title` text,
`Description` text,
`Info` varchar(250) CHARACTER SET utf8 DEFAULT NULL,
PRIMARY KEY (`Id`),
KEY `xId_TestTablePK` (`Id`)
) ENGINE=InnoDB AUTO_INCREMENT=2191 DEFAULT CHARSET=utf8mb4;
Note that the Title and Description columns don't have an explicitly stated character set. Initially the table had no default character set, and these two columns were set up with CHARACTER SET utf8mb4. However, when I altered the table's default charset to the same value, the column-level declarations were removed (presumably because the columns now inherit the charset from the table?)
Can anyone please help me understand how I can store these unicode values in my table?
It's worth noting that I'm on Windows, trying to perform this insert in MySQL Workbench. I have also tried using C# to insert into the database, specifying the character set with CHARSET=utf8mb4; however, this returned the same error.
EDIT
To try and insert this data, I am executing the following:
INSERT INTO TestTable (Title) SELECT '😎😎';
Edit
Not sure if this is relevant or not, but my database is also set up with the same default character set:
CREATE DATABASE `TestDB` /*!40100 DEFAULT CHARACTER SET utf8mb4 */;
The connection needs to establish that the client is talking utf8mb4, not just utf8. This involves either changing the parameters used at connection time, or executing SET NAMES utf8mb4 just after connecting.
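A minimal sketch of the fix, assuming the query is run in a fresh Workbench session against the TestTable above:
-- tell the server that this client sends and expects utf8mb4
SET NAMES utf8mb4;
-- the 4-byte emoji now fits the utf8mb4 Title column
INSERT INTO TestTable (Title) SELECT '😎😎';
-- verify the stored bytes: expect F09F988E twice
SELECT Title, HEX(Title) FROM TestTable ORDER BY Id DESC LIMIT 1;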
I have a MySQL Database (myDB; ~2GB in size) with 4 Tables (tab1, tab2, tab3, tab4). Currently, the data that is stored in the tables was added using the charset ISO-8859-1 (i.e. Latin-1).
I'd like to convert the data in all tables to UTF-8 and use UTF-8 as default charset of the tables/database/columns.
On https://blogs.harvard.edu/djcp/2010/01/convert-mysql-database-from-latin1-to-utf8-the-right-way/ I found an interesting approach:
mysqldump myDB | sed 's/CHARSET=latin1/CHARSET=utf8/g' | iconv -f latin1 -t utf8 | mysql myDB2
I haven't tried it yet, but are there any caveats?
Is there a way to do it directly in the MySQL shell?
[EDIT:]
Result of SHOW CREATE TABLE messages; after running ALTER TABLE messages CONVERT TO CHARACTER SET utf8mb4;
CREATE TABLE `messages` (
`number` int(11) NOT NULL AUTO_INCREMENT,
`status` enum('0','1','2') NOT NULL DEFAULT '1',
`user` varchar(30) NOT NULL DEFAULT '',
`comment` varchar(250) NOT NULL DEFAULT '',
`text` mediumtext NOT NULL,
`date` int(11) NOT NULL DEFAULT '0',
PRIMARY KEY (`number`),
KEY `index_user_status_date` (`user`,`status`,`date`)
) ENGINE=InnoDB AUTO_INCREMENT=3285217 DEFAULT CHARSET=utf8mb4
It is possible to convert the tables. But then you need to convert the application, too.
ALTER TABLE tab1 CONVERT TO CHARACTER SET utf8mb4;
etc.
To check, run SHOW CREATE TABLE tab1; it should show DEFAULT CHARSET=utf8mb4.
Note: There are 3 things going on:
Convert the encoding of the data in any VARCHAR and TEXT columns.
Change the CHARACTER SET for such columns.
Change the DEFAULT CHARACTER SET for the table -- this comes into play only if you later add new columns without specifying a charset (a sketch contrasting this with CONVERT TO follows below).
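To make point 3 concrete, a minimal sketch using a hypothetical table t1; the first statement converts existing data and columns, the second changes only the default used for future columns:
-- converts stored data AND the charset of every existing text column
ALTER TABLE t1 CONVERT TO CHARACTER SET utf8mb4;
-- touches no existing column; only sets the default charset
-- applied to columns added later
ALTER TABLE t1 DEFAULT CHARACTER SET utf8mb4;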
The application...
When you connect from a client to MySQL, you need to tell it, in an app-specific way or via SET NAMES, the encoding of the bytes on the client. This does not have to be the same as the column declarations; conversion will occur during INSERT and SELECT, if necessary.
I recommend you take a backup and/or test a copy of one of the tables. Be sure to go all the way through -- insert, select, display, etc.
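A minimal sketch of such a dry run, using the messages table from the question (messages_copy is just a scratch name chosen here):
-- clone structure and data of one table
CREATE TABLE messages_copy LIKE messages;
INSERT INTO messages_copy SELECT * FROM messages;
-- convert only the copy
ALTER TABLE messages_copy CONVERT TO CHARACTER SET utf8mb4;
-- inspect, then exercise insert/select/display from the application
SHOW CREATE TABLE messages_copy;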
After reading articles about how to properly store data inside a MySQL database, I sort of have a good understanding of how character sets work. Yet there are still some cases that I find hard to understand. Let me give a quick example: say I have a table with a varchar column using the latin1 charset (for example purposes):
CREATE TABLE `foobar` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`foo` varchar(100) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=latin1;
and I force-insert a latin1 character using its hex value (á, for instance, which is E1 in latin1):
SET NAMES 'latin1';
INSERT INTO foobar SET foo = UNHEX('E1')
Why does a SELECT query against this table under a latin1 connection fail to give me the value I am expecting? I get the replacement character instead. I would have expected it to just give back the value as-is, since the column is latin1, the connection itself is latin1, and the stored value is a valid latin1 character. Is MySQL trying to interpret E1 as a UTF-8 value? Or am I missing something here?
Using a UTF-8 connection (SET NAMES 'utf8') before selecting gives me the expected value of á, which I guess is correct since, from what I understand, if the connection charset differs from the charset of the column being selected, MySQL reads the data as whatever the column charset is (so E1 gets parsed as a latin1 character, i.e. á).
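For what it's worth, a small sketch that makes the conversion step visible, using the foobar table above:
-- the stored byte is E1 no matter what the connection says
SELECT HEX(foo) FROM foobar;
-- what travels back to the client depends on the connection charset
SET NAMES latin1;   -- client receives the single byte E1
SELECT foo FROM foobar;
SET NAMES utf8;     -- client receives C3 A1, the UTF-8 form of á
SELECT foo FROM foobar;
If the tool displaying the result renders UTF-8, the lone E1 byte coming back over the latin1 connection is not valid UTF-8, which would explain the replacement character.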
I am using Navicat 11.0.8 and I am trying to insert values into a table using a query, but when I insert the values with the query, it works, except the character encoding is messed up!
As you can see in my code, the table is 'table' and I am inserting an ID, a VNUM and a NAME. The VNUM is '体字' and the NAME is 'Versão'.
INSERT INTO table VALUES ('1', '体字', 'Versão');
Instead of showing '体字' for the VNUM and 'Versão' for the NAME, it shows 'ä½"å—' and 'VersÃ£o'.
This is very bad for me because I am trying to insert more than 5000 rows with a lot of information.
I have tried to set the character encoding of the table using these commands:
ALTER TABLE table CONVERT TO CHARACTER SET utf8 COLLATE utf8_unicode_ci;
&
ALTER TABLE table CONVERT TO CHARACTER SET big5 COLLATE big5_chinese_ci;
&
ALTER TABLE table CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
I have also tried deleting my table and creating a new one with the character encoding already set to UTF-8, then sending the values:
SET FOREIGN_KEY_CHECKS=0;
DROP TABLE IF EXISTS `table`;
CREATE TABLE `table` (
`vnum` int(11) unsigned NOT NULL default '0',
`name` varbinary(200) NOT NULL default 'Noname',
`locale_name` varbinary(24) NOT NULL default 'Noname'
) ENGINE=MyISAM DEFAULT CHARSET=big5 COLLATE=big5_chinese_ci;
INSERT INTO `table` VALUES ('1', '体字', 'Versão');
It still shows 'ä½"å—' and 'VersÃ£o'.
If I edit the table manually, it shows correctly! But I am not going to edit 5000+ rows...
it shows 'ä½"å—' and 'VersÃ£o'. -- Sounds like you had SET NAMES latin1. Do
SET NAMES utf8;
after connecting and before INSERTing. That tells mysqld what encoding the client is using.
Verify the data stored by doing SELECT col, HEX(col) .... The utf8 (or utf8mb4) hex for 体字 is E4BD93E5AD97. Interpreting those same bytes as latin1 gives you ä½"å—.
utf8 can handle those, plus ã; I don't know if big5 can.
Actually, I suggest you use utf8mb4 instead of utf8. This is in case you encounter some of the 4-byte Chinese characters.
If you still have latin1 columns that need changing to utf8mb4, see my blog, which discusses the "2-step ALTER", but using BINARY or BLOB (not big5) as the intermediate.
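For reference, a minimal sketch of that 2-step ALTER, using a hypothetical table t with a latin1 column col VARCHAR(200) that actually holds UTF-8 bytes:
-- step 1: to a binary type; the bytes are kept verbatim and the
-- (wrong) latin1 label is dropped, so no conversion happens
ALTER TABLE t MODIFY col VARBINARY(200) NOT NULL;
-- step 2: relabel the same bytes as utf8mb4; again no conversion
ALTER TABLE t MODIFY col VARCHAR(200) CHARACTER SET utf8mb4 NOT NULL;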
I have a table with a field a using encoding utf8 and collation utf8_unicode_ci:
CREATE TABLE dictionary (
a varchar(128) NOT NULL
) DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
The collation utf8_unicode_ci is required for an efficient case-insensitive search with extensions and ligations. For this purpose, I have the index:
CREATE INDEX a_idx on dictionary(a);
Problem: additionally, I must ensure that all stored values of the field a are unique, but in a case-sensitive way.
German example: "blühen" and "Blühen" must both be stored in the table. But adding "Blühen" a second time should not be possible.
Is there a built-in functionality in MySQL to have both?
Unfortunately, it seems it is not possible to set the collation for the index in MySQL 5.1.
Solutions to this problem include a uniqueness check before insert or a trigger. Both are far less elegant than using a unique index.
Well, there are 2 ways to accomplish this:
using _bin collation
change your datatype to VARBINARY
Case 1: using _bin collation
Create your table as follows:
CREATE TABLE `dictionary` (
`a` VARCHAR(128) CHARACTER SET utf8 COLLATE utf8_bin NOT NULL,
UNIQUE KEY `idx_un_a` (`a`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
Please note:
the datatype of the column a
the UNIQUE index on column a
Case 2: using VARBINARY datatype
Create your table as follows:
CREATE TABLE `dictionary` (
`a` VARBINARY(128) NOT NULL,
UNIQUE KEY `idx_uniq_a` (`a`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
Please note:
the new datatype VARBINARY
the UNIQUE index on column a
So, both of the above will serve your purpose: they will allow values like 'abc', 'Abc', 'ABC', 'aBc', etc., but will not allow the same value again when the case matches. A short demonstration follows.
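A sketch against the Case 1 table, using the German example from the question (the exact error text may vary by MySQL version):
INSERT INTO dictionary (a) VALUES ('blühen');   -- ok
INSERT INTO dictionary (a) VALUES ('Blühen');   -- ok, case differs
INSERT INTO dictionary (a) VALUES ('Blühen');   -- rejected:
-- ERROR 1062 (23000): Duplicate entry 'Blühen' for key 'idx_un_a'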
Please note that giving a "_bin" collation is different from using the binary datatype. So please feel free to refer to the following links:
The BINARY and VARBINARY datatypes
The _bin and binary Collations
I hope the above helps!
You can achieve this by adding an additional column, a_lower.
CREATE TABLE `dictionary` (
`a` VARCHAR(128) NOT NULL,
`a_lower` VARCHAR(128) NOT NULL,
UNIQUE KEY `idx_un_a_lower` (`a_lower`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin;
The insert then goes like this:
insert into dictionary set a = x, a_lower = lower(x);
Select can now be case-insensitive:
select * from dictionary where a_lower like lower('search_term%')
Note that a column which has an index on it can store at most 191 characters. MySQL allows an index of at most 767 bytes, i.e. 767 / 4 (a character can take up to 4 bytes in the utf8mb4 character set) = 191.75, rounded down to 191 characters. With the utf8 character set, which takes at most 3 bytes per character, the column can store at most 767 / 3 = 255 characters.
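If the column must stay longer than 191 characters, one possible workaround (only a sketch, with illustrative names) is a prefix index; note the caveat that uniqueness is then enforced solely on the indexed prefix:
CREATE TABLE `dictionary_long` (
`a` VARCHAR(255) NOT NULL,
`a_lower` VARCHAR(255) NOT NULL,
-- unique on the first 191 characters of a_lower only
UNIQUE KEY `idx_un_a_lower` (`a_lower`(191))
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_bin;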
SELECT * FROM dictionary WHERE a COLLATE utf8_general_ci = 'abc'
Try this; it worked for me.