Why is the MySQL DESC command showing varchar instead of nvarchar? - mysql

I have created two columns in a table with the NATIONAL VARCHAR data type. But when I run the command
desc
the result shows the data type as varchar, not nvarchar. Why is that? Am I doing something wrong?
PENAME NATIONAL VARCHAR(255),
PNAME NATIONAL VARCHAR(255) NOT NULL,

MySQL does not have a separate nvarchar type alongside varchar (as SQL Server does).
Instead, you define the CHARACTER SET (aka CHARSET) that describes the data in the column (e.g. latin1, utf8). This is orthogonal to collation in MySQL (collation defines how the data is sorted and compared).
NATIONAL VARCHAR(n) is just a synonym for VARCHAR(n) CHARACTER SET utf8.

Related

Understanding character sets in column and data and how they affect select queries

After reading articles about how to properly store your data inside a MySQL database, I sort of have a good understanding of how character sets work. Yet there are still some instances that I find hard to understand. Let me give a quick example: say I have a table with a varchar column using the latin1 charset (for example purposes):
CREATE TABLE `foobar` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`foo` varchar(100) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=latin1;
and I force insert a latin1 character using its hex value (á for instance, which would be E1)
SET NAMES 'latin1';
INSERT INTO foobar SET foo = UNHEX('E1')
why does a SELECT query against this table under a latin1 connection fail to give me the value I am expecting? I get the replacement character instead. I would have expected it to just give back the value as-is, since the column was in latin1, the connection itself was in latin1, and the stored value is a valid latin1 character. Is MySQL trying to interpret E1 as a UTF-8 value? Or am I missing something here?
Using a UTF-8 connection (SET NAMES 'utf8') before selecting gives me the expected value of á, which I guess is correct, since from what I understand, if the connection charset is different from the column charset being selected, then MySQL reads the data as whatever the column charset is (so E1 gets parsed as a latin1 character, i.e. á).
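To see why the mismatch produces a replacement character, it helps to look at the raw bytes. This is an illustrative sketch (not from the thread) using Python's codecs: 'á' is the single byte 0xE1 in latin1 but the two-byte sequence 0xC3 0xA1 in UTF-8, so a lone 0xE1 is not valid UTF-8.

```python
# Illustrative only: the same character maps to different bytes under
# latin1 vs. UTF-8, which is why a stored 0xE1 byte is invalid when
# it is decoded as UTF-8.
a_acute = "\u00e1"  # 'á'

latin1_bytes = a_acute.encode("latin-1")  # single byte 0xE1
utf8_bytes = a_acute.encode("utf-8")      # two bytes 0xC3 0xA1

print(latin1_bytes.hex())  # e1
print(utf8_bytes.hex())    # c3a1

# Decoding the latin1 byte as UTF-8 fails: 0xE1 alone starts a multi-byte
# UTF-8 sequence with no continuation bytes, hence the U+FFFD replacement
# character when the charsets are mismatched.
try:
    latin1_bytes.decode("utf-8")
except UnicodeDecodeError as e:
    print("invalid UTF-8:", e.reason)
```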

Is it OK to use a VARCHAR(5000) field in MySQL for this scenario?

I am developing a classifieds website using ASP.NET, and the DB is MySQL.
I have a header table that stores the common details of ads.
Here is the header table's schema:
CREATE TABLE `test`.`header` (
`header_id` BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
`title` VARCHAR(500) NOT NULL,
`description` VARCHAR(5000) NOT NULL,
`is_published` TINYINT(1) NOT NULL DEFAULT TRUE,
-- etc.
PRIMARY KEY (`header_id`)
) ENGINE = INNODB CHARSET = latin1 COLLATE = latin1_swedish_ci;
So I am using varchar(500) for the title and varchar(5000) for the description. Is it OK to use varchar(5000)? The reason I ask is that some people say long varchar fields are converted to TEXT fields inside MySQL (I don't know whether this is true). How long is "long"? Also, some people say there is a limitation on row size. Will a varchar(5000) field lead to any performance issues?
Yes, I could use a TEXT field, but remember that I want a length limit on the description; otherwise users will copy-paste a novel into it. :)
What is your suggestion? Another data type, or anything else?
Thank you very much.
Assuming that 5000 characters is your limitation, then VARCHAR(5000) is perfectly reasonable.
Take a look at this question if you are curious about the differences between VARCHAR and TEXT: MySQL: Large VARCHAR vs. TEXT?.
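As a rough sanity check on the row-size concern, the arithmetic can be sketched like this (a hypothetical helper, not from the thread; it assumes latin1 at 1 byte per character, as in the posted schema, and ignores the other columns' overhead):

```python
# Rough row-size check for the posted schema. A VARCHAR(n) column stores
# up to n bytes of data plus a 1- or 2-byte length prefix (2 bytes once
# the maximum exceeds 255 bytes). MySQL's hard row limit is 65535 bytes.
ROW_LIMIT = 65535

def varchar_bytes(n_chars, bytes_per_char=1):
    data = n_chars * bytes_per_char
    prefix = 1 if data <= 255 else 2
    return data + prefix

title = varchar_bytes(500)         # 502 bytes
description = varchar_bytes(5000)  # 5002 bytes
print(title + description)         # 5504 -- comfortably under 65535
```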

Create database: what's the right charset for my purpose?

I have only a little knowledge of MySQL. I need to create a database to store scores for a video game, submitted from all over the world. (The game will be in every available store, including Chinese ones, etc.)
I'm worried about the charset. The DB schema will be similar to (pseudocode):
leaderboard("PhoneId" int primary key, name varchar(50), score smallint);
What will happen if a Chinese player submits a score under a name in those characters? Should I specify something in the DB creation script?
create database if not exists `test_db`;
create table if not exists `leaderboard` (
`phoneid` integer unsigned NOT NULL,
`name` varchar(20) NOT NULL, -- error handling for this
`score` smallint unsigned NOT NULL default 0,
`timestamp` timestamp NOT NULL default CURRENT_TIMESTAMP,
PRIMARY KEY (`phoneid`)
);
UTF8 is your obvious choice.
For details on UTF8 and MySQL integration, you can go through the Tutorial pages:
http://dev.mysql.com/doc/refman/5.0/en/charset-unicode-utf8.html
http://dev.mysql.com/doc/refman/5.0/en/charset-unicode.html
There are certain things that need to be kept in mind while using the UTF8 charset in any database. For example, to save space with UTF-8, use VARCHAR instead of CHAR. Otherwise, MySQL must reserve three bytes for each character in a CHAR CHARACTER SET utf8 column, because that is the maximum possible length.
Similarly, you should analyze other performance constraints and design your database accordingly.
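The CHAR-versus-VARCHAR storage point above can be sketched numerically (hypothetical helper names, not a MySQL API; assumes MySQL's 3-byte utf8 charset):

```python
# CHAR(M) CHARACTER SET utf8 always reserves the maximum: 3 bytes per
# declared character. VARCHAR stores only the bytes the value actually
# needs, plus a small length prefix.
def char_utf8_storage(declared_len):
    return declared_len * 3  # fixed reservation, regardless of content

def varchar_utf8_storage(value, prefix=1):
    return len(value.encode("utf-8")) + prefix

print(char_utf8_storage(10))       # 30 bytes reserved, even for "hi"
print(varchar_utf8_storage("hi"))  # 3 bytes (2 data + 1 prefix)
```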

How to convert a MySQL table to UTF-8 when the row size is too large?

I'm on a MySQL database attempting to alter my table, encoded in latin1, to UTF-8. The table's name is journalist, and it has 12 varchar columns with a maximum length of 3000. This is the SQL I'm running:
ALTER TABLE `journalist` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
The error I'm receiving is:
Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs
Do I have to alter the size of the columns before I run this conversion query?
And/or how might I accomplish this encoding change otherwise?
I did what @Wrikken suggested: I dropped my table and lowered the varchar columns' maximum lengths from 3000 to 1500. I then ran this SQL on my new, empty table:
ALTER TABLE `table_name` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
From there, I repopulated it from my backup table using a script.
To answer the question:
Lower the varchar max-length limits,
or change the varchar fields to TEXT/LONGTEXT or BLOB columns.
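The arithmetic behind the error and the workaround can be sketched like this (figures taken from the question; assumes MySQL's 3-byte utf8 and a 2-byte length prefix per column, ignoring other overhead):

```python
# Converting latin1 (1 byte/char) to MySQL's utf8 (up to 3 bytes/char)
# triples each VARCHAR's byte requirement: 12 columns of varchar(3000)
# blow past the 65535-byte row limit.
ROW_LIMIT = 65535

latin1_row = 12 * (3000 * 1 + 2)  # 36024 bytes: fits
utf8_row = 12 * (3000 * 3 + 2)    # 108024 bytes: over the limit
print(latin1_row <= ROW_LIMIT)    # True
print(utf8_row <= ROW_LIMIT)      # False

# Halving to varchar(1500), as the accepted workaround did:
utf8_row_1500 = 12 * (1500 * 3 + 2)  # 54024 bytes: fits again
print(utf8_row_1500 <= ROW_LIMIT)    # True
```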

MySQL: should I use CHAR for Unicode?

I'm just a newbie in MySQL.
I want to declare my table something like this:
CREATE TABLE `test` (
`NO` bigint(20) unsigned NOT NULL AUTO_INCREMENT,
`ID` char(50) CHARACTER SET utf8 NOT NULL,
`FIRST_NAME` char(50) CHARACTER SET utf8 NOT NULL,
`LAST_NAME` char(50) CHARACTER SET utf8 NOT NULL,
PRIMARY KEY (`NO`)
)
I read this in the MySQL documentation about CHAR and VARCHAR:
The length of a CHAR column is fixed to the length that you declare when you create the table.
My question is: if I use a Unicode char(50) for Chinese, Japanese, Korean, or other Unicode characters, will these columns use too much storage in the database, and can that affect performance?
For English characters, will it only accept up to 50 characters if I declare my table like this?
Is there a better way, or is it good to use char(100) for Unicode?
Correct me if I'm wrong.
According to the MySQL manual, the number of bytes required by a CHAR(M) column is M x w, where w is "the number of bytes required for the maximum-length character in the character set". This clearly suggests that a CHAR(50) column will store 50 characters, not 50 bytes. If your characters happen to be Chinese, etc., it will still store up to 50 of them. Note that a VARCHAR(50) will also store up to 50 characters, but will (if one ignores the 1-2 byte overhead that comes with a VARCHAR column) take up less storage than a CHAR(50) column when there are fewer than 50 characters stored.
EDIT: This applies only if you are using MySQL 4.1 or later. Earlier versions interpreted the length of character and text fields as bytes, not characters.
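The M × w formula above can be checked with quick arithmetic (assuming MySQL's 3-byte utf8 charset; the VARCHAR comparison uses a hypothetical 5-character value):

```python
# CHAR(50) CHARACTER SET utf8: M = 50 characters, w = 3 bytes per
# character, so 150 bytes are reserved per row regardless of content.
M, w = 50, 3
char_storage = M * w
print(char_storage)  # 150

# A VARCHAR(50) utf8 column holding the 5-character value "Akira" uses
# only the encoded bytes plus a 1-byte length prefix:
name = "Akira"
varchar_storage = len(name.encode("utf-8")) + 1
print(varchar_storage)  # 6
```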