According to the MySQL documentation:
A column uses one length byte if values require no more than 255 bytes,
two length bytes if values may require
more than 255 bytes
AND
The maximum row size constrains the
number of columns because the total
width of all columns cannot exceed
this size. For example, utf8
characters require up to three bytes
per character, so for a CHAR(255)
CHARACTER SET utf8 column, the server
must allocate 255 × 3 = 765 bytes per
value. Consequently, a table cannot
contain more than 65,535 / 765 = 85
such columns.
For clarity, what then is the maximum value I can set in the varchar argument so it only uses 1 byte to store its length?
From the MySQL documentation:
The CHAR and VARCHAR types are
declared with a length that indicates
the maximum number of characters you
want to store. For example, CHAR(30)
can hold up to 30 characters.
A [VARCHAR] column uses one length
byte if values require no more than
255 bytes, two length bytes if
values may require more than 255
bytes.
This makes the answer to your question depend on the character encoding.
With a single-byte encoding like windows-1252 (which MySQL calls latin1), the character length is the same as the byte length, so you can use a VARCHAR(255).
With UTF-8, a VARCHAR(N) may require up to 3N bytes, as would be the case if all characters were in the range U+0800 to U+FFFF. Thus, VARCHAR(85) is the largest declaration that guarantees a one-byte length prefix (at most 255 bytes of data).
(Note that MySQL's utf8 character set does not support characters outside the BMP, even though the official definition of UTF-8 allows up to 4 bytes per character.)
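A quick way to see the difference between character length and byte length is to compare CHAR_LENGTH() with LENGTH(); a minimal sketch ('€' is a three-byte character in utf8):

SELECT CHAR_LENGTH(CONVERT('€€€' USING utf8)) AS chars,   -- 3
       LENGTH(CONVERT('€€€' USING utf8))      AS bytes;   -- 9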
For clarity, what then is the maximum value I can set in the varchar argument so it only uses 1 byte to store its length?
This depends on the character set of the VARCHAR column (implied by its collation).
As you noted, UTF8 may use up to three bytes per character, so if you declare a UTF8 column more than 85 characters long, its values may need more than 255 bytes of storage, and hence the length must be stored in a two-byte prefix.
If you use latin1, each character is stored in 1 byte.
So the answer is:
VARCHAR(85) COLLATE UTF8_GENERAL_CI
, or
VARCHAR(255) COLLATE LATIN1_GENERAL_CI
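Put together, a minimal sketch (hypothetical table and column names) where both columns keep the one-byte length prefix:

CREATE TABLE length_prefix_demo (
  name_utf8   VARCHAR(85)  CHARACTER SET utf8   COLLATE utf8_general_ci,    -- at most 85 * 3 = 255 bytes of data
  name_latin1 VARCHAR(255) CHARACTER SET latin1 COLLATE latin1_general_ci   -- at most 255 * 1 = 255 bytes of data
);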
I think you're confusing string size with character representation.
For instance, a single character might take 4 bytes to represent, yet sit inside a string whose length prefix still fits in one byte, as long as the whole value stays at or under 255 bytes.
I read a lot of articles and SO questions about the topic, but all they did was confuse me more.
I'm trying to understand what the longest string a VARCHAR can hold is, and how to define it.
In some places it said that a VARCHAR can be created with a max length of 255 (i.e. VARCHAR(255)) - I don't understand whether that means 255 bytes or 255 characters.
In other places it said that a VARCHAR can hold up to 8,000 bytes, and that the length of the longest string then depends on the language (1 byte per character for Latin-based scripts, more for others).
In simple terms, what does the n in VARCHAR(n) stand for, and what is the range of n?
Is it bytes? Is it a number of characters? Between 0-255? Between 0-8,000?
How is a really long text saved? Does it get split across multiple columns?
VARCHAR stores strings at 1 byte per symbol (as opposed to NVARCHAR, which can use 2 or more bytes per character). You can read details here.
A common misconception is to think that CHAR(n) and VARCHAR(n), the n defines the number of characters. But in CHAR(n) and VARCHAR(n) the n defines the string length in bytes (0-8,000). n never defines numbers of characters that can be stored. This is similar to the definition of NCHAR(n) and NVARCHAR(n). The misconception happens because when using single-byte encoding, the storage size of CHAR and VARCHAR is n bytes and the number of characters is also n. However, for multi-byte encoding such as UTF-8, higher Unicode ranges (128-1,114,111) result in one character using two or more bytes. For example, in a column defined as CHAR(10), the Database Engine can store 10 characters that use single-byte encoding (Unicode range 0-127), but less than 10 characters when using multi-byte encoding (Unicode range 128-1,114,111). For more information about Unicode storage and character ranges, see Storage differences between UTF-8 and UTF-16.
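To illustrate that n counts bytes rather than characters, a minimal T-SQL sketch (it assumes a SQL Server 2019+ UTF-8 collation; the table and column names are just examples):

DECLARE @t TABLE (c VARCHAR(10) COLLATE Latin1_General_100_CI_AS_SC_UTF8);
INSERT INTO @t VALUES (N'ééééé');   -- five characters, each needing two bytes in UTF-8
SELECT LEN(c) AS chars_stored, DATALENGTH(c) AS bytes_stored FROM @t;   -- 5 and 10
-- A sixth 'é' would not fit: VARCHAR(10) limits the value to 10 bytes, not 10 characters.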
What is the best way to store a long hex string that has more than 1,500 chars?
What is the best type?
Byte
Text
LongText
VarChar
BINARY has fixed padding and a maximum length of 255 bytes,
so it's no good for you.
VARCHAR and TEXT both have a maximum length of
65,535 bytes (64 kB) and use 2 extra bytes to store the length of the data.
LONGTEXT has a maximum length of 4,294,967,295 bytes (4 GB)
and uses 4 extra bytes to store the length of the data.
If you know that you will always have less than 64kB of data in your HEX string, I would choose TEXT, otherwise go with LONGTEXT.
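For example, a minimal sketch of that choice (hypothetical table name):

CREATE TABLE hex_payload (
  value TEXT NOT NULL   -- up to 65,535 bytes; switch to LONGTEXT if the hex string can exceed 64 kB
);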
If you pack it into bytes (from hex), then VARBINARY(...) or BLOB.
If you keep it as hex, VARCHAR(...) CHARACTER SET ascii or TEXT CHARACTER SET ascii. I mention ascii especially if your default is utf8; there is no need for utf8 for hex digits.
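A minimal sketch of both options (hypothetical names), packing with UNHEX() on the way in and restoring with HEX() on the way out:

CREATE TABLE hex_store (
  packed  VARBINARY(2048),            -- packed bytes: half the size of the hex string
  as_text TEXT CHARACTER SET ascii    -- hex digits stored as-is; ascii is enough for 0-9 and A-F
);
INSERT INTO hex_store (packed, as_text)
VALUES (UNHEX('DEADBEEF'), 'DEADBEEF');
SELECT HEX(packed) AS unpacked, as_text FROM hex_store;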
If I create a utf-8 column in MySQL and all the characters are ascii minus 1, how many characters will I be able to store?
Can someone please explain this?
21,844
will be the maximum for a VARCHAR in utf8.
That is actual characters, not bytes: of the 65,535-byte maximum row size, 2 bytes go to the length prefix, leaving 65,533 bytes, and 65,533 / 3 bytes per character comes to 21,844.
From the MySQL docs:
For example, utf8 characters can require up to three bytes per
character, so a VARCHAR column that uses the utf8 character set can be
declared to be a maximum of 21,844 characters. See Section E.7.4,
“Limits on Table Column Count and Row Size”.
MySQL stores VARCHAR values as a 1-byte or 2-byte length prefix plus
data. The length prefix indicates the number of bytes in the value. A
VARCHAR column uses one length byte if values require no more than 255
bytes, two length bytes if values may require more than 255 bytes.
From Wikipedia:
The first 128 characters (US-ASCII) need one byte. The next 1,920
characters need two bytes to encode. This covers the remainder of
almost all Latin alphabets, and also Greek, Cyrillic, Coptic,
Armenian, Hebrew, Arabic, Syriac and Tāna alphabets, as well as
Combining Diacritical Marks. Three bytes are needed for characters in
the rest of the Basic Multilingual Plane (which contains virtually all
characters in common use). Four bytes are needed for characters in
the other planes of Unicode, which include less common CJK characters
and various historic scripts and mathematical symbols.
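To see the limit in practice, a minimal sketch (hypothetical table names, single-column tables using the 3-byte utf8 character set):

CREATE TABLE utf8_max_ok  (v VARCHAR(21844) CHARACTER SET utf8);   -- 21,844 * 3 bytes + the length prefix fits within the 65,535-byte row limit
CREATE TABLE utf8_max_err (v VARCHAR(21845) CHARACTER SET utf8);   -- fails: exceeds the 65,535-byte row-size limit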
This is probably a stupid question, but I need to ask...
I've created a MySQL table to handle images called images. In it, I have an attribute that keeps the extension of the image called extension.
Most of the accepted image extensions are jpg, png, gif, bmp, jpeg, or tiff. In other words, a maximum of 4 characters long.
Now, should the attribute be declared in the MySQL table like:
extension char(4)
or
extension varchar(4)
There's probably no impact whatsoever on performance, but I do want the model to be optimized from the get-go...
Anyone?
Depends....
If you look at this from the MySQL documentation
Value        CHAR(4)   Storage Required   VARCHAR(4)   Storage Required
''           '    '    4 bytes            ''           1 byte
'ab'         'ab  '    4 bytes            'ab'         3 bytes
'abcd'       'abcd'    4 bytes            'abcd'       5 bytes
'abcdefgh'   'abcd'    4 bytes            'abcd'       5 bytes
As you can see, 4 characters in a CHAR(4) take 4 bytes, while in a VARCHAR(4) they take 5. If the vast majority of extensions were 4 characters, CHAR would be more space-efficient.
In your case I am guessing that 3-character extensions will be the majority, so VARCHAR is the better choice.
James :-)
Edited: I was making a wrong assumption in my previous answer. I'll just paste an excerpt from http://dev.mysql.com/doc/refman/5.0/en/char.html (emphasis added):
The CHAR and VARCHAR types are similar, but differ in the way they are stored and retrieved. As of MySQL 5.0.3, they also differ in maximum length and in whether trailing spaces are retained.
The CHAR and VARCHAR types are declared with a length that indicates the maximum number of characters you want to store. For example, CHAR(30) can hold up to 30 characters.
The length of a CHAR column is fixed to the length that you declare when you create the table. The length can be any value from 0 to 255. When CHAR values are stored, they are right-padded with spaces to the specified length. When CHAR values are retrieved, trailing spaces are removed.
Values in VARCHAR columns are variable-length strings. The length can be specified as a value from 0 to 255 before MySQL 5.0.3, and 0 to 65,535 in 5.0.3 and later versions. The effective maximum length of a VARCHAR in MySQL 5.0.3 and later is subject to the maximum row size (65,535 bytes, which is shared among all columns) and the character set used.
In contrast to CHAR, VARCHAR values are stored as a one-byte or two-byte length prefix plus data. The length prefix indicates the number of bytes in the value. A column uses one length byte if values require no more than 255 bytes, two length bytes if values may require more than 255 bytes.
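A minimal sketch of the practical difference (hypothetical table name):

CREATE TABLE ext_demo (c CHAR(4), v VARCHAR(4));
INSERT INTO ext_demo VALUES ('png', 'png');
-- CHAR(4) is padded to 4 bytes on disk, but trailing spaces are stripped on retrieval;
-- VARCHAR(4) stores 3 bytes of data plus a 1-byte length prefix.
SELECT LENGTH(c) AS char_bytes, LENGTH(v) AS varchar_bytes FROM ext_demo;   -- both return 3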
How many characters can a SQL Server 2008 database field contain when the data type is VARCHAR(MAX)?
From http://msdn.microsoft.com/en-us/library/ms176089.aspx
varchar [ ( n | max ) ]
Variable-length, non-Unicode character
data. n can be a value from 1 through
8,000. max indicates that the maximum
storage size is 2^31-1 bytes. The
storage size is the actual length of
data entered + 2 bytes. The data
entered can be 0 characters in length.
The ISO synonyms for varchar are char
varying or character varying.
1 character = 1 byte. And don't forget the 2 bytes of length overhead. So, 2^31-3 characters.
For future readers who need this answer quickly:
2^31-1 = 2,147,483,647 characters, or roughly 2.147 billion
See the MSDN reference table for maximum numbers/sizes.
Bytes per varchar(max),
varbinary(max), xml, text, or image
column: 2^31-1
There's a two-byte overhead for the column, so the actual data is 2^31-3 max bytes in length. Assuming you're using a single-byte character encoding, that's 2^31-3 characters total. (If you're using a character encoding that uses more than one byte per character, divide by the total number of bytes per character. If you're using a variable-length character encoding, all bets are off.)
There are a few gotchas worth mentioning - you may need to force the use of varchar(max)
https://dba.stackexchange.com/questions/18483/varcharmax-field-cutting-off-data-after-8000-characters
and PRINT only handles 8,000 chars
How to print VARCHAR(MAX) using Print Statement?
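A minimal T-SQL sketch touching both gotchas: the input to REPLICATE() must itself be cast to varchar(max), or the result is capped at 8,000 bytes, and LEN()/DATALENGTH() (rather than PRINT) confirm the full size:

DECLARE @big VARCHAR(MAX) = REPLICATE(CAST('x' AS VARCHAR(MAX)), 100000);
SELECT LEN(@big) AS chars_stored, DATALENGTH(@big) AS bytes_stored;   -- 100000 and 100000
-- Without the CAST, REPLICATE returns an ordinary varchar and the result is truncated at 8,000 characters.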