April 1, 2009 at 8:05 am
I notice that sometimes varchar column lengths are set to an integral number of bytes that is a power of 2 minus one (varchar(255) rather than 256). I know that the first byte of a varchar shorter than 256 contains the actual length of the string (and that for longer strings, the first two bytes do).
Is there any advantage in allowing for these length bytes when choosing the lengths of string types?
April 1, 2009 at 8:11 am
That might be an old habit from people who worked with earlier versions of SQL Server, where 255 was the maximum length the server supported.
---------------------------------------------
Nothing is impossible.
It is just a matter of time and money.
April 6, 2009 at 9:07 pm
Jacek0 is right. There is no performance advantage to a given maximum length for varchar columns. In fact, for very short columns the overhead of using varchar might outweigh the "unused" bytes of a char column. Using varchar(x), where x is some integer, essentially amounts to a maximum length constraint. This can be useful but, as stated before, the maximum length doesn't buy you any performance gains.
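Just to illustrate that point, here is a minimal sketch (the table and column names are made up, and it assumes SQL Server defaults such as ANSI_WARNINGS ON). The declared maximum acts only as a constraint; the bytes actually stored depend on the data itself:

-- Hypothetical demo table: two varchar columns with very different declared maximums.
CREATE TABLE dbo.VarcharDemo
(
    Short_Col varchar(255),
    Long_Col  varchar(8000)
);

INSERT INTO dbo.VarcharDemo (Short_Col, Long_Col)
VALUES ('hello', 'hello');

-- Both columns report 5 bytes of data, regardless of the declared maximum.
SELECT DATALENGTH(Short_Col) AS ShortBytes,
       DATALENGTH(Long_Col)  AS LongBytes
FROM dbo.VarcharDemo;

-- The declared length is enforced only as a constraint: the 300-character value
-- exceeds varchar(255), so this statement fails with a truncation error.
INSERT INTO dbo.VarcharDemo (Short_Col, Long_Col)
VALUES (REPLICATE('x', 300), REPLICATE('x', 300));

DROP TABLE dbo.VarcharDemo;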
__________________________________________________
Against stupidity the gods themselves contend in vain. -- Friedrich Schiller
Stop, children, what's that sound? Everybody look what's going down. -- Stephen Stills
April 7, 2009 at 12:05 pm
Hi
The 2^n - 1 logic comes from old systems where the last character was a 0 character (a null terminator) that marked the end of the string. Newer systems usually do not use this terminator and are optimized to handle strings of any size.
Greets
Flo
April 7, 2009 at 2:27 pm
Some systems used the first byte to hold the number of characters in the string, giving a maximum length of 255 bytes.
N 56°04'39.16"
E 12°55'05.25"