June 5, 2009 at 9:04 am
One character is equal to one byte and one Unicode character is equal to two bytes. The question specifies one character of data that will never be null or empty. The only possible correct answer is char(1) - ask anyone who has ever referred to a chunk of data as a tidbit, nibble, page, or chomp - a Unicode token is two characters' worth of data.
Joshua Perry
http://www.greenarrow.net
June 5, 2009 at 9:05 am
I think you could make an argument for any of the answers. There wasn't enough information given in the question to choose a truly clear-cut best answer. char(1) and varchar(1) would both be fine given the specs and they take up less space. nchar(1) and nvarchar(1) would both be fine and they allow for Unicode characters. Not enough info in the question.
June 5, 2009 at 9:10 am
Josh Turner (6/5/2009)
I think you could make an argument for any of the answers. There wasn't enough information given in the question to choose a truly clear-cut best answer. char(1) and varchar(1) would both be fine given the specs and they take up less space. nchar(1) and nvarchar(1) would both be fine and they allow for Unicode characters. Not enough info in the question.
Not true.
char(1) - 1 byte
nchar(1) - 2 bytes (but supports a single Unicode character)
varchar(1) - 3 bytes (1 data byte + 2 bytes of length overhead)
nvarchar(1) - 4 bytes (2 bytes for the Unicode character + 2 bytes of length overhead)
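The byte counts above can be checked with DATALENGTH in SSMS; a minimal sketch (assuming SQL Server 2008+ for the inline variable assignment). Note that DATALENGTH reports only the bytes of the value itself - the 2-byte length prefix for varchar/nvarchar is per-row storage overhead in a table and does not show up here:

```sql
-- DATALENGTH shows the data bytes each type actually holds.
DECLARE @c  char(1)     = 'A';
DECLARE @nc nchar(1)    = N'A';
DECLARE @v  varchar(1)  = 'A';
DECLARE @nv nvarchar(1) = N'A';

SELECT DATALENGTH(@c)  AS char_bytes,     -- 1
       DATALENGTH(@nc) AS nchar_bytes,    -- 2
       DATALENGTH(@v)  AS varchar_bytes,  -- 1 (length prefix not included)
       DATALENGTH(@nv) AS nvarchar_bytes; -- 2 (length prefix not included)
```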
Because the author didn't specify whether the single character was an ASCII character or a Unicode one, nchar(1) is the better choice, since it covers both. varchar and nvarchar are bad choices because the storage overhead they bring for a single character - one that will never be null or blank or anything but a character of some sort - is too much for a single-character-length field.
June 5, 2009 at 9:18 am
I don't think any of the answers that include varchar or nvarchar can be correct.
But the question provoked an interesting discussion.
June 5, 2009 at 9:22 am
Irish Flyer (6/5/2009)
NCHAR(1) should not be used, because it is, again, unnecessary overhead. ASCII character columns, when used with Unicode column data in T-SQL, are automatically converted for comparison logic. The only time you should use NCHAR is when you know with absolute certainty that you will be storing characters that have no equivalent in the ASCII code set. That is a special condition, and the question was posed generally. Unless you work for a multinational company and actually need it, the use of Unicode is a huge waste of resources.
Sorry, totally disagree - this is not about where you work, but about how the question was phrased. We were told one character of data; we were not told whether that character could be an Arabic letter, a Cyrillic one, or anything else. Without that information, you have no basis to say that the additional byte of overhead in NChar(1) is unnecessary. Additionally, in this internet age, many if not most public systems need to cater to people all over the world (this applies to small businesses just as much as to multinationals).
The Best solution is the one that most efficiently matches all situations allowed for by the question, not the one that is most efficient in your particular environment.
http://poorsql.com for T-SQL formatting: free as in speech, free as in beer, free to run in SSMS or on your version control server - free however you want it.
June 5, 2009 at 9:46 am
I agree with Tao. An interesting debate on whether to use nchar or char given an ambiguous requirement (quite usual). I still believe that with no further info, nchar is the safest design choice.
Buzz
June 5, 2009 at 9:58 am
I picked nchar(1) for the same reasons many others did.
Question is too vague. Define "best".
Varchar is definitely not the best, since it will actually take more space than char in this case. (As already mentioned.)
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
June 5, 2009 at 9:59 am
I answered varchar(1), because I like to use varchar. I hate CHAR because of the big problems it can lead to, and NCHAR/NVARCHAR because of the wasted space.
June 5, 2009 at 10:08 am
Actually, varchar(1) is the "best" choice from the standpoint that when people blindly follow advice like this without doing their research, the resulting system will be rife with opportunities for improvement (you know there will be other problems, too). This equals more money in my pocket as I'm called in to fix such systems.
June 5, 2009 at 10:11 am
cs_troyk (6/5/2009)
Actually, varchar(1) is the "best" choice from the standpoint that when people blindly follow advice like this without doing their research, the resulting system will be rife with opportunities for improvement (you know there will be other problems, too). This equals more money in my pocket as I'm called in to fix such systems.
WHAT?!?!?
For best practices on asking questions, please read the following article: Forum Etiquette: How to post data/code on a forum to get the best help
June 5, 2009 at 10:18 am
Tao Klerks (6/5/2009)
Irish Flyer (6/5/2009)
NCHAR(1) should not be used, because it is, again, unnecessary overhead. ASCII character columns, when used with Unicode column data in T-SQL, are automatically converted for comparison logic. The only time you should use NCHAR is when you know with absolute certainty that you will be storing characters that have no equivalent in the ASCII code set. That is a special condition, and the question was posed generally. Unless you work for a multinational company and actually need it, the use of Unicode is a huge waste of resources.
Sorry, totally disagree - this is not about where you work, but about how the question was phrased. We were told one character of data; we were not told whether that character could be an Arabic letter, a Cyrillic one, or anything else. Without that information, you have no basis to say that the additional byte of overhead in NChar(1) is unnecessary. Additionally, in this internet age, many if not most public systems need to cater to people all over the world (this applies to small businesses just as much as to multinationals).
The Best solution is the one that most efficiently matches all situations allowed for by the question, not the one that is most efficient in your particular environment.
I agree. I chose nchar(1) because the question didn't specify what kinds of characters.
I don't often lobby for points, but in this case I think I'm justified. 🙂
- webrunner
-------------------
A SQL query walks into a bar and sees two tables. He walks up to them and asks, "Can I join you?"
Ref.: http://tkyte.blogspot.com/2009/02/sql-joke.html
June 5, 2009 at 10:24 am
The Best solution is the one that most efficiently matches all situations allowed for by the question, not the one that is most efficient in your particular environment.
I agree. The question was "Which of the following is the best data type to declare that variable?" It did not specify the MOST efficient datatype. A CHAR(1) datatype used in a Unicode environment would be useless. Absent specifications limiting the table information to ASCII, the "best" datatype would be one that accommodates a one-character field in any language, which would be NCHAR(1).
June 5, 2009 at 10:33 am
I agree with all of those that chose NCHAR(1). The question stated one character of data, not one byte. The explanation is wrong as it states one byte. Try to put one UNICODE character (e.g., Chinese, Japanese, etc.) into one byte.
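The point above can be seen directly: a character outside the collation's code page does not survive a round trip into a single-byte type. A hedged sketch (the exact fallback character depends on the server's collation; under a typical Latin collation the conversion is lossy):

```sql
-- A non-ASCII character assigned to char(1) vs nchar(1).
-- Under a Latin collation the char(1) value degrades to '?',
-- while nchar(1) stores the Unicode character intact.
DECLARE @narrow char(1)  = N'中';
DECLARE @wide   nchar(1) = N'中';

SELECT @narrow AS char_result,  -- likely '?' (lossy code-page conversion)
       @wide   AS nchar_result; -- '中'
```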
June 5, 2009 at 11:13 am
The question is about the space allocated to a variable, as in
DECLARE @onechar AS varchar(1);
DECLARE @onechar AS char(1);
not to a table column.
And the space allocated to both is equal.
Pay attention to the question!
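This claim about variables can be tested: for a variable holding one character, DATALENGTH reports the same number of data bytes for both declarations (the 2-byte varchar length prefix is row-storage overhead, not variable overhead). A minimal check, assuming SQL Server 2008+:

```sql
-- Both variables hold exactly one byte of character data.
DECLARE @a char(1)    = 'X';
DECLARE @b varchar(1) = 'X';

SELECT DATALENGTH(@a) AS char_bytes,    -- 1
       DATALENGTH(@b) AS varchar_bytes; -- 1
```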
________________
DBA Cabuloso
Lucas Benevides
June 5, 2009 at 11:22 am
With certainty, varchar(1) is wrong. I believe the correct answer is "A", char(1).