January 27, 2015 at 10:35 am
Is a declared length that more closely matches the length of the values (say, with a generous buffer) any better for performance? E.g. the values never exceed 250 characters, so I assign nvarchar(500). Is this better than nvarchar(1000)?
January 27, 2015 at 10:57 am
EDITED: the original version of this answer was incorrect.
It's all about how big the columns will be. From MSDN:
https://msdn.microsoft.com/en-ie/library/ms186939.aspx
The storage size, in bytes, is two times the actual length of data entered + 2 bytes.
So nvarchar(500) and nvarchar(1000) will both take up only (length of the data * 2) + 2 bytes.
As pointed out in the next post, it won't affect RAM usage either.
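A quick way to see this for yourself (just a sketch; the table variable and column names are made up for illustration):

-- The stored size depends on the actual data, not the declared maximum.
DECLARE @t TABLE (
    narrow nvarchar(500),
    wide   nvarchar(1000)
);

INSERT INTO @t (narrow, wide)
VALUES (REPLICATE(N'x', 250), REPLICATE(N'x', 250));

-- Both columns report 500 bytes (250 characters * 2 bytes each);
-- DATALENGTH does not count the extra 2 bytes of length overhead.
SELECT DATALENGTH(narrow) AS narrow_bytes,
       DATALENGTH(wide)   AS wide_bytes
FROM @t;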
January 27, 2015 at 11:05 am
DBA From The Cold (1/27/2015)
It's all about how big the columns will be, from MSDN: https://msdn.microsoft.com/en-ie/library/ms186939.aspx
The storage size, in bytes, is two times the actual length of data entered + 2 bytes.
So nvarchar(500) will take up (2*500) + 2 = 1002 bytes and nvarchar(1000) will take up (2*1000) + 2 = 2002 bytes. Double (almost) in size. Each time that column is queried, SQL will put the data in RAM (if it's not already there). If you know that your data will never exceed 250 in length, then you are consuming much more RAM than is necessary.
This could place pressure on the amount of RAM available to SQL, which will impact performance.
You are missing the word actual.
It does not take any more storage nor RAM to store strings in nvarchar(500) compared to nvarchar(1000). Both are variable length data types.
January 27, 2015 at 11:08 am
Ville-Pekka Vahteala (1/27/2015)
DBA From The Cold (1/27/2015)
It's all about how big the columns will be, from MSDN: https://msdn.microsoft.com/en-ie/library/ms186939.aspx
The storage size, in bytes, is two times the actual length of data entered + 2 bytes.
So nvarchar(500) will take up (2*500) + 2 = 1002 bytes and nvarchar(1000) will take up (2*1000) + 2 = 2002 bytes. Double (almost) in size. Each time that column is queried, SQL will put the data in RAM (if it's not already there). If you know that your data will never exceed 250 in length, then you are consuming much more RAM than is necessary.
This could place pressure on the amount of RAM available to SQL, which will impact performance.
You are missing the word actual.
It does not take any more storage nor RAM to store strings in nvarchar(500) compared to nvarchar(1000). Both are variable length data types.
Apologies, you are completely correct. Please disregard my previous answer, I've corrected it (so not to leave inaccurate information there).
January 27, 2015 at 11:08 am
Yes, there are many posts that say it doesn't matter, which is why I asked here.
But if that is the case, why is there an option to specify a length for nvarchar at all, except when it's a blob, in which case you specify nvarchar(max)?
I don't get why there's an option to set the length if it doesn't matter.
January 27, 2015 at 11:10 am
My concern would be the total number of columns you are defining with this additional buffer space in a single record. Also, if you know the data does not exceed 250 characters (500 bytes in Unicode data), why declare the extra space?
January 27, 2015 at 11:21 am
KoldCoffee (1/27/2015)
Yes, there are many posts that say it doesn't matter, which is why I asked here. But if that is the case, why is there an option to specify a length for nvarchar at all, except when it's a blob, in which case you specify nvarchar(max)?
I don't get why there's an option to set the length if it doesn't matter.
From one perspective, business rules: there are some fields you'll never want to exceed a certain size.
Second, think about what could potentially happen to the size of a record if it had, say, a large number of varchar(5000) fields on it and people started changing the data across that entire potential range.
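A rough sketch of that concern (the table and column names are made up): SQL Server will create a table like this, but it warns that the maximum possible row size exceeds the 8060-byte in-row limit, and rows that actually grow that large get pushed to row-overflow storage, which costs extra I/O to read and update.

-- The declared maximums add up to far more than 8060 bytes,
-- so CREATE TABLE succeeds but raises a maximum-row-size warning.
CREATE TABLE dbo.WideRecordDemo (
    Id   int IDENTITY(1,1) PRIMARY KEY,
    ColA varchar(5000),
    ColB varchar(5000),
    ColC varchar(5000)
);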
January 27, 2015 at 11:27 am
I'm not sure that it doesn't matter, as I discovered when testing a function in this thread:
http://www.sqlservercentral.com/Forums/FindPost1531822.aspx
The best option that you have is to test both approaches. I'd do it myself but I'm busy right now.
January 27, 2015 at 11:35 am
Functions aside, this is not a question asked in the context of functions to which I might want to pass parameters that need a specified data type and length.
I am speaking in the context of building a database with tables that will later be indexed and queried against. The considerations are database storage space, indexing, caching... that sort of thing.
January 27, 2015 at 12:14 pm
KoldCoffee (1/27/2015)
Functions aside, this is not a question asked in the context of functions to which I might want to pass parameters that need a specified data type and length. I am speaking in the context of building a database with tables that will later be indexed and queried against. The considerations are database storage space, indexing, caching... that sort of thing.
In this case, define the columns to be no longer than the longest the data will be. If the data in a given column is specified to be no longer than 250 characters (regardless of varchar or nvarchar), define it as 250 characters.
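And since these columns will be indexed, the declared length is not entirely cosmetic either. A sketch (names are made up; the 900-byte index key limit mentioned here is the one in effect for the SQL Server versions current at the time of this thread):

-- nvarchar(500) can hold up to 1000 bytes, which is over the 900-byte index key limit.
CREATE TABLE dbo.IndexDemo (
    Code nvarchar(500) NOT NULL
);

-- Created with a warning: any insert or update whose actual key value exceeds
-- 900 bytes will fail. Declared as nvarchar(250) (500 bytes maximum), the same
-- index raises no warning at all.
CREATE INDEX IX_IndexDemo_Code ON dbo.IndexDemo (Code);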
January 27, 2015 at 12:27 pm
OK, will do.
But I think the only 'gain' here is that it prevents the insertion of data that is way outside the scope of what is expected. There are no other reasons to do so.
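That 'gain' is easy to demonstrate (a sketch with made-up names): with the column declared at the real maximum, an out-of-scope value is rejected rather than silently accepted.

-- Column sized to the documented maximum of 250 characters.
CREATE TABLE dbo.SizedDemo (
    Notes nvarchar(250) NOT NULL
);

-- Fails with "String or binary data would be truncated"
-- because the value is 300 characters long.
INSERT INTO dbo.SizedDemo (Notes)
VALUES (REPLICATE(N'x', 300));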
January 27, 2015 at 12:52 pm
KoldCoffee (1/27/2015)
OK, will do. But I think the only 'gain' here is that it prevents the insertion of data that is way outside the scope of what is expected. There are no other reasons to do so.
Well yes... but that has both business impact and performance impact.
January 27, 2015 at 1:36 pm
Well yes... but that has both business impact and performance impact.
ZZ, please clarify 'performance' impact.
January 27, 2015 at 2:05 pm
KoldCoffee (1/27/2015)
Well yes... but that has both business impact and performance impact.
ZZ, please clarify 'performance' impact.
In general, the more information you store and retrieve, the slower it will be. And while you may be right that there won't be an impact as long as you never go over the expected size, think about what happens if you do 🙂