June 23, 2009 at 4:14 pm
Hello, I am planning to increase the size of an nvarchar column from nvarchar(200) to nvarchar(1000), as I need to be able to enter 1000 characters into that column. My question is: would this affect performance?
June 23, 2009 at 7:58 pm
In itself it won't make any difference to performance. The performance for an nvarchar(1) (not that there should be such a thing) and an nvarchar(1000) will be identical if all rows only have one character in the field.
However, as you start putting more characters into a variable-length column you will get fewer rows per page, meaning you have to read more pages to retrieve the same number of rows. This should only be evident on table scans or where queries have to read a large amount of "base data"; good indexing may become more important.
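For reference, a sketch of the change the original poster describes; the table and column names here (dbo.MyTable, Notes) are hypothetical:

```sql
-- Widening an nvarchar column. In SQL Server, increasing the declared
-- length of a variable-length column is a metadata-only change: existing
-- rows are not rewritten, and each row still stores only 2 bytes per
-- character actually present (plus variable-length overhead).
ALTER TABLE dbo.MyTable
    ALTER COLUMN Notes nvarchar(1000) NOT NULL;
-- Re-state NULL/NOT NULL explicitly: if omitted, ALTER COLUMN applies
-- the session's default nullability rather than keeping the current one.
```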
July 28, 2009 at 11:58 am
Thank you for your answer. Can this lead to any memory issues?
July 28, 2009 at 4:53 pm
As always, the answer is "It depends".
The effect on memory usage will be identical to the effect on disk usage. The change in the definition of the field won't make any difference by itself, but as rows are created that use more characters, the amount of memory required will increase.
So long as there's memory available there won't be any performance impact, but once any of the caches fill and can't expand further, you may start to see performance degrade, because fewer rows will fit into a given volume of memory.
The rate at which memory usage increases will depend on how many rows have a larger data length. If the average data length doesn't change much, the memory usage impact will be negligible; but if the average data length increases substantially, the memory usage impact, and thus the potential performance impact, may be significant.
It also depends on how big a percentage of total memory use this table accounts for.
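One way to check whether the average data length is actually growing is to measure it directly. A sketch, again assuming the hypothetical dbo.MyTable and Notes names:

```sql
-- DATALENGTH returns the storage size in bytes; nvarchar uses 2 bytes
-- per character, so divide by 2.0 to get character counts.
SELECT
    COUNT(*)                        AS row_count,
    AVG(DATALENGTH(Notes) / 2.0)    AS avg_chars,
    MAX(DATALENGTH(Notes) / 2)      AS max_chars
FROM dbo.MyTable;
```

Running this before and after the application starts storing longer values shows how much the average row is really growing, which is what drives the memory and page-density effects described above.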
July 28, 2009 at 6:02 pm
Declaring large field sizes can affect memory usage in Integration Services. When it sets up row buffers it has to allocate space for the largest possible row, so you will get fewer rows per buffer and slow down processing.
July 29, 2009 at 2:35 am
Scott, do you mean to say DTS?
"Keep Trying"
August 6, 2009 at 9:56 am
thank you!