September 28, 2004 at 4:10 am
Hi all,
I have a customer's database that is reporting through our application that the 'data type' is out of range when it performs a SELECT on a specific table. Looking at the table in detail, it has a large number of columns with large varchar data types. I think it is exceeding SQL Server's maximum bytes-per-row allowance, but I would like to work out the total number of bytes it is trying to return. Does anyone know the calculation for this? i.e. if a column is varchar(256), does that mean 256 bytes? You'll have to excuse my ignorance; I'm just getting used to SQL Server.
Thanks all
September 28, 2004 at 4:30 am
This isn't my answer. I saw it posted from a Google search:
How can I find out the maximum row size for a table in which all the varchars are fully populated?
The only way to find a SQL Server table's maximum row size is to query the system tables (or the INFORMATION_SCHEMA views) and add up the byte counts. But note that Microsoft doesn't recommend querying system tables directly, as they can change from release to release. Also note that the total row size cannot exceed 8,060 bytes in SQL Server 2000 or 7.0.
September 28, 2004 at 4:41 am
Hi, which system table would I query? syscolumns? If so, which field holds the number of bytes? Is it the length field? If I produce a SUM of the length field, will that give me the number of bytes? Thanks.
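For what it's worth, something along these lines should get you close on SQL Server 2000 (untested here, and the table name is just a placeholder). For char/varchar columns, syscolumns.length is the declared maximum size in bytes; for fixed-width types it's the storage size. It won't account for per-row overhead, so treat the result as a rough estimate:

SELECT SUM(c.length) AS declared_bytes
FROM syscolumns c
INNER JOIN sysobjects o ON o.id = c.id
WHERE o.name = 'YourTableName' -- replace with your table
AND o.xtype = 'U' -- user tables only

If the sum is anywhere near 8,060 you're in the danger zone once the variable-length columns fill up.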
September 28, 2004 at 1:27 pm
I don't know if this will help. You can cut down on the information returned after you run it.
This is from the CMS database and the table Claims.
SELECT *
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_CATALOG = 'CMS'
AND TABLE_NAME = 'claims'
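If you'd rather sum the sizes than eyeball the full column list, a variation on the same view should do it. One caveat: CHARACTER_OCTET_LENGTH is only populated for character columns, so numeric/datetime columns would need to be added by hand (the database and table names below are just from my example):

SELECT SUM(CHARACTER_OCTET_LENGTH) AS char_bytes
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_CATALOG = 'CMS'
AND TABLE_NAME = 'claims'
AND CHARACTER_OCTET_LENGTH IS NOT NULL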
September 28, 2004 at 4:02 pm
Thanks for your help guys, both ideas have been great. I've sorted the problem for now by removing some large columns that were not in use, though it's only a matter of time before the limit gets hit again.