June 19, 2007 at 1:50 am
Warning: The table '#tempActualResults' has been created but its maximum row size (21366) exceeds the maximum number of bytes per row (8060). INSERT or UPDATE of a row in this table will fail if the resulting row length exceeds 8060 bytes.
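For reference, a minimal table along these lines (hypothetical column names and sizes) raises a warning of the same kind at creation time, because the declared maximum row size adds up to more than 8060 bytes:

-- hypothetical columns: 3 x varchar(7000) declares a possible row of roughly 21,000 bytes
CREATE TABLE #tempActualResults
(
    LocationId int,
    Block1     varchar(7000),
    Block2     varchar(7000),
    Block3     varchar(7000)
)
-- the CREATE TABLE succeeds, but the row-size warning quoted above is raised here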
1) This warning does not seem to be affecting the results of this SP, so can I turn off the warning? This report is being sent to users and the warning makes them very nervous.
2) Can I increase the maximum row size?
Many thanks
June 19, 2007 at 2:31 am
The maximum size allowed for a row in a table is 8060 bytes; exceeding that declared size gives you the warning you have described. You can turn off this warning for the report you mentioned, but check that the results are not truncated.
Cheers,
Sugeshkumar Rajendran
SQL Server MVP
http://sugeshkr.blogspot.com
June 19, 2007 at 6:56 am
Though it appears only as a warning at table creation, UPDATEs and INSERTs will fail with the error
Cannot create a row of size xxxx which is greater than the allowable maximum of 8060.
The statement has been terminated.
Instead, try changing the schema by making some of the fields text/ntext, but you need to be careful while doing so, as those fields can't be passed as input to your SPs. You can check BOL for managing text/ntext data.
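A hedged sketch of that idea, with made-up table and column names: keep the key columns as fixed or short variable types and move only the widest free-text fields to text, which is stored outside the row and so does not count against the 8060-byte in-row limit in the same way.

-- sketch only: hypothetical report table using text for the widest field
CREATE TABLE ReportStaging
(
    LocationId int         NOT NULL,
    ShortCode  varchar(50) NULL,
    LongNotes  text        NULL   -- text data is stored on separate pages, not in the 8060-byte row
)
-- note: text columns carry restrictions (e.g. they cannot be local variables),
-- which is why the post above warns about passing these fields into your SPs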
Regards
Shrikant Kulkarni
July 3, 2007 at 8:26 am
I am curious, what kind of data are you collecting that you need a row defined that is larger than 8060?
Steve
July 4, 2007 at 2:50 am
Just to make sure, the answer to your question 2) is No.
I suppose you have some large varchar/nvarchar columns. These are of variable size, and as long as you don't really try to insert that many characters, the row is shorter. Once you reach the maximum length of a row, you have a problem... so I wouldn't just skip the warning.
- if the columns are nvarchar, consider changing them to varchar (unless you need Chinese, Japanese or similar character sets, that should be OK). It cuts the size in half.
- check whether the columns need to be as large as they are; if you have defined a column as VARCHAR(1000) "just in case it happens", but so far it only contains at most 300 characters, tune it down to VARCHAR(500) or something like that, and make sure that hypothetical longer entries will be truncated properly before insert (see the sketch after this list)
- consider splitting the table into several tables
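A rough sketch of the first two suggestions, using hypothetical table and column names: first check what is actually stored, then shrink or convert the columns.

-- how long is the longest value actually stored in each wide column?
SELECT MAX(DATALENGTH(Comments))  AS MaxComments,
       MAX(DATALENGTH(Category1)) AS MaxCategory1
FROM   ReportTable

-- nvarchar -> varchar halves the bytes per character; shrink the declared length too
-- (ALTER COLUMN fails if existing data is longer than the new size, so run the check above first)
ALTER TABLE ReportTable ALTER COLUMN Comments  varchar(500) NULL
ALTER TABLE ReportTable ALTER COLUMN Category1 varchar(50)  NULL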
And yes, I'm also curious what data it is... doesn't happen often to get a table that big.
July 4, 2007 at 3:25 am
> if you have defined a column as VARCHAR(1000) "just in case it happens"
In the case when it does happen, the insert fails.
So there is no point in declaring varchar columns longer than a row can actually hold.
It's dangerous.
_____________
Code for TallyGenerator
July 4, 2007 at 5:02 am
Thanks all, I have trimmed most of the sizes down from 250 to 50 where I could and now get no warnings. There are a total of 90 columns in this table!!! It has so many because it is a report and they want their report to have 90 columns. It is in this format, hope it makes sense.
            Cat1                    Cat2                    Cat3 ...
location    case1|case2|case3|etc   case1|case2|case3|etc   case1|case2|case3|etc ...
1           xxxxx xxx xxxx xxxx     xxxxx xxxxx xxx xxxxx   xxxxx ...
2           xxxxx xxx xxxx xxxx     xxxxx xxxxx xxx xxxxx   xxxxx ...
3           xxxxx xxx xxxx xxxx     xxxxx xxxxx xxx xxxxx   xxxxx ...
July 4, 2007 at 6:01 am
To be honest, I don't think it makes sense at all... A report with 90 columns is a bad idea. Do you think anyone will be able to work with it efficiently? Yeah, I know that it wasn't your idea, someone asked you to do it like this - but anyway, it could be worth starting some communication to find out why they want it, what they will do with it, and whether some other format would be more helpful.
I've probably seen too many dumb requirements to take it for granted that it HAS to be done this way, just because someone wasn't smart enough to come up with a better requirement. It seems like one of those cases where users tell you HOW to do it, instead of explaining what they need it for.
Anyway, good luck and good nerves
July 4, 2007 at 6:14 am
The client is not flexible (NHS). They always make it very clear they have much better things to do than talk to IT. No wonder so many IT projects fail in the NHS; it must always be done their way, and there are a lot of personalities in the NHS. I have found them the hardest group to work with; getting a spec from them is like getting water from a rock. It leaves me gasping and stuttering as I try to explain that I am trying to help them, not hassle them. Oh well, it's all part of the spice of life. I like simple and usable.
July 4, 2007 at 6:29 am
Hi,
the page size in SQL Server is 8 KB, and since a row cannot cross a page boundary, the data in a row cannot exceed 8060 bytes once page overhead is taken off.
From Microsoft's Q & A http://support.microsoft.com/kb/260418
Q. When I create a table, I get the following 2714 error message:
The table creation succeeds and data can be inserted without problems. What did this error message mean?
A: This error message indicates that you have variable length columns in your table (such as nvarchar or varbinary) and that the total maximum length of all the columns adds up to more than 8060 bytes. You can still insert rows into the table provided that the total length of the data in each row does not exceed 8060 bytes. However, if the data does exceed 8060 bytes, the insertion fails with the following error message:
Cannot create a row of size <rowlength> which is greater than the allowable maximum of 8060.
The statement has been terminated.
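To make that concrete, using the hypothetical #tempActualResults sketch from the top of the thread: creation only warns, but an INSERT whose actual data pushes the row past 8060 bytes fails at run time.

-- this succeeds because the three values together stay well under 8060 bytes
INSERT INTO #tempActualResults (LocationId, Block1, Block2, Block3)
VALUES (1, REPLICATE('x', 2000), REPLICATE('x', 2000), REPLICATE('x', 2000))

-- this attempts a row of roughly 21,000 bytes and fails with the
-- "Cannot create a row of size ..." error quoted above
INSERT INTO #tempActualResults (LocationId, Block1, Block2, Block3)
VALUES (2, REPLICATE('x', 7000), REPLICATE('x', 7000), REPLICATE('x', 7000))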
See also Steve's article on this site on maximum page size
http://www.sqlservercentral.com/columnists/sjones/pagesize.asp
EDIT: and his useful page on capacity specs
http://www.sqlservercentral.com/columnists/sjones/maximumcapacityspecificationsinsqlserver2005.asp
I have already run up against the number of tables per SELECT statement limit
David
If it ain't broke, don't fix it...
July 4, 2007 at 6:37 am
You could try creating a staging table and using TEXT as the column data type, then import into this table. Performance will decrease while importing.
It is then fairly easy to extract each and one column with a function like this (http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=76033) to store the data your way.
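A hedged sketch of that idea, with made-up names and a much simpler extraction than the linked function: load each incoming line into a single text column, then pull fields out with PATINDEX/SUBSTRING, both of which accept text columns.

-- staging table: one wide text column per imported line (hypothetical names)
CREATE TABLE StagingReport
(
    LocationId int,
    RawLine    text
)

-- pull the first pipe-delimited field out of the text column
SELECT LocationId,
       SUBSTRING(RawLine, 1, PATINDEX('%|%', RawLine) - 1) AS FirstField
FROM   StagingReport
WHERE  PATINDEX('%|%', RawLine) > 0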
N 56°04'39.16"
E 12°55'05.25"
July 4, 2007 at 6:56 am
Peter, why are you trying to create a problem where it's already solved?
_____________
Code for TallyGenerator
July 4, 2007 at 7:10 am
Hi,
Rather than compromise the performance of the system, or leave a timebomb waiting to explode for that one particular record where the row length > 8060 bytes, you could:
1. Patiently explain to your client that they have reached a physical limitation of the database platform they have chosen, and they need to moderate their requirements or face problems in the future. You can use the DATALENGTH function to find the current maximum data length in a column:
select top 1 datalength(mycolumn) from mytable
order by datalength(mycolumn) DESC
2. Talk to the end-users to find out if they really really want a 90 column report in 8pt pica
3. I am wondering if the row size limit applies only to the table, and not to the result set, in which case you could creatively join your temporary table with permanent tables to produce the required result set (I have used this approach in some reports, but for a different reason: I have a standard process to produce a base report, and several regional variations of the report pull additional information from the permanent schema). I will experiment and get back to this thread.
David
If it ain't broke, don't fix it...
July 4, 2007 at 8:26 am
Hi David, I will try point 3, it seems that would be a good solution; failing that, you're right, I will have to go to points 1 and 2. Many thanks
July 4, 2007 at 8:38 am
I have just created 3 temp tables, each with a 7,000-byte row size, giving a total row size of 21,000 bytes. I then inner joined them and there was no warning. Great!!!!
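For anyone reading later, a minimal version of that experiment (hypothetical column names and sizes) looks roughly like this: each temp table stays under the 8060-byte limit on its own, and the joined result set can still return all the wide columns side by side.

-- each temp table declares well under 8060 bytes on its own (hypothetical columns)
CREATE TABLE #Part1 (LocationId int PRIMARY KEY, Cat1Cases varchar(7000))
CREATE TABLE #Part2 (LocationId int PRIMARY KEY, Cat2Cases varchar(7000))
CREATE TABLE #Part3 (LocationId int PRIMARY KEY, Cat3Cases varchar(7000))

-- the SELECT returns all three wide columns together without any row-size warning
SELECT p1.LocationId, p1.Cat1Cases, p2.Cat2Cases, p3.Cat3Cases
FROM   #Part1 p1
       INNER JOIN #Part2 p2 ON p2.LocationId = p1.LocationId
       INNER JOIN #Part3 p3 ON p3.LocationId = p1.LocationId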