December 16, 2003 at 8:50 am
Hi,
I'm creating a table in SQL 7. The table has 104 columns, all with a varchar(255) datatype. It's used for a bulk import. I get the following message after creating the table in Query Analyzer:
"The total row size (26312) for table 'TBL_MUNIDIR_2003' exceeds the maximum number of bytes per row (8060). Rows that exceed the maximum number of bytes will not be added."
The table def'n is based on legacy code and has never had problems in the past.
Has anybody ever come across this?
Many thanks.
Jeff
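For reference, a minimal sketch of the kind of definition that triggers this warning (table and column names here are illustrative, not the actual schema):

```sql
-- Illustrative only: 3 columns shown in place of the 104.
-- Each varchar(255) can contribute up to 255 bytes of data plus
-- per-column overhead, so 104 such columns far exceed the
-- 8060-byte maximum row size in SQL Server 7.
CREATE TABLE TBL_IMPORT_DEMO (
    Col001 varchar(255) NULL,
    Col002 varchar(255) NULL,
    Col003 varchar(255) NULL
    -- ... and so on up to Col104
)
```

Note that the CREATE TABLE itself succeeds; the message is a warning that any individual row whose actual data exceeds 8060 bytes will fail to insert.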
December 16, 2003 at 9:12 am
I guess this is expected behaviour.
Take a look at this
http://www.sqlservercentral.com/columnists/sjones/pagesize.asp
Frank
--
Frank Kalis
Microsoft SQL Server MVP
Webmaster: http://www.insidesql.org/blogs
My blog: http://www.insidesql.org/blogs/frankkalis/
December 17, 2003 at 10:25 am
In the real world, the situation you describe is common. You've never had a problem because most of the data in those varchar columns happens to be far smaller than the maximum allowable.
Tables created through DTS for importing data end up with varchar(255) columns by default.
One of the recent worst practices articles (if I remember correctly) mentioned that this is not a good idea, in that it is possible for an insert to fail if the data ever does get up to that point.
If the import table is around for the long haul, it's not a bad idea to get better column definitions established between you and the data provider. That way, it may be easier to figure out what data is causing a problem. Mind you, as long as the imported data remains below the threshold, you're fine.
The primary advantage of improved column definitions is that data problems can be caught up front. If the data provider is willing to say that the "lastname" column will never be longer than 50 characters, the database can enforce that. Tuning the data*types* can be even more important - if you are expecting an integer value, a failure to insert the data would help identify bad data or a corrupt file. It's a lot easier to mess around with fixing the data before it's imported than to come back after a year and realize that dozens of records show "number of family members" as 'Columbus'....
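As a sketch of the tighter definitions described above (the column names and sizes here are hypothetical, standing in for whatever you negotiate with the data provider):

```sql
-- Hypothetical import table with negotiated column definitions.
-- A bad value (e.g. 'Columbus' arriving in FamilyMemberCount)
-- now fails at load time instead of lurking in the data.
CREATE TABLE TBL_MUNIDIR_IMPORT (
    LastName          varchar(50) NULL,  -- provider guarantees <= 50 chars
    FirstName         varchar(50) NULL,
    FamilyMemberCount int         NULL   -- rejects non-numeric data
)
```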
R David Francis
December 17, 2003 at 12:49 pm
Wonder if there is a workaround: change the columns that might hold longer data to the text type, which I understand isn't counted against the 8K row-size limit.
Data: Easy to spill, hard to clean up!
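One way to sketch that workaround (column names are illustrative): in SQL Server 7, text data is stored off-row on separate pages, so only a 16-byte pointer counts toward the 8060-byte in-row limit.

```sql
-- Illustrative: move columns that may hold long values to text.
-- The text data lives on its own pages; only the pointer is
-- counted in the row size, so the 8060-byte warning goes away.
CREATE TABLE TBL_IMPORT_LONG (
    ShortCode varchar(50) NULL,
    LongNotes text        NULL,  -- not limited by the 8K row size
    MoreNotes text        NULL
)
```

The trade-off is that text columns in SQL 7 support far fewer string operations than varchar, which can make the staging data harder to work with.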
December 18, 2003 at 7:38 am
Hi,
Great suggestions ... and I totally agree. I'm using legacy SQL scripts that included this in the create table definition.
What is also interesting is that I noticed if I let the DTS wizard create the destination table for me (vs. populating the one I created with the varchar(255) columns), I don't get this error message AND it creates column definitions of varchar(8000).
Any thoughts on this??
Thanks. Jeff