December 23, 2005 at 9:25 pm
Windows 2000 Server SP4, Dual 2.4 GHz P4 Xeon machine, 3776 MB RAM
SQL Server 2000, all patches and service packs applied
I am trying to import 6 text files (comma-separated data) with roughly 9,000
rows and 900 columns each (the file extension is .txt, not .csv). Each file is
about 10 MB in size.
I have researched my issue thoroughly in Google (web and groups) and
couldn't find the answer to my question.
When I try to import each of these files using DTS (text file as the
source), I get the 'Not enough storage is available to complete this
operation' error. There is plenty of hard drive space, and I dropped a number
of tables to free up space; the database now has about 250 MB free. All
database files are set to grow without limit.
Does anyone know how to solve this problem? What are the settings I can
tweak? Is there any hotfix (I don't think so) available from Microsoft that
addresses this issue?
Despite what the message says, I am positive this is not a disk space issue.
There must be some settings I need to modify.
Please help if you can.
Thanks.
December 24, 2005 at 6:37 am
I regularly use DTS to load text files that are > 150 MB and contain over 150k rows, but I've never come close to 900 columns! Do you think you could be running up against the 8,060-byte row limit? You may need to partition this data into multiple tables. Hope this helps..
Cheers..
December 24, 2005 at 7:26 am
Thanks.
Could you please tell me what limit we are talking about? I will try partitioning the text file into 450 columns instead of 900, but are there any other ways to get this done?
December 24, 2005 at 7:53 am
A table can contain a maximum of 8,060 bytes per row. (SQL Server 2005 doesn't have this restriction, but it performs poorly when a row exceeds 8,060 bytes.) It's related to the 8 KB page size.
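If you want to see the limit in action, try something like this (table and column names are made up):

-- Fixed-width columns totalling more than 8,060 bytes: on SQL Server 2000
-- this CREATE TABLE fails outright with a row-size error (1701).
CREATE TABLE dbo.RowLimitTest (
    c1 char(4000),
    c2 char(4000),
    c3 char(100)  -- 8,100 bytes declared, over the 8,060-byte limit
)

-- With variable-length columns you only get a warning at CREATE time;
-- the failure surfaces later, when an INSERT pushes a row past 8,060 bytes.
CREATE TABLE dbo.RowLimitTest2 (
    c1 varchar(8000),
    c2 varchar(8000)
)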
December 24, 2005 at 7:44 pm
Thanks for all your help.
Each DTS package I have defined creates one table. I then want to join these 6 tables side by side to create one huge table.
I have a VBScript that generates the final output of the DTS. So I am already creating the final table with 900 columns and 9,000 rows. The problem with the VBScript is that it takes a long time and is very error-prone.
So I don't know whether the table row size is really my problem, or whether this error is misleading in that it doesn't actually have anything to do with physical storage.
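For reference, the side-by-side join I have in mind looks roughly like this (t1 through t6 and RowID are placeholders; each DTS load would have to add a shared row-number key for this to work):

-- Sketch only: stitch the six staging tables together on a common row key.
SELECT t1.*, t2.*, t3.*, t4.*, t5.*, t6.*
FROM dbo.t1
JOIN dbo.t2 ON t2.RowID = t1.RowID
JOIN dbo.t3 ON t3.RowID = t1.RowID
JOIN dbo.t4 ON t4.RowID = t1.RowID
JOIN dbo.t5 ON t5.RowID = t1.RowID
JOIN dbo.t6 ON t6.RowID = t1.RowID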
December 28, 2005 at 8:50 am
I agree, it sounds like the row-size limit. Are you explicitly declaring the datatype/size of each column? If not, then you may be defaulting to varchar(8000) or nvarchar(4000). That means the declared size of your rows could be well over the limit.
MAX VARCHAR SIZE OF YOUR RECORD = 900 * 8,000 bytes = 7,200,000 bytes
MAX NVARCHAR SIZE OF YOUR RECORD = 900 * 4,000 characters * 2 bytes = 7,200,000 bytes
Both of these are WAY over the 8,060-byte limit.
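A quick way to check what you actually declared (this queries the SQL 2000 syscolumns system table; swap in your own table name):

-- Sums the maximum physical length of every column in the table.
-- Ignores per-row overhead, so compare loosely against the 8,060-byte limit.
SELECT SUM(length) AS declared_row_bytes
FROM syscolumns
WHERE id = OBJECT_ID('dbo.YourImportTable')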
Odds are your records aren't actually that large; you simply haven't defined the target tables yourself.
Don't count on the wizards to do everything (sorry, it looks like you'll have to do a little work).
If your rows really are greater than 8,060 bytes, you will have to revise your process to either:
1.) Create the target tables before the load, and change your load process to parse each record, placing the data into the appropriate tables.
2.) Load each record into a single TEXT column in a staging table, then parse through that table to create the desired target tables (see the sketch below).
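For option 2, something along these lines might work (all names are hypothetical; note that SUBSTRING and PATINDEX both accept TEXT on SQL 2000, but CHARINDEX does not):

-- Hypothetical staging table: one raw file line per row.
CREATE TABLE dbo.StagingRaw (
    RowID   int IDENTITY(1,1) PRIMARY KEY,
    RawLine text NOT NULL
)

-- After loading whole lines (e.g. BULK INSERT with only a row terminator),
-- peel off fields one at a time; the first field, for example:
SELECT RowID,
       SUBSTRING(RawLine, 1, PATINDEX('%,%', RawLine) - 1) AS Col001
FROM dbo.StagingRaw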