February 4, 2011 at 5:30 am
I'm attempting to use Microsoft XML BulkLoad for SQL Server (SQLXMLBULKLOADLib) to load a six-million-row file into a single table.
The process aborts with "The transaction log for database [] is full."
This is running from a .NET application and I'm using the following parameters:
Dim connectionString As String = "Provider=SQLOLEDB.1;Persist Security Info=False;User ID=" & AppMgr.DBuser & _
";Password=" & AppMgr.DBPassword & _
";Initial Catalog=" & AppMgr.DBName & _
";Data Source=" & AppMgr.DBServer
Dim errorFile As String = AppMgr.clsConfig.FilePath & "error.xml"
xmlBulkLoad.ConnectionString = connectionString
xmlBulkLoad.ErrorLogFile = errorFile
xmlBulkLoad.Transaction = False
xmlBulkLoad.ForceTableLock = False
xmlBulkLoad.CheckConstraints = False
xmlBulkLoad.BulkLoad = True
xmlBulkLoad.Execute(AppMgr.AppDirectory & "E4.xsd", gsDestinationPath & InputConvertedTo)
The process aborts and the following is logged in the error.xml file:
"The transaction log for database 'xxxxxx' is full"
The code tests fine for smaller 100,000-row files, but I get a few of these huge files.
Shouldn't this be a minimally logged operation?
What am I doing wrong? Any help is greatly appreciated.
Sid
PS. I'm using the Microsoft XML BulkLoad for SQL Server 4.0 Type Library,
and the database is in "Simple" recovery mode.
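(For reference, something like the following can be used to confirm the recovery model and see how full the log is. This is only a sketch: the connection string and database name are placeholders, not the actual ones from the application.)

using System;
using System.Data.SqlClient;

class LogCheck
{
    static void Main()
    {
        // Placeholder connection string -- substitute your own server, database, and credentials.
        string connectionString = "Data Source=myServer;Initial Catalog=myDatabase;Integrated Security=True";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Confirm the recovery model of the current database really is SIMPLE.
            using (var cmd = new SqlCommand(
                "SELECT recovery_model_desc FROM sys.databases WHERE name = DB_NAME()", conn))
            {
                Console.WriteLine("Recovery model: " + cmd.ExecuteScalar());
            }

            // Show log size and percent used for every database on the instance.
            using (var cmd = new SqlCommand("DBCC SQLPERF(LOGSPACE)", conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1} MB, {2}% used",
                        reader[0], reader[1], reader[2]);
                }
            }
        }
    }
}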
April 17, 2012 at 8:11 pm
Did you ever find out the reason/solution? I am planning on using SQLXMLBulkLoad to load a large and complex XML file, and obviously this would be a major concern. Thanks.
April 18, 2012 at 6:51 am
Yes, I did resolve this, and it works quite well now.
It's a matter of both the attributes you give the bulk load object and how the tables are set up.
For the C# code I'm using:
xmlBulkLoad = new SQLXMLBULKLOADLib.SQLXMLBulkLoad4();
xmlBulkLoad.ConnectionString = connectionString;
xmlBulkLoad.ErrorLogFile = errorFile;
xmlBulkLoad.Transaction = false;
//xmlBulkLoad.ForceTableLock = true;
xmlBulkLoad.CheckConstraints = false;
xmlBulkLoad.BulkLoad = true;
xmlBulkLoad.ForceTableLock = true;
//xmlBulkLoad.IgnoreDuplicateKeys = true;
As for the tables you are loading into, you'll need to avoid
any CLUSTERED indexes. You can still have indexes, but make them NONCLUSTERED.
I also use these destination tables as a staging area. That means that after the load completes, I quickly move the data out to the real processing tables in our system and delete the rows from the staging tables.
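(To give a rough idea, the staging pattern looks something like the sketch below. The table and column names are made up purely for illustration, not the real ones from our system.)

using System.Data.SqlClient;

class StagingExample
{
    static void MoveStagingData(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Staging table is a heap: no clustered index, only a nonclustered one.
            var createStaging = @"
IF OBJECT_ID('dbo.StagingOrders') IS NULL
BEGIN
    CREATE TABLE dbo.StagingOrders
    (
        OrderId  INT         NOT NULL,
        Customer VARCHAR(50) NOT NULL,
        Amount   MONEY       NOT NULL
    );
    CREATE NONCLUSTERED INDEX IX_StagingOrders_OrderId
        ON dbo.StagingOrders (OrderId);
END";
            using (var cmd = new SqlCommand(createStaging, conn))
                cmd.ExecuteNonQuery();

            // After the XML bulk load finishes, copy the rows into the real
            // processing table and clear out the staging table for the next file.
            var moveOut = @"
INSERT INTO dbo.Orders (OrderId, Customer, Amount)
SELECT OrderId, Customer, Amount FROM dbo.StagingOrders;
DELETE FROM dbo.StagingOrders;";
            using (var cmd = new SqlCommand(moveOut, conn))
                cmd.ExecuteNonQuery();
        }
    }
}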
I hope this helps,
Sid
PS. I forgot to mention that I also changed the connection string:
string connectionString = "Provider=SQLNCLI10;Persist Security Info=False;User ID=" + Global.DBuser
+ ";Password=" + Global.DBPassword
+ ";Initial Catalog=" + Global.DBName
+ ";Data Source=" + Global.DBServer
+ ";DataTypeCompatibility=80";
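(To connect the pieces: that connection string is simply assigned to the bulk load object before calling Execute, which takes the mapping schema (XSD) first and the XML data file second. The file paths below are placeholders, not the actual ones.)

// Sketch only: the paths here are placeholders.
xmlBulkLoad.ConnectionString = connectionString;
// Execute(mapping schema XSD, XML data file)
xmlBulkLoad.Execute(@"C:\load\E4.xsd", @"C:\load\data.xml");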