May 27, 2018 at 6:53 am
Hi,
I am using SQL Server's OPENROWSET to read a file and load the data into a table. The file contains about 150 million records. When I try to load the data, the transaction log bloats because the whole load runs as one single transaction, and it fails with "The transaction log for database 'DB' is full due to 'ACTIVE_TRANSACTION'". The log grows to about 110 GB. The database recovery model is set to Bulk-Logged, and the table I am loading into does not have any indexes.
I am trying to find out whether ROWS_PER_BATCH in OPENROWSET will help if I use it in my query. Will SQL Server actually honor the value supplied in ROWS_PER_BATCH and split the load into transactions of that size, or will it still treat the whole load as one single transaction?
Can anyone help me out in this regard.
Below is the sample query I am using:
BEGIN TRY
    DELETE FROM emp;

    INSERT INTO emp (EmpID, EmpFirstName, EmpLastName, EmpLocation)
    SELECT EmpID, EmpFirstName, EmpLastName, EmpLocation
    FROM OPENROWSET(
            BULK 'D:\Test\Emp.txt',
            FORMATFILE = 'd:\Test\Employee.XML',
            FIRSTROW = 1,
            MAXERRORS = 10,
            ROWS_PER_BATCH = 5
         ) AS a;
END TRY
BEGIN CATCH
    SELECT ERROR_MESSAGE();
END CATCH
GO
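For reference, this is the behaviour I was hoping ROWS_PER_BATCH would give me, written as a plain BULK INSERT with an explicit BATCHSIZE (just a sketch against the same file, format file, and table as above; as I read the documentation, BATCHSIZE commits each batch as its own transaction):

-- Sketch only: same load as above, but with BATCHSIZE so each
-- batch of rows is committed as a separate transaction.
BULK INSERT emp
FROM 'D:\Test\Emp.txt'
WITH (
    FORMATFILE = 'd:\Test\Employee.XML',
    FIRSTROW   = 1,
    MAXERRORS  = 10,
    BATCHSIZE  = 1000000,  -- assumed batch size; pick something the log can absorb
    TABLOCK                -- allows minimal logging into an empty heap under Bulk-Logged
);
GO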
Thanks,
Aravind S
May 28, 2018 at 3:58 am
Use dynamic SQL to build and execute the BULK INSERT, and run it through a WHILE loop so each batch is loaded (and committed) separately, something like this:
DECLARE @insert    varchar(1000),
        @row       int,
        @batchsize int,
        @maxrows   int,
        @path      varchar(300),
        @filename  varchar(100);

SET @path      = 'C:\DB\';
SET @filename  = 'testfile.csv';
SET @row       = 1;    -- start at this row
SET @batchsize = 10;   -- import batch size (rows per BULK INSERT)
SET @maxrows   = 100;  -- total number of rows in the file (adjust to your row count)

-- #IMPORT must already exist with columns matching the file layout
-- (see the staging-table sketch below)
WHILE @row <= @maxrows
BEGIN
    -- build a BULK INSERT that loads only this batch of rows;
    -- each EXEC runs (and commits) as its own transaction
    SET @insert = 'BULK INSERT #IMPORT';
    SET @insert = @insert + ' FROM ' + CHAR(39) + @path + @filename + CHAR(39);
    SET @insert = @insert + ' WITH';
    SET @insert = @insert + ' (FIRSTROW = ' + CAST(@row AS varchar(20));
    SET @insert = @insert + ', LASTROW = ' + CAST(@row + @batchsize - 1 AS varchar(20));
    SET @insert = @insert + ', FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';

    EXEC (@insert);

    SELECT @row;                   -- show progress
    SET @row = @row + @batchsize;  -- move on to the next batch
END
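The #IMPORT staging table has to exist before the loop runs, and once the loop finishes you move the staged rows into the permanent table. A minimal sketch, guessing the column names and types from the first post (adjust them to match your file):

-- Staging table the loop bulk inserts into
-- (columns are assumed; match them to your file / format file)
CREATE TABLE #IMPORT
(
    EmpID        int,
    EmpFirstName varchar(100),
    EmpLastName  varchar(100),
    EmpLocation  varchar(100)
);

-- ... run the WHILE loop above ...

-- Move the staged rows into the permanent table, then clean up
INSERT INTO emp (EmpID, EmpFirstName, EmpLastName, EmpLocation)
SELECT EmpID, EmpFirstName, EmpLastName, EmpLocation
FROM #IMPORT;

DROP TABLE #IMPORT;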
***The first step is always the hardest *******