May 18, 2012 at 10:10 am
Dear All,
I was trying to import data into a new table, but I received an error message saying the log file space is full. I changed the recovery model to Simple, since it is not essential that the log files are backed up, hoping I would have enough space. I would like to know what other alternatives I have to prevent this error from happening again, please.
Thank you in advance!
May 18, 2012 at 10:19 am
Your only options are to manage the log file through backups, the Simple recovery model, or a code rewrite.
Switching to Simple mode will only help if your import method uses individual transactions, allowing the checkpointing process to occur.
If you're doing an INSERT INTO ... SELECT ... FROM, this is a single transaction and the log file will continue to grow until it runs out of space or the command completes. If you are doing lots of INSERT INTO ... VALUES() statements wrapped in a BEGIN TRANSACTION ... COMMIT TRANSACTION, this has the same effect.
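To illustrate the difference (a hypothetical sketch; the table, column, and values are placeholders):

-- One wrapping transaction: none of the log space can be reused
-- until the final COMMIT, so the log keeps growing.
BEGIN TRANSACTION;
INSERT INTO NewTable (column1) VALUES (1);
INSERT INTO NewTable (column1) VALUES (2);
-- ... thousands more rows ...
COMMIT TRANSACTION;

-- Individual autocommit statements: in Simple recovery, checkpoints
-- between them let the log space be truncated and reused.
INSERT INTO NewTable (column1) VALUES (1);
INSERT INTO NewTable (column1) VALUES (2);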
What are you using to import the data and do you have any example code?
May 18, 2012 at 10:35 am
Thank you for your reply!
While the database was set to Full recovery, I ran the following script:
INSERT INTO NewTable
(
column1,
column2
)
SELECT
column1,
column2
FROM
table1 app
INNER JOIN table2 aps ON app.ID = aps.ID
then I received the error message. I then changed the database to Simple recovery and managed to clear some space by deleting unwanted data, and I reran the above script, but the transaction was still taking up more space.
Thank you!
May 18, 2012 at 10:39 am
tt-615680 (5/18/2012)
Thank you for your reply! While the database was set to Full recovery, I ran the script above, then I received the error message. I then changed the database to Simple recovery and managed to clear some space by deleting unwanted data, and I reran the script, but the transaction was still taking up more space.
Assuming you have a lot of rows in at least one of those tables, I would consider breaking the import into smaller, workable chunks. It would likely have to be a WHILE loop driving the import off the ID columns, as in the sketch below.
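Something along these lines, as a rough sketch only: the table and column names are taken from your script, the batch size is arbitrary, which table each column comes from is a guess, and it assumes app.ID is a unique, increasing integer.

DECLARE @BatchSize INT = 50000;
DECLARE @LastID INT = 0;
DECLARE @MaxID INT;

SELECT @MaxID = MAX(app.ID) FROM table1 app;

WHILE @LastID < @MaxID
BEGIN
    INSERT INTO NewTable (column1, column2)
    SELECT app.column1, aps.column2
    FROM table1 app
    INNER JOIN table2 aps ON app.ID = aps.ID
    WHERE app.ID > @LastID
      AND app.ID <= @LastID + @BatchSize;

    SET @LastID = @LastID + @BatchSize;

    -- In Simple recovery, a checkpoint between batches lets the
    -- log space used by the previous batch be reused.
    CHECKPOINT;
END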
How many rows are in each of those tables?
May 18, 2012 at 10:50 am
It's very large, around 2,500,000 rows across all the tables.
May 18, 2012 at 10:57 am
tt-615680 (5/18/2012)
It's very large, around 2,500,000 rows across all the tables.
2 million rows from all tables doesn't seem like too much.
Do you have many VARCHAR(MAX), TEXT, VARBINARY(MAX), etc. columns in your tables?
I would also advise you (like MysteryJimbo) to create a WHILE loop and use @@ROWCOUNT to load the tables X rows at a time, for instance; a sketch follows.
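A minimal sketch of that pattern, assuming NewTable also stores the source ID (hypothetical here) so each batch can pick up where the previous one left off; the names and batch size are placeholders:

DECLARE @Rows INT = 1;
DECLARE @LastID INT = 0;

WHILE @Rows > 0
BEGIN
    INSERT INTO NewTable (ID, column1, column2)
    SELECT TOP (50000) app.ID, app.column1, aps.column2
    FROM table1 app
    INNER JOIN table2 aps ON app.ID = aps.ID
    WHERE app.ID > @LastID
    ORDER BY app.ID;

    -- @@ROWCOUNT drops to 0 once the last batch has been copied.
    SET @Rows = @@ROWCOUNT;

    SELECT @LastID = MAX(ID) FROM NewTable;
END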
Best regards,
Andre Guerreiro Neto
Database Analyst
http://www.softplan.com.br
MCITPx1/MCTSx2/MCSE/MCSA
May 18, 2012 at 11:00 am
Try using SELECT ... INTO if you can. Under the Simple and Bulk-Logged recovery models this is minimally logged: only the page allocations are logged, not the individual rows:
SELECT
column1,
column2
INTO NewTable
FROM
table1 app
INNER JOIN table2 aps ON app.ID = aps.ID
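As a quick check before running it, you can confirm which recovery model the database is actually using, since the minimal-logging behaviour depends on it:

SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = DB_NAME();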
May 18, 2012 at 11:07 am
There are some varchar(max) columns and also some text columns, and one of the tables has over two hundred columns.
Thank you!