March 22, 2010 at 7:59 pm
I am doing a heavy data load on a database; it has been running for 18 hours, and unfortunately the disk was getting full. When I checked, the DB log was taking up most of the space, so I tried shrinking the database. I am not sure how correct this step is; could someone explain what could go wrong?
I don't want to stop the load job, as that would roll back all updates, which would take another 18 hours.
Thanks
March 22, 2010 at 8:03 pm
I am trying to shrink the DB but could not do it, as the INSERTs are blocking. How should I proceed?
March 22, 2010 at 9:46 pm
It isn't clear whether the problem is with your log file or your data file. If it is your log file, how often are you backing up the transaction log?
If your database is in simple recovery, then your only option is to batch the load instead of trying to perform it all in a single transaction.
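A batched load in simple recovery might look something like this sketch (table, key, and column names here are placeholders, not taken from your system):

```sql
-- Sketch: load in small batches so each transaction stays short and,
-- in SIMPLE recovery, checkpoints can free log space between batches.
-- dbo.SourceTable, dbo.TargetTable, KeyCol, Col1, Col2 are hypothetical names.
DECLARE @BatchSize int = 50000;

WHILE 1 = 1
BEGIN
    INSERT INTO dbo.TargetTable (KeyCol, Col1, Col2)
    SELECT TOP (@BatchSize) s.KeyCol, s.Col1, s.Col2
    FROM dbo.SourceTable AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.TargetTable AS t
                      WHERE t.KeyCol = s.KeyCol);  -- skip rows already loaded

    IF @@ROWCOUNT < @BatchSize
        BREAK;  -- last partial batch done

    CHECKPOINT;  -- in SIMPLE recovery, lets inactive log space be reused
END
```

Each loop iteration commits on its own, so the active portion of the log never holds more than one batch.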
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
March 23, 2010 at 12:35 am
Why are you not using BULK INSERT or bcp? Sending so many individual INSERT statements is going to take time. You could also explore the MERGE statement in SQL Server 2008 if it suits your requirement.
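As a rough illustration, a minimally logged flat-file load could look like this (the file path, terminators, and table name are assumptions for the example):

```sql
-- Sketch: BULK INSERT from a delimited file into a placeholder table.
-- BATCHSIZE commits every N rows; TABLOCK enables minimal logging
-- when the recovery model and target table allow it.
BULK INSERT dbo.TargetTable
FROM 'D:\loads\data.dat'          -- hypothetical path
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 50000,
    TABLOCK
);
```

The equivalent from the command line would be bcp with the same batch-size switch (`bcp ... -b 50000`).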
March 23, 2010 at 1:04 am
Tara-1044200 (3/22/2010)
I am trying to shrink the db but could not do it as the INSERT's are blocking, how would i proceed ?
Correct, the insert must finish before you can shrink the database.
18 hours to perform the insert? What is the method being used? How many records are involved with this insert?
What is the recovery model of your database?
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
March 23, 2010 at 4:16 am
Read Gail's article: http://www.sqlservercentral.com/articles/64582/
Muthukkumaran Kaliyamoorthy
https://www.sqlserverblogforum.com/
March 23, 2010 at 4:36 am
Tara-1044200 (3/22/2010)
I am doing a heavy dataload on a database which is running from 18hrs
Can you post the query/script you are using to insert the data?
-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done;-)
March 23, 2010 at 5:09 am
It is in simple mode, and I am using a stored procedure that applies different business criteria to read data from a production server and then inserts it into this database using regular INSERT INTO ... SELECT statements.
March 23, 2010 at 5:23 am
I killed the transaction as the disk was completely full, and it may take some time to roll back. I am not sure if there is a better way to finish it sooner and release the space from the log.
A few questions:
1. Once the session is rolled back completely, is the log released by itself, or do I have to truncate it? How does checkpoint work to clear the log?
2. I don't want to wait so long (18 hrs) for the rollback; is there a way to make the database available during the rollback?
3. What would have happened if the disk had filled completely? Would the server restart by itself?
March 23, 2010 at 5:50 am
1. Once the session is rolled back completely, is the log released by itself, or do I have to truncate it? How does checkpoint work to clear the log?
I don't think it is the log that's the problem, as the DB is in simple mode.
2. I don't want to wait so long (18 hrs) for the rollback; is there a way to make the database available during the rollback?
It should not take 18 hrs to roll back.
3. What would have happened if the disk had filled completely? Would the server restart by itself?
No, I don't think the server will restart itself when the disk is completely full.
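To avoid guessing on points 1 and 2, a few standard commands report rollback progress and log usage directly (replace the SPID with the one from your killed session; these are real SQL Server commands, not placeholders):

```sql
-- Progress of the rollback for the killed session (57 is an example SPID)
KILL 57 WITH STATUSONLY;

-- Percent of each database's log currently in use
DBCC SQLPERF(LOGSPACE);

-- Why the log cannot be truncated right now (SQL Server 2005 and later)
SELECT name, log_reuse_wait_desc
FROM sys.databases;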
-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done;-)
March 23, 2010 at 5:52 am
You can do the insertion in batches, e.g. 10,000 rows per INSERT/execution.
-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done;-)
March 23, 2010 at 6:08 am
The log file is as large as the database itself. I think that even though it is in simple mode, if the transaction is not committed the log will keep growing.
What would happen if I backed up the database at this time? Does that also back up the log?
March 23, 2010 at 6:32 am
If the database is using the SIMPLE recovery model, backing up the database periodically isn't going to help. You need to look at the data going in to the system and see why it is causing the database growth. It could be due to page splits on the table(s) for one.
Again, how much data is being loaded, and is it possible to temporarily drop the indexes during the load? Also, look at batching the data into separate transactions to minimize t-log growth.
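As a sketch of the drop-and-rebuild pattern mentioned above (the index and table names are placeholders):

```sql
-- Drop nonclustered indexes before the load so each inserted row
-- does not also generate index maintenance and extra logging.
DROP INDEX IX_TargetTable_Col1 ON dbo.TargetTable;

-- ... perform the (ideally batched) load here ...

-- Rebuild the index once, after all rows are in place.
CREATE NONCLUSTERED INDEX IX_TargetTable_Col1
    ON dbo.TargetTable (Col1);
```

The trade-off: one rebuild at the end is usually cheaper than maintaining the index row by row, but the table has no nonclustered indexes while the load runs.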
March 23, 2010 at 6:37 am
Full backups do not truncate the transaction log. Only a log backup truncates inactive log records and makes the space in the log available for reuse. Without transaction log backups, the log will continue to grow.
For more on transaction logs, read http://support.microsoft.com/kb/873235
and for backups in the simple recovery model, http://msdn.microsoft.com/en-us/library/ms191164.aspx
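Note that this only applies under the FULL or BULK_LOGGED recovery models; in SIMPLE, log backups are not possible and checkpoints handle truncation. The log backup itself is one statement (database name and path below are placeholders):

```sql
-- Backs up the log and marks inactive records as reusable
-- (FULL or BULK_LOGGED recovery only; fails in SIMPLE).
BACKUP LOG MyDatabase
TO DISK = 'E:\backups\MyDatabase_log.trn';
```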
-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done;-)
March 23, 2010 at 7:07 am
Lynn Pettis (3/23/2010)
If the database is using the SIMPLE recovery model, backing up the database periodically isn't going to help. You need to look at the data going in to the system and see why it is causing the database growth. It could be due to page splits on the table(s) for one.Again, how much data is being loaded, and is it possible to temporarily drop the indexes during the load? Also, look at batching the data into separate transactions to minimize t-log growth.
Is it faster to drop the indexes, load, and then rebuild them, compared to loading with the indexes in place? What would be the best option?