March 4, 2014 at 9:29 pm
Comments posted to this topic are about the item SSIS and Transaction Log Growth
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA
March 4, 2014 at 10:36 pm
Interesting question! Thanks, Craig.
---------------------------------------------------
"Thare are only 10 types of people in the world:
Those who understand binary, and those who don't."
March 4, 2014 at 10:48 pm
Nice question, thanks.
Hope this helps...
Ford Fairlane
Rock and Roll Detective
March 5, 2014 at 12:10 am
Interesting question...
March 5, 2014 at 1:49 am
And if the Maximum Insert Commit Size in the OLE DB destination is already set to a reasonable value, this option might not work - splitting the source stream into a few hundred streams might be unwieldy but necessary.
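For anyone trying the same thing outside SSIS, the closest T-SQL analogue to the Maximum Insert Commit Size is the BATCHSIZE option of BULK INSERT. A minimal sketch, with hypothetical file and table names:

-- Hypothetical staging table and source file.
-- BATCHSIZE makes each 100,000-row batch its own transaction,
-- so the load is never one giant open transaction in the log.
BULK INSERT dbo.StagingOrders
FROM 'C:\loads\orders.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 100000
);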
March 5, 2014 at 2:57 am
Interesting question, but I do not agree entirely with the answer.
Setting the commit size to a lower value will commit batches before the entire load is done. What if something fails and I want to roll everything back?
Splitting the load into smaller chunks, each in a manageable transaction, seems to make more sense.
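Something like this, say - a sketch of chunked loading with explicit per-chunk transactions (the table names and batch size are made up, and it assumes an increasing integer key and an initially empty staging table):

-- Hypothetical source and staging tables; assumes Id is an increasing key.
DECLARE @batch INT = 50000, @last BIGINT = 0, @rows INT = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;

    INSERT INTO dbo.StagingRows (Id, Payload)
    SELECT TOP (@batch) Id, Payload
    FROM dbo.SourceRows
    WHERE Id > @last
    ORDER BY Id;

    SET @rows = @@ROWCOUNT;
    IF @rows > 0
        SELECT @last = MAX(Id) FROM dbo.StagingRows;

    COMMIT TRANSACTION;  -- each chunk commits separately, keeping the log small
END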
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
March 5, 2014 at 3:02 am
Koen,
I agree with what you say; nothing is worse than a partially loaded set of data in this type of scenario. But the question did ask for the simplest option, and changing the commit batch size is very simple.
So it's the simplest but maybe not the most sensible for a given situation.
Rodders...
March 5, 2014 at 5:29 am
Koen Verbeeck (3/5/2014)
Interesting question, but I do not agree entirely with the answer. Setting the commit size to a lower value will commit batches before the entire load is done. What if something fails and I want to roll everything back?
Splitting the load into smaller chunks, each in a manageable transaction, seems to make more sense.
Koen, I don't understand. Setting the commit size to a lower value is having SSIS split the load into more manageable chunks for you, isn't it? If there's a failure, you're not going to be able to roll the whole thing back in either situation (unless, of course, the failure occurs in the first batch).
Stewart "Arturius" Campbell (3/5/2014)
Alternatively, change the recovery model to bulk logged just before the load and back on completion thereof. However, this will require a full backup before and after...
Stewart, I think you only need the full backup if you switch to Simple. Bulk Logged preserves the log backup chain, although you may not be able to restore to a point in time that falls within the period the database was in Bulk Logged mode.
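In T-SQL the switch Stewart describes would look something like this (StagingDB and the backup path are hypothetical):

-- Switch to bulk-logged for the load window; the log backup chain stays intact.
ALTER DATABASE StagingDB SET RECOVERY BULK_LOGGED;

-- ... run the minimally logged bulk load here ...

ALTER DATABASE StagingDB SET RECOVERY FULL;

-- Back up the log afterwards. Note that you cannot restore to a point in time
-- inside a log backup that contains minimally logged operations.
BACKUP LOG StagingDB TO DISK = N'D:\Backups\StagingDB_log.trn';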
John
March 5, 2014 at 5:43 am
John Mitchell-245523 (3/5/2014)
Koen Verbeeck (3/5/2014)
Interesting question, but I do not agree entirely with the answer. Setting the commit size to a lower value will commit batches before the entire load is done. What if something fails and I want to roll everything back?
Splitting the load into smaller chunks, each in a manageable transaction, seems to make more sense.
Koen, I don't understand. Setting the commit size to a lower value is having SSIS split the load into more manageable chunks for you, isn't it? If there's a failure, you're not going to be able to roll the whole thing back in either situation (unless, of course, the failure occurs in the first batch).
You're right. You have more control over the different transactions (which are smaller in size), but to avoid transaction log growth you still have to commit them.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
March 5, 2014 at 7:06 am
Koen Verbeeck (3/5/2014)
Interesting question, but I do not agree entirely with the answer. Setting the commit size to a lower value will commit batches before the entire load is done. What if something fails and I want to roll everything back?
Splitting the load into smaller chunks, each in a manageable transaction, seems to make more sense.
Well, the destination in this case IS a staging table. If you want to roll everything back, you can just truncate it again, no?
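That is, the cleanup after a failed run is a one-liner (hypothetical staging table name):

-- TRUNCATE is minimally logged and near-instant regardless of how many
-- rows made it into the staging table before the failure.
TRUNCATE TABLE dbo.StagingOrders;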
March 5, 2014 at 7:15 am
Nevyn (3/5/2014)
Koen Verbeeck (3/5/2014)
Interesting question, but I do not agree entirely with the answer. Setting the commit size to a lower value will commit batches before the entire load is done. What if something fails and I want to roll everything back?
Splitting the load into smaller chunks, each in a manageable transaction, seems to make more sense.
Well, the destination in this case IS a staging table. If you want to roll everything back, you can just truncate it again, no?
Point taken 😀
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
March 5, 2014 at 8:57 am
Nevyn (3/5/2014)
Koen Verbeeck (3/5/2014)
Interesting question, but I do not agree entirely with the answer. Setting the commit size to a lower value will commit batches before the entire load is done. What if something fails and I want to roll everything back?
Splitting the load into smaller chunks, each in a manageable transaction, seems to make more sense.
Well, the destination in this case IS a staging table. If you want to roll everything back, you can just truncate it again, no?
Utmost simplicity, thanx.
Thanks & Best Regards,
Hany Helmy
SQL Server Database Consultant
March 5, 2014 at 8:58 am
Interesting question and discussion.
Thanks & Best Regards,
Hany Helmy
SQL Server Database Consultant