February 21, 2014 at 7:11 am
I was in the process of loading hundreds of thousands of legacy data records into the system, but the main issue was that tempdb grew to almost 40 GB, which was hogging a lot of disk space. The migration takes about 8 hours, and I had to stop it. I don't want to keep adding additional drives. Is there a way to break the migration down into different parts so we don't run out of space? (We did end up shrinking tempdb to reclaim some space.) Any suggestion would be highly appreciated.
"He who learns for the sake of haughtiness, dies ignorant. He who learns only to talk, rather than to act, dies a hyprocite. He who learns for the mere sake of debating, dies irreligious. He who learns only to accumulate wealth, dies an atheist. And he who learns for the sake of action, dies a mystic."[/i]
February 21, 2014 at 7:28 am
New Born DBA (2/21/2014)
Is there a way to break the migration down into different parts so we don't run out of space?
No idea; I haven't seen it.
If you can break the process down into smaller chunks or batches, there's generally no reason for tempdb to grow.
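As a rough illustration (a sketch only; the table and column names below are placeholders, not anything from your system), a TOP-based loop like this commits in small chunks, so no single transaction has to hold the whole migration at once:

-- Minimal batching sketch: copy rows that are not yet in the target,
-- @BatchSize at a time, until nothing is left. Names are hypothetical.
DECLARE @BatchSize int = 10000;
DECLARE @Rows int = 1;

WHILE @Rows > 0
BEGIN
    INSERT INTO dbo.Target (Id, Col1, Col2)
    SELECT TOP (@BatchSize) s.Id, s.Col1, s.Col2
    FROM dbo.LegacySource AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Target AS t WHERE t.Id = s.Id)
    ORDER BY s.Id;

    SET @Rows = @@ROWCOUNT;  -- loop ends once the last chunk copies 0 rows
END;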
February 21, 2014 at 7:30 am
So is there any other way this problem can be solved?
"He who learns for the sake of haughtiness, dies ignorant. He who learns only to talk, rather than to act, dies a hyprocite. He who learns for the mere sake of debating, dies irreligious. He who learns only to accumulate wealth, dies an atheist. And he who learns for the sake of action, dies a mystic."[/i]
February 21, 2014 at 7:34 am
I think what MysteryJimbo was saying is that we don't know whether your migration can be broken down, because we don't know anything about it; the information you've given us is almost non-existent. If you explain it a bit more, somebody might be able to help :-).
February 21, 2014 at 7:34 am
(Sorry, another double-post I can't explain.)
February 21, 2014 at 7:35 am
No, this is not related to the engine.
Assuming the growth stopped when you stopped the process, it's related to the code you have developed.
February 21, 2014 at 9:48 am
Yes, you can break down the process, and probably should. But telling you how based on nothing other than "the process is big" is pretty hard.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 21, 2014 at 9:50 am
In your migration package, are you batching the record exports/imports to 10,000 rows, or something other than the default (which tries to do the entire table in one go)?
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
February 21, 2014 at 9:52 am
Thank you all for taking some time out to reply to my post.
"He who learns for the sake of haughtiness, dies ignorant. He who learns only to talk, rather than to act, dies a hyprocite. He who learns for the mere sake of debating, dies irreligious. He who learns only to accumulate wealth, dies an atheist. And he who learns for the sake of action, dies a mystic."[/i]
February 21, 2014 at 9:53 am
Default.
"He who learns for the sake of haughtiness, dies ignorant. He who learns only to talk, rather than to act, dies a hyprocite. He who learns for the mere sake of debating, dies irreligious. He who learns only to accumulate wealth, dies an atheist. And he who learns for the sake of action, dies a mystic."[/i]
February 21, 2014 at 9:54 am
Try dropping the batch sizes down.
Also, are you loading the table as the source, or are you loading it via a T-SQL statement?
Sometimes SSIS likes (OK, a lot of the time) to have the source come from a T-SQL statement instead of the table option.
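For example (just a sketch; the table, columns, and range values are placeholders), a source query that selects only the columns you need and slices on a key range keeps each run small:

-- Hypothetical query to use as the OLE DB Source "SQL command"
-- instead of picking the table directly; adjust names and ranges.
SELECT Id, Col1, Col2
FROM dbo.LegacySource
WHERE Id >= 1 AND Id < 100001;  -- run the package once per key range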
February 21, 2014 at 10:26 am
SQLRNNR (2/21/2014)
Try dropping the batch sizes down. Also, are you loading the table as the source, or are you loading it via a T-SQL statement?
Sometimes SSIS likes (OK, a lot of the time) to have the source come from a T-SQL statement instead of the table option.
First, I have to figure out how to drop the batch size down. I am using the Import/Export Wizard, not a T-SQL statement.
"He who learns for the sake of haughtiness, dies ignorant. He who learns only to talk, rather than to act, dies a hyprocite. He who learns for the mere sake of debating, dies irreligious. He who learns only to accumulate wealth, dies an atheist. And he who learns for the sake of action, dies a mystic."[/i]
February 21, 2014 at 10:39 am
New Born DBA (2/21/2014)
SQLRNNR (2/21/2014)
Try dropping the batch sizes down. Also, are you loading the table as the source, or are you loading it via a T-SQL statement?
Sometimes SSIS likes (OK, a lot of the time) to have the source come from a T-SQL statement instead of the table option.
First, I have to figure out how to drop the batch size down. I am using the Import/Export Wizard, not a T-SQL statement.
OK, save the package that gets created by the Import/Export Wizard. That will save it as an SSIS package that you can edit and customize.
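Once it's open in BIDS/SSDT, and assuming the wizard created an OLE DB Destination using fast load, the settings to look at are in the destination editor (the values below are only an example, not a recommendation):

-- OLE DB Destination, "Table or view - fast load" settings
Rows per batch:             10000
Maximum insert commit size: 10000

Keeping the commit size small makes SSIS commit in chunks rather than doing the whole load as one huge transaction.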
February 21, 2014 at 11:15 am
Now I got it. Thanks.
"He who learns for the sake of haughtiness, dies ignorant. He who learns only to talk, rather than to act, dies a hyprocite. He who learns for the mere sake of debating, dies irreligious. He who learns only to accumulate wealth, dies an atheist. And he who learns for the sake of action, dies a mystic."[/i]