November 19, 2015 at 9:35 am
I have a job that pulls data from Teradata to SQL Server 2012 through a linked server. It has been running for a long time and suddenly failed with an error. What could be the reason? Any help would be appreciated.
November 19, 2015 at 9:39 am
This is a very vague question. Please copy and paste the exact error message you are getting, in a readable format.
November 19, 2015 at 9:43 am
Error is: Could not allocate space for object 'XYZ: temporary run storage' in database tempdb because the PRIMARY filegroup is full. Create disk space by deleting unneeded files. SQLState 42000 (Error 1105)
November 19, 2015 at 11:16 am
The error is fairly straightforward: tempdb ran out of space.
Typically this is because tempdb auto-expanded until it literally filled up the disk it is on.
That assumes you have one or more files for tempdb, all on the same drive.
It could also be that tempdb was limited to a specific maximum size, which would return the same error.
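A quick way to tell which case you're in is to look at the tempdb file definitions. This is a minimal sketch against the standard catalog view; it only assumes you can query tempdb:

```sql
-- Check current size and growth limits of the tempdb files.
-- size and max_size are in 8 KB pages; max_size = -1 means unlimited.
SELECT name,
       physical_name,
       size * 8 / 1024 AS size_mb,
       CASE max_size
            WHEN -1 THEN 'unlimited'
            ELSE CAST(max_size * 8 / 1024 AS varchar(20)) + ' MB'
       END AS max_size,
       growth
FROM tempdb.sys.database_files;
```

If max_size shows a cap rather than 'unlimited', the 1105 error can fire well before the disk itself is full.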
You will want to either get more disk space, or change your ETL to grab things in more bite-sized chunks.
Linked server queries using four-part names, like SELECT * FROM Teradata.Database.schema.table, should be changed into OPENQUERY commands; maybe change them to grab smaller slices into a temp table, by date or some other logical break.
Four-part calls like I mentioned, when joined to local data, do something you don't expect: the ENTIRE remote table(s) get copied from the linked server into local tempdb, THEN the joins are performed behind the scenes, and then the results are returned. OPENQUERY can help alleviate the issue to a degree.
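To make the difference concrete, here is a hedged sketch of the two styles. The linked server name (TERADATA_LINK), tables, and columns are all illustrative, not from the original post:

```sql
-- Four-part name: the remote table can be pulled into local tempdb in
-- full before the join is evaluated (all names below are illustrative):
SELECT t.*
FROM TERADATA_LINK.MyDb.MySchema.BigTable AS t
JOIN dbo.LocalTable AS l
    ON l.id = t.id;

-- OPENQUERY: the query string is executed on Teradata itself, so only
-- the filtered slice crosses the wire; landing it in a temp table keeps
-- each pull bite-sized (here, one month at a time):
SELECT *
INTO #slice
FROM OPENQUERY(TERADATA_LINK,
    'SELECT id, amount, load_date
     FROM MyDb.MySchema.BigTable
     WHERE load_date BETWEEN DATE ''2015-10-01'' AND DATE ''2015-10-31''');
```

The key design point is that everything inside the OPENQUERY string is remote-side syntax (Teradata SQL here), so the filtering happens before the data ever reaches SQL Server.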
Lowell
November 19, 2015 at 11:55 am
Makes sense... To check where it is using more space, I commented out a few parts of the original job and executed it on my local desktop. It ran fine up to the step that brings the data into SQL Server (around 1 million rows), and that part was very quick. Then I ran the next step, which inserts data from the staging table (where the Teradata data was kept) into the main table by joining with other tables and using a GROUP BY clause. That step produces around 25 million rows, and it is the one actually taking more time.
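That 25-million-row join/aggregate step is exactly the kind of work that spills sorts and hash tables into tempdb, so it is a natural candidate for batching. A minimal sketch of a batched version, assuming a date column exists to slice on (every table and column name here is hypothetical):

```sql
-- Illustrative batched insert: move the staging data into the main
-- table one month at a time instead of in a single 25-million-row
-- statement, so each batch's sort/aggregate work needs less tempdb.
DECLARE @start date = '2015-01-01',
        @stop  date = '2015-12-01';

WHILE @start < @stop
BEGIN
    INSERT INTO dbo.MainTable (load_date, customer_id, total_amount)
    SELECT s.load_date, s.customer_id, SUM(s.amount)
    FROM dbo.StagingTable AS s
    JOIN dbo.Customers AS c
        ON c.customer_id = s.customer_id
    WHERE s.load_date >= @start
      AND s.load_date <  DATEADD(MONTH, 1, @start)
    GROUP BY s.load_date, s.customer_id;

    SET @start = DATEADD(MONTH, 1, @start);
END;
```

Any column with a reasonably even distribution (a date, an ID range, a region code) works as the slicing key; the point is just that each batch's intermediate results fit comfortably in tempdb.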