June 12, 2012 at 8:51 am
More of a comment than a query but here goes:
I've got an SSIS package which is processing a very large number of very small files.
The files are typically 1-20kb in size and are being cleaned by SSIS in a data flow with a couple of derived columns and then saved into another folder.
There is nothing in the package which is hitting SQL Server and no logging has been implemented.
This is literally the bare-bones package needed to complete the task of moving files from A to B with a tiny bit of cleansing in between.
Despite this, the memory usage when running in the BIDS environment is huge: 1.5GB and still rising when I killed it, and the process was actually grinding to a halt by itself anyway.
The same process, run using DTExec, has got to the same point with less than 100MB used.
What can be done to reduce memory usage when running in BIDS?
And similarly, is there anything that can be done to improve the memory usage when running in DTExec? I'm not too impressed with the ever-rising memory usage when such small files are being processed.
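For illustration only, here is a rough Python sketch of the per-file work described above: read a small text file, add a couple of derived columns, and write it to another folder. The folder paths, column names and cleansing rule are assumptions, not taken from the actual package.

```python
# Hypothetical stand-in for the SSIS data flow: one small CSV in, two trimmed
# source columns combined into a derived column, result written to another folder.
import csv
import os

SRC, DST = r"C:\inbox", r"C:\outbox"   # assumed source and destination folders

for name in os.listdir(SRC):
    with open(os.path.join(SRC, name), newline="") as fin, \
         open(os.path.join(DST, name), "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames + ["FullName"])
        writer.writeheader()
        for row in reader:
            # "Derived column": trim and concatenate two assumed source columns
            row["FullName"] = row["FirstName"].strip() + " " + row["LastName"].strip()
            writer.writerow(row)
```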
June 13, 2012 at 1:18 am
Just reading files, derived columns and saving them to another file? So no blocking transformations in your package?
It seems odd that BIDS needs 1.5GB for that. On the other hand, I'm never surprised when it comes to BIDS; it can behave strangely from time to time.
Other question: why would you want to improve the memory used by DTEXEC? Be glad it's only 100MB 🙂
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
June 13, 2012 at 1:56 am
Koen Verbeeck (6/13/2012)
Just reading files, derived columns and saving them to another file? So no blocking transformations in your package? It seems odd that BIDS needs 1.5GB for that. On the other hand, I'm never surprised when it comes to BIDS; it can behave strangely from time to time.
No blocking transforms, lookups or anything. Just a foreach loop pointed at a folder and a few thousand files to clean and then dump elsewhere.
Koen Verbeeck (6/13/2012)
Other question: why would you want to improve the memory used by DTEXEC? Be glad it's only 100MB 🙂
Because 100MB to sequentially process files no bigger than 30KB still seems rather excessive!
I can watch the memory usage slowly increase as it processes more and more files, from a reasonable ~20MB to over 100MB.
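To put numbers on that growth, one option is to sample the memory of the executing process every few seconds. A minimal sketch, assuming the package is hosted in DtsDebugHost.exe when run from BIDS (or dtexec.exe from the command line):

```python
# Sample the memory usage of the SSIS host process every few seconds using the
# built-in Windows "tasklist" command. The process name is an assumption:
# BIDS debug runs typically host the package in DtsDebugHost.exe.
import subprocess
import time

PROC = "DtsDebugHost.exe"   # use "dtexec.exe" for command-line runs

while True:
    out = subprocess.run(
        ["tasklist", "/FI", f"IMAGENAME eq {PROC}", "/FO", "CSV", "/NH"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)   # the last CSV column is the memory usage
    time.sleep(5)
```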
June 13, 2012 at 2:23 am
Samuel Vella (6/13/2012)
Koen Verbeeck (6/13/2012)
Just reading files, derived columns and saving them to another file? So no blocking transformations in your package? It seems odd that BIDS needs 1.5GB for that. On the other hand, I'm never surprised when it comes to BIDS; it can behave strangely from time to time.
No blocking transforms, lookups or anything. Just a foreach loop pointed at a folder and a few thousand files to clean and then dump elsewhere.
Koen Verbeeck (6/13/2012)
Other question: why would you want to improve the memory used by DTEXEC? Be glad it's only 100MB 🙂
Because 100MB to sequentially process files no bigger than 30KB still seems rather excessive!
I can watch the memory usage slowly increase as it processes more and more files, from a reasonable ~20MB to over 100MB.
Ah ok, so you didn't mean improve but reduce 😉
I've read somewhere before that the For Each Loop can be buggy and doesn't release memory as it should; memory is added with each iteration of the loop, so the loop behaves like a memory leak.
(it looks a lot like this connect bug: http://connect.microsoft.com/SQLServer/feedback/details/219782/ssis-loop-does-not-release-memory)
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
June 13, 2012 at 2:45 am
Koen Verbeeck (6/13/2012)
Ah ok, so you didn't mean improve but reduce 😉
Sorry for the confusion <insert sarcastic comment about improving memory usage so it uses more /> 😀
Koen Verbeeck (6/13/2012)
I've read somewhere before that the For Each Loop can be buggy and doesn't release memory as it should; memory is added with each iteration of the loop, so the loop behaves like a memory leak. (it looks a lot like this connect bug: http://connect.microsoft.com/SQLServer/feedback/details/219782/ssis-loop-does-not-release-memory)
It does look a lot like that; I'll see if it's any better with a child package in there instead (there's a rough out-of-process sketch after this post). Not too bothered about this instance as it was a one-off job, but it might help future tasks.
I'll let you know how it goes.
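Along the same lines as the child-package idea, forcing each iteration out of process keeps memory from accumulating across files. A minimal sketch that calls dtexec once per file; the package path, variable name and folder are assumptions:

```python
# Hypothetical driver: run the cleansing package once per file with dtexec so
# every file is processed in a fresh process and memory is released between runs.
import os
import subprocess

PACKAGE = r"C:\packages\CleanFile.dtsx"   # assumed package path
SRC = r"C:\inbox"                         # assumed input folder

for name in os.listdir(SRC):
    full_path = os.path.join(SRC, name)
    subprocess.run(
        [
            "dtexec", "/F", PACKAGE,
            # /SET takes a single "propertyPath;value" argument
            "/SET", rf"\Package.Variables[User::FileName].Properties[Value];{full_path}",
        ],
        check=True,
    )
```

The rough equivalent inside SSIS itself would be an Execute Package task in the loop with ExecuteOutOfProcess set to True, at the cost of process start-up overhead per file.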