May 11, 2010 at 11:57 am
I am using SSIS 2005 to import a text file, do some lookups and then send the output to subfolders on the destination hard drive based on certain criteria. Is SSIS 2005 capable of doing this? If so, how?
May 11, 2010 at 12:24 pm
If I understand correctly, you want to process a bunch of files and then move them to different folders based on some criteria. Is that correct?
May 11, 2010 at 12:50 pm
Exactly! For example, let's say there is a "Group" column in the data set. Based on that value, the output file should go to one of:
\Main Directory
    \Group 1
    \Group 2
    \Group 3
May 11, 2010 at 1:12 pm
Might be easiest to output to a single folder and then run a simple piece of script to do the move.
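As a purely illustrative sketch of that "single folder plus a move script" idea (a Script Task or a small scheduled script could do the same job): the folder names, the comma delimiter, and the assumption that each output file has a header row with a "Group" column are all hypothetical, not taken from this thread.

import csv
import shutil
from pathlib import Path

OUTPUT_DIR = Path(r"C:\MainDirectory\_staging")   # hypothetical single output folder
DEST_ROOT = Path(r"C:\MainDirectory")             # hypothetical root of the group subfolders
GROUP_COL = "Group"                               # assumed column name in the output files

for file_path in OUTPUT_DIR.glob("*.txt"):
    # Peek at the first data row to find out which group the file belongs to.
    with file_path.open(newline="") as f:
        first_row = next(csv.DictReader(f), None)
    if first_row is None:
        continue                                  # skip empty files

    group = first_row[GROUP_COL].strip()          # e.g. "Group 1", "Group 2", "Group 3"
    dest_folder = DEST_ROOT / group
    dest_folder.mkdir(parents=True, exist_ok=True)
    shutil.move(str(file_path), str(dest_folder / file_path.name))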
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
May 11, 2010 at 2:22 pm
imani_technology (5/11/2010)
Exactly! For example, let's say there is a "Group" column in the data set. Based on that value, the output file should go to one of:
\Main Directory
    \Group 1
    \Group 2
    \Group 3
My approach would be to place all of the tasks in a Foreach Loop: a Data Flow task that processes the file, then an Execute SQL task that gets the Group column value, and then four File System tasks reached via precedence constraints to move the file. Alternatively, you could use a script component and write an IF/ELSE condition to move your files instead of using File System tasks with precedence constraints.
May 11, 2010 at 9:05 pm
Hi there
Yes, you can do this. Preferably, get all of the output into one folder and use a Foreach Loop container for the task. I have a sample SSIS package that I used when I was working on this project; it runs against a dummy table called Products that I created on my local machine, and it is attached here in case it helps. It exports all data to Excel, but the logic is the same for a flat file; just change the Excel destination to a Flat File destination.
Moving the files can be done with the File System task. Refer to the article by Rafael Salas below; it helped me a lot. Let me know if I can help further.
http://rafael-salas.blogspot.com/2007/03/ssis-file-system-task-move-and-rename.html
Cheers
vani
May 13, 2010 at 8:22 am
You can also do a Conditional Split based on that column, which can then send the data down the pipeline to the appropriate destination file output (located in the directory you want). That would remove the need to script any post-processing file moves.
If the processing is the same for all files (the destination is the only difference), then use a Derived Column earlier in the data flow to flag the file's ultimate destination based on that column, and place the Conditional Split (based on that derived value/flag) at the end of the data flow, outputting to the appropriate file destination.
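To make that design concrete, here is a rough Python analogue of the Conditional Split approach, with every destination known up front; the paths, the "Group" column name, and the three-group assumption are illustrative, not from the thread.

import csv
from pathlib import Path

# Hypothetical fixed destinations -- the analogue of one flat-file destination
# per Conditional Split output, all wired up at design time.
DESTINATIONS = {
    "Group 1": Path(r"C:\MainDirectory\Group 1\output.txt"),
    "Group 2": Path(r"C:\MainDirectory\Group 2\output.txt"),
    "Group 3": Path(r"C:\MainDirectory\Group 3\output.txt"),
}

with open("source.txt", newline="") as src:
    reader = csv.DictReader(src)                  # assumed header row with a "Group" column
    fieldnames = reader.fieldnames or []
    rows = list(reader)

writers, files = {}, {}
try:
    for group, dest in DESTINATIONS.items():
        dest.parent.mkdir(parents=True, exist_ok=True)
        files[group] = dest.open("w", newline="")
        writers[group] = csv.DictWriter(files[group], fieldnames=fieldnames)
        writers[group].writeheader()

    for row in rows:
        group = row["Group"].strip()              # the derived value/flag
        if group in writers:                      # rows routed like a Conditional Split
            writers[group].writerow(row)
finally:
    for f in files.values():
        f.close()

Note that adding a fourth group means adding a fourth entry (and destination) by hand, which is the maintenance point raised in the next reply.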
May 13, 2010 at 8:37 am
dg227 (5/13/2010)
You can also do a Conditional Split based on that column, which can then send the data down the pipeline to the appropriate destination file output (located in the directory you want). That would remove the need to script any post-processing file moves.
If the processing is the same for all files (the destination is the only difference), then use a Derived Column earlier in the data flow to flag the file's ultimate destination based on that column, and place the Conditional Split (based on that derived value/flag) at the end of the data flow, outputting to the appropriate file destination.
That would require a design-time connection to every possible destination - and therefore a design change whenever a new destination is required. Plus all of the paths would have to be created at design time, leading to a solution which would be difficult to maintain, IMO.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
May 13, 2010 at 9:08 am
In a situation with a large number of destination directories, or an environment where the destinations are always changing, I agree about the number of connections and the maintenance considerations; it depends on the situation.
For that case (using what appears to be a main directory/sub-directory structure, based on the original posts), the Group column in the data file can be read into an object variable and processed in a Foreach Loop: for each group value, process the matching file data and write it out through a non-validated, expression-based file connection whose path expression uses a variable that a small script inside the loop sets to the current group value. That handles a variable number of unique groups/directories in that column with a single, dynamic connection.
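Sketched in Python rather than in package terms (and with illustrative paths and column names of my own), the same logic looks roughly like this: the distinct Group values stand in for the object variable, the loop stands in for the Foreach Loop container, and the path built per iteration stands in for the expression on the non-validated flat-file connection.

import csv
from pathlib import Path

SOURCE = Path("source.txt")                       # hypothetical input file
BASE_DIR = Path(r"C:\MainDirectory")              # hypothetical root of the group folders

with SOURCE.open(newline="") as src:
    reader = csv.DictReader(src)                  # assumed header row with a "Group" column
    fieldnames = reader.fieldnames or []
    rows = list(reader)

# "Object variable": the distinct group values actually present in the file.
groups = sorted({row["Group"].strip() for row in rows})

# "Foreach Loop": one pass per group, with the destination path built from the
# current group value -- the equivalent of the expression on the connection manager.
for group in groups:
    dest = BASE_DIR / group / f"{SOURCE.stem}_{group}.txt"
    dest.parent.mkdir(parents=True, exist_ok=True)

    with dest.open("w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(r for r in rows if r["Group"].strip() == group)

A new group in the data simply produces a new folder and file; nothing in the package (or in this sketch) has to change.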
May 13, 2010 at 9:12 am
That's a much better idea.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin