December 11, 2007 at 8:22 am
Sarah,
You don't really need to go the ActiveX route, although that's one way to do it. Some suggestions:
- Rather than importing each file into a separate table, have you considered importing into a single table, adding a [Source File] column to separate the data? This'll give you a common, easy-to-use target. (You could overlay a view on this table if you really need the separate constructs.)
- Create a simple SSIS package that imports a sample file into the table. Add a step to set [Source File] to your input file, using a package variable to hold the file name. This package can easily reference the first 4 columns, disregarding the rest of the information.
- Create another "looping" package that loops through the directory, picking up each file, setting the package variable, and then calling your first package.
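The logic those two packages implement can be sketched outside SSIS. Here's a minimal illustration in Python (not the actual package): the table name `Staging`, the column names, and the `*.csv` pattern are all assumptions for the example.

```python
import csv
import sqlite3
from pathlib import Path

def load_files(data_dir, conn):
    """Loop over every file in the directory (the 'looping' package)
    and load each one into a single shared table (the import package)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS Staging "
        "(Col1 TEXT, Col2 TEXT, Col3 TEXT, Col4 TEXT, SourceFile TEXT)"
    )
    for path in sorted(Path(data_dir).glob("*.csv")):
        with open(path, newline="") as f:
            for row in csv.reader(f):
                # Take only the first 4 columns, disregard the rest,
                # and tag each row with its source file.
                conn.execute(
                    "INSERT INTO Staging VALUES (?, ?, ?, ?, ?)",
                    (*row[:4], path.name),
                )
    conn.commit()
```

Because every row carries its [Source File] tag, you can still query per-file slices (or wrap them in views) even though everything lands in one table.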
December 12, 2007 at 6:12 pm
Hey Cliff,
I found one of my issues in the Excel files themselves: the sheets should all have been named the same. It was not looking for the previous file; it was looking for the previous tab, which happened to have the same name as the file. A quick tweak to the VBA code I used to produce the files, and life is good on that front. I did not have my head in the right place until last night to spot it.
The issue I have now is the FileSystemObject line; it simply hangs. Do you know if you need to add a reference to DTS somehow to use the FileSystemObject? That is all the original author's code, so I am thinking it is something in my setup? Any thoughts from anyone would be great. Is there any way to trap it so that I do not have to start over each time it hangs?
December 12, 2007 at 6:26 pm
Hey Cliff,
Found my last issue: I did not have the trailing "\" for my global-variable archive path. Man, I hate getting hung up on dumb stuff!! When do I get to stop being new :hehe:
Thanks for taking the time to look at my stuff!! I will play with the code you provided with your last answer.
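For anyone else bitten by that missing-trailing-backslash bug: building the path with a join function instead of plain string concatenation sidesteps it entirely. A small illustration in Python (not the thread's ActiveX/VBScript; the archive path and file name are made up):

```python
import ntpath  # Windows-style path handling, usable on any platform

archive_path = r"c:\archive"       # note: no trailing backslash
file_name = "sales_2007.xls"

# Naive concatenation silently produces a bad path:
broken = archive_path + file_name  # c:\archivesales_2007.xls

# A join function inserts the separator only when it's missing:
fixed = ntpath.join(archive_path, file_name)  # c:\archive\sales_2007.xls
```

The same idea applies in VBScript via `FileSystemObject.BuildPath`, which likewise adds the separator for you.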
October 28, 2008 at 1:44 pm
Thanks!! This article helped me tremendously!!
October 30, 2008 at 10:34 am
I, too, have a similar question about importing multiple text files into one table. All of the text files will exist in the same folder.
We are currently importing one text file, and this will soon change to importing up to 250 text files. My process currently truncates the table, imports the text file into the table, does some reformatting, and outputs.
I need the same simple process to flow, but it needs to include all 250 files rather than just the one. I'm getting more familiar with ActiveX, but haven't worked with it a lot. Can anyone offer some advice? I have attached my current DTS package.
October 30, 2008 at 10:54 am
At which point are you truncating the table?
You really just need to implement a looping mechanism:
get file
load file
get next file
repeat until no files remain
continue processing
Correct?
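The steps above can be sketched as a plain loop. A minimal Python illustration (the folder and the `truncate_table`/`load_file` callables are stand-ins for the actual DTS tasks):

```python
import os

def run_import(folder, truncate_table, load_file):
    """Truncate once up front, then: get file, load file, get next file,
    repeat until no files remain -- mirroring the step list above."""
    truncate_table()                    # truncate happens once, before the loop
    files = sorted(os.listdir(folder))  # get file(s)
    while files:
        load_file(os.path.join(folder, files.pop(0)))  # load file
        # popping advances to the next file; loop repeats until none remain
    # continue processing here
```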
October 30, 2008 at 1:19 pm
Yes, I believe that is what I want to do, but not sure how. I'm truncating the table in the first step. I want the table empty before I import any files.
October 31, 2008 at 1:30 pm
OK, so I think what you want to do is this: after your truncate-table step, add an ActiveX script. After the table load, add another ActiveX script that determines whether to loop back to the table load or move on.
Here is a great article on how to set this up:
May 21, 2011 at 7:05 am
Hi. I know it's a very old post, but I'm just trying my luck: how did you fix the "File Already Exists" error? I am trying to use DTS to import multiple files from a folder. The process starts but gets stuck on the first file and keeps loading it. I see that it is giving the "File Already Exists" error on the loop-around part.
Any help is appreciated.
Thanks
May 23, 2011 at 7:07 am
Sorry, but I'm not going to be able to help. We chose a different option than what was discussed, and the error was no longer an issue at that point. Good luck!
May 24, 2011 at 6:51 am
I saw the recent post and thought I'd add a suggestion. It's probably way too late for the original poster, but hopefully it will help someone else.
First, you'll want to start with two directories. Let's assume they are c:\myProject\Data and c:\myProject\Processed.
Next, set up an SSIS package with the following workflow:
Set up a user variable of type String, say FileName.
SQL Task: Truncate target table
Foreach loop: Foreach File Enumerator, using c:\myProject\Data and with FileName in the variable mappings (index 0).
Within the Foreach Loop, setup tasks to:
Process your file
Move the file to c:\myProject\Processed.
Now, you could simply delete the file instead, but if something goes wrong you may be hard-pressed to rerun your job (unless you have other backups). It's easier to simply clear out the processed directory on a periodic basis.
There are lots of variations on this, of course: back up files first, run edit checks after importing everything, etc. But this gives you the basic structure.
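Here's a sketch of that workflow outside SSIS, in Python for illustration: the `truncate` and `process_file` callables stand in for the SQL Task and the data-flow task, and the loop-plus-move stands in for the Foreach File Enumerator and a File System Task. Directory names follow the example above.

```python
import shutil
from pathlib import Path

def run_package(data_dir, processed_dir, truncate, process_file):
    """Mirror of the SSIS flow: truncate once, then for each file in
    the data directory, process it and move it to the processed dir."""
    processed_dir = Path(processed_dir)
    processed_dir.mkdir(parents=True, exist_ok=True)
    truncate()                                     # SQL Task: truncate target table
    for path in sorted(Path(data_dir).glob("*")):  # Foreach File Enumerator
        process_file(path)                         # process your file
        # Moving (rather than deleting) keeps a copy so the job can be
        # rerun if something goes wrong downstream.
        shutil.move(str(path), str(processed_dir / path.name))
```

Note the truncate sits outside the loop, so the table is emptied exactly once per run, not once per file.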