October 18, 2007 at 4:44 pm
Hey,
I have a local folder with csv files.
I want to pick up those files and import them into my table.
There will be multiple files in this folder, so I need to loop through them and import each file using DTS.
My understanding is that this should be possible by setting up a global variable for the source (which is my text file) in the DTS package.
I am having some difficulties and can't get the global variable approach to work.
Could someone lend a hand...
~J
October 18, 2007 at 6:11 pm
I'm assuming you've already figured out that the files need to have the same internal structure (same delimiters, column names, etc.). I'm also going to assume that you've created a Text File (Source) connection which points to, and has the definition of, one of those files.
Once you have that:
1. Create the global variable. You can get to global variables from the Package menu's Properties option (they have a tab of their own). For illustration purposes, let's be creative and call it VarFileName.
2. Add a Dynamic Properties task to your DTS package. In the properties of that task, open up the Text File (Source) connection and find the property holding the file name and path (pretty sure that's the file name, but I'm not in front of a DTS screen right now). Set that property to take its value from the VarFileName global variable.
3. Using the workflow tools, make sure the Dynamic Properties task fires before the step that imports the text file.
Now - depending on how you intend to fire it, you could:
- run DTSRUNUI, pick the package, then fill in the filename under the global variables
- or look into using DTSRUN from a command-prompt batch file. Something like:
DTSRUN /SServerName /E /NPackageName /A VarFileName:8="myfilepath"
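To handle the multiple-file part of the question, the batch file can loop over the folder and call DTSRUN once per file. Here's a rough, untested sketch: C:\ImportFiles, ServerName, PackageName, and VarFileName are placeholders for your own folder, server, package, and global variable names, and 8 is the DTSRUN type id for a string value.

@echo off
rem Run the DTS package once for each CSV file in the folder,
rem passing the file's full path in through the VarFileName global variable.
for %%F in (C:\ImportFiles\*.csv) do (
    DTSRUN /SServerName /E /NPackageName /A VarFileName:8="%%F"
)

(If you paste the FOR loop straight into a command prompt instead of saving it as a .bat file, use a single % on the loop variable, i.e. %F instead of %%F.)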
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part... unless you're my manager... or a director and above... or a really loud-spoken end-user... All right, what was my emergency again?