October 2, 2007 at 11:11 am
I've inherited a DTS package that (among other things) imports an Excel spreadsheet every day into a SQL table (from there the data's scrubbed and goes merrily on its way). When it was originally set up, it was a static Excel file (same name every day).
Now the file's being delivered to its directory by another process and it comes with a new name every day, and the package blows up on the Transform Data Task.
I've tried adding an ActiveX Script Task that gets the file's name every day, followed by a Dynamic Properties Task that sets the Excel connection's Data Source to a global variable (set in the ActiveX task). All of the settings look good, but the Transform Data Task still blows up.
When it runs I get a message that says "filename$.xls is not a valid name. Make sure it does not include invalid characters or punctuation or that it is not too long."
The Transform Data Task is set to run from a SQL query and has the query hard-coded in it, e.g.:
SELECT NAME, MONTH, A1, A2, A3, FST, LST
FROM [xFilename$]
I've done my due diligence on the web and hacked at it, but I can't find the answer. Any suggestions would be appreciated.
Thanks,
JAB
October 3, 2007 at 8:14 am
Well, one way to do it: create a data source using an Excel spreadsheet and select the Excel file with the full path.
As the first step of the DTS package, I would use an ActiveX Script task to rename the new file to the static Excel file name.
As the last step of the DTS package, I would use an ActiveX Script task to rename the Excel file to the file name + YYYYMMDD.
So each time you import into the table using the same file name.
My 2 cents. I always want to do it the easiest way.
You can use a Dynamic Properties task and a global variable to change the Excel file name. But when you call the DTS package, you have to pass the whole path name + the file name in the /A "....xls" parameter.
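As a rough illustration of passing that path in via /A, here is a sketch of assembling the dtsrun arguments in Python. The server, package name "ImportDaily", and global variable name "gv_FileName" are assumptions for illustration; 8 is the DTS type id for a string value:

```python
# Hypothetical sketch: build a dtsrun argument list that feeds the day's
# spreadsheet path into the package's global variable via the /A switch.
# "ImportDaily" and "gv_FileName" are assumed names, not from the thread.
def build_dtsrun_args(server, package, xls_path):
    return [
        "dtsrun",
        "/S", server,                        # SQL Server instance
        "/E",                                # trusted (Windows) connection
        "/N", package,                       # package name
        "/A", "gv_FileName:8=" + xls_path,   # whole path + file name
    ]
```

The caller (whatever process delivers the file) would then have to know the day's file name and build this command line each run, which is why the rename approach above is simpler.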