August 3, 2005 at 11:28 am
I'm working on a script that will import data from a flat file. The file will be placed manually into a network directory each day, and a scheduled job will trigger a DTS package to import the data from it. How can I run a file checksum from a DTS package to make sure this is not a duplicate file? Or are there other ways of identifying a duplicate file using SQL Server 2000?
August 3, 2005 at 12:17 pm
If you're worried about picking up the same file two days in a row, the best and easiest option would be to move the file to an archive location once it has been processed.
A more involved method would be to store things like the file size, last updated date, and date created, then compare these before you load the file, as in the sketch below.
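A rough sketch of that check as a DTS ActiveX Script task (VBScript) follows; the UNC paths are placeholders, and where the previous run's values get stored (a one-row table or a DTS global variable) is left to you:

Option Explicit

Function Main()
    Dim fso, f, sInfo
    Set fso = CreateObject("Scripting.FileSystemObject")

    ' Placeholder path - point this at your import share.
    If Not fso.FileExists("\\server\share\import\data.txt") Then
        Main = DTSTaskExecResult_Failure
        Exit Function
    End If

    Set f = fso.GetFile("\\server\share\import\data.txt")

    ' Size in bytes plus the created/modified timestamps.
    sInfo = f.Size & "|" & f.DateCreated & "|" & f.DateLastModified

    ' Compare sInfo with the value saved from the previous run
    ' (e.g. in a one-row SQL table or a DTS global variable).
    ' If it matches, the file is probably a duplicate - skip the load.

    ' After a successful load, archive the file so it can't be
    ' picked up again tomorrow:
    ' fso.MoveFile "\\server\share\import\data.txt", "\\server\share\archive\data.txt"

    Main = DTSTaskExecResult_Success
End Function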
--------------------
Colt 45 - the original point and click interface
August 3, 2005 at 3:40 pm
Do you know a way to check the size, date created, and date updated from a DTS package?
August 3, 2005 at 3:53 pm
Ella,
Take a look at this site - it has examples for working with files.
http://www.sqldts.com/default.aspx?292
Good Luck,
Darrell
August 4, 2005 at 11:15 am
If you need a file checksum, there is a Win32 console app that will calculate MD5 fingerprints for you.
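One such tool is Microsoft's File Checksum Integrity Verifier (fciv.exe, from KB841290); whether that's the app meant here is an assumption. A DTS ActiveX Script task could shell out to it and compare the hash against files already loaded. A sketch, with the fciv path, connection string, and dbo.ProcessedFiles table all hypothetical:

Option Explicit

Function Main()
    Dim sh, execObj, sLine, sHash, cn, rs
    Set sh = CreateObject("WScript.Shell")

    ' fciv prints "//" comment lines first, then "<hash> <file>".
    Set execObj = sh.Exec("C:\Tools\fciv.exe -md5 \\server\share\import\data.txt")
    Do While Not execObj.StdOut.AtEndOfStream
        sLine = execObj.StdOut.ReadLine()
        If Left(sLine, 2) <> "//" Then sHash = Split(sLine, " ")(0)
    Loop

    ' Has this exact content been loaded before?
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
            "Initial Catalog=MyDb;Integrated Security=SSPI;"
    Set rs = cn.Execute("SELECT 1 FROM dbo.ProcessedFiles " & _
                        "WHERE Md5Hash = '" & sHash & "'")

    If Not rs.EOF Then
        ' Same checksum as an earlier file - treat it as a duplicate.
        Main = DTSTaskExecResult_Failure
    Else
        cn.Execute "INSERT dbo.ProcessedFiles (Md5Hash) VALUES ('" & sHash & "')"
        Main = DTSTaskExecResult_Success
    End If

    cn.Close
End Function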
hth jg
August 4, 2005 at 5:25 pm
Thank you, everybody. That was very useful info.