Getting a file locking error

  • Hi,

    I have a simple package which reads data from a flat text file into a table in SQL Server. I then have a Script Task following this that renames and moves the file that has just been loaded to another folder.

    When I run the package I get the following error:

    The process cannot access the file because it is being used by another process.

    However when I run just the Script Task on its own I don't get any error - it works perfectly. Any ideas?

    Thanks

at first, i thought it was a locking issue because your data flow still had control of the file when the script task executed. however, i was able to create a package that moved data from a flat file into a table and then executed a script task to move the file to another location, and the package ran with no problems.

    so, i have a couple of questions then about your design:

how is your control flow set up? is it a simple data flow task followed by the script task, or is there more going on?

    what does your script task look like for moving the file? are you using the System.IO File.Move method or some other method to move the files around?

    a little more insight into what you have built would help.

    thanks
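
    for reference, the System.IO approach mentioned above looks roughly like this. (this is just an illustrative sketch - the sub name and the paths are placeholders, not anything from your package):

    Imports System.IO

    Public Sub ArchiveFile(ByVal sourcePath As String, ByVal archiveFolder As String)
        Dim destPath As String = Path.Combine(archiveFolder, Path.GetFileName(sourcePath))
        ' File.Move throws if the destination already exists, so clear any old copy first.
        If File.Exists(destPath) Then
            File.Delete(destPath)
        End If
        File.Move(sourcePath, destPath)
    End Sub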

Thanks for your reply. What I have, in detail, is this:

Three parallel Data Flow Tasks load three tables from three flat files. The flat files are accessed through three Flat File Connection Managers, one pointing to each file. An Execute SQL Task then runs, followed by my file operations.


    The housekeeping task is simply this:

    ' Assumes Imports System and Imports System.IO at the top of the Script Task.
    Public Sub HouseKeepFiles(ByVal folder As String, ByVal DaysToKeep As Integer)
        Dim fileEntries As String() = Directory.GetFiles(folder)
        Dim fi As String
        ' Delete any file whose last write date is more than DaysToKeep days old.
        For Each fi In fileEntries
            If File.GetLastWriteTime(fi).Date.AddDays(DaysToKeep) < DateTime.Now Then
                File.Delete(fi)
            End If
        Next
    End Sub

Which gets called from Main in the Script Task. It seems that the read locks on the files from the Data Flow Tasks have not been released by the time the Script Task runs. Should I put a wait in my script?

i would not put a wait on the execution; the execute sql task will not fire until all of the data flow tasks have completed successfully, which is the only wait that you need.

when i ran a test loading multiple files into a database and then used your code to remove the files, i had no problems in my sample package. one data flow loaded a simple file of 10 records, and a parallel data flow loaded a file with over 2 million records, to see if anything would execute ahead of when it should, and it did not.

    one thing that may or may not be an issue is the first script task. what is going on in there?

Thanks for replying, Chuck; your answer gave me the clue I needed to solve the problem. In the first script task I had opened a StreamReader, but had forgotten to close it. Doh!

    Thanks again.
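
    For the record, wrapping the reader in a Using block avoids this in future, since the file handle is released even if an exception is thrown partway through. A minimal sketch (the sub name and path parameter are just placeholders):

    Imports System.IO

    Public Sub ReadHeader(ByVal filePath As String)
        ' Using disposes the reader (and releases the file lock) automatically,
        ' even if an exception occurs inside the block.
        Using reader As New StreamReader(filePath)
            Dim firstLine As String = reader.ReadLine()
            ' ... work with firstLine ...
        End Using
    End Sub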

  • glad to hear you solved it...
