SSIS File Rename - The process cannot access the file because it is being used by another process

  • Current setup:

    Foreach Loop container, containing the following items in sequence:

    1. Data Flow task, containing a Flat File source on a network share, then an OLE DB destination (SQL table).

    2. Script task, which increments some variables

    3. Script task, which I just added in an attempt to solve this; it simply waits for 5 seconds: Threading.Thread.Sleep(5000). (See the sketch just after this list.)

    4. File System task, which renames (moves) the file.
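
    For what it's worth, the wait in step 3 is just the standard Script task boilerplate wrapped around the Sleep call. A minimal VB sketch (matching the snippet above) looks roughly like this:

        Public Sub Main()
            ' Pause for five seconds before the File System task runs, in the
            ' hope that any lingering handle on the flat file has been released.
            Threading.Thread.Sleep(5000)
            Dts.TaskResult = ScriptResults.Success
        End Sub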

    Yesterday this process worked fine several times, but now I get "The process cannot access the file because it is being used by another process".

    I'm 100% positive that the SQL Agent service account has full read/write/delete access to the file share; it always has, and in fact this share is open to almost everyone. I've also checked the two text files it is looping through, and I have no problem cutting and pasting them myself, so they definitely aren't in use by any other process.

    The Foreach container is probably executing extremely fast, because at this time the files have zero records - just headers.

    What else can I do here?

  • When I get this error, it's usually because I've opened the file in Excel/Notepad/pick-your-editor and that editor has a lock on the file.

    How are the files being created? Is the create process running too?

  • The files are being created at a single point in time, daily at about 4:00 AM. I'm running my process well after that. I don't have the files open, and in fact I've verified that I'm able to manually cut and paste them, so nothing else is holding them. As far as that line of thought goes, I feel 100% positive it's not that.

    Since the time I posted this I've tried all kinds of other things, the most recent being:

    1. Script task to create a temp directory and copy all the files into it (a rough sketch follows below).

    2. Foreach File Enumerator that goes through the temp directory instead of the 'real' directory.

    3. Script task that renames (moves) the files from the real directory to an archive.

    I just run into an error every which way, it seems. Carrying on with my attempts...
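
    For illustration, a copy-to-temp Script task like the one in step 1 might look something like this in VB; the source and temp folder paths are placeholders, not the real share:

        Public Sub Main()
            ' NOTE: both paths are placeholders for illustration only.
            Dim sourceDir As String = "\\server\share\incoming"
            Dim tempDir As String = "C:\SSIS\Temp"

            ' Make sure the temp directory exists, then copy every text file into it.
            IO.Directory.CreateDirectory(tempDir)
            For Each sourceFile As String In IO.Directory.GetFiles(sourceDir, "*.txt")
                Dim destFile As String = IO.Path.Combine(tempDir, IO.Path.GetFileName(sourceFile))
                IO.File.Copy(sourceFile, destFile, True)   ' overwrite if it already exists
            Next

            Dts.TaskResult = ScriptResults.Success
        End Sub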

  • Since this is out on a file share, you could look at the open files in Computer Management connected to that server to see what has the file(s) open.

  • Thanks, good tip. I'll see if I can get "IT" to help me out in that regard. :w00t:

  • Quick thought: grab the SysInternals utility that shows open files; I can't remember the name but it should be easy to find. It will show whether any process is accessing the file. I've come across this problem quite a few times, and most of the time SSIS was the culprit, not closing the files after simple operations like row counting, etc.
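
    (The utility in question turns out, later in this thread, to be Handle.exe. A typical invocation, with a placeholder file name, is simply:

        handle.exe MyDataFile.txt

    which lists any process holding an open handle whose name matches.)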

    😎

  • OK thanks.

    1) How would I pause/break the SSIS package at the right moment to see what has the files open? In the middle of the Foreach File Enumerator? Or right after that Data Flow task, and how?

    2) If I find SSIS has the files open, how do I solve this and tell SSIS to close them? Is the only solution to have multiple, separate packages?

    Goodness, this was so much easier in 100% VBA.

  • Within the SysInternals utility you can filter the output on a folder; the file handle will show up there during the Data Flow task's execution. The question is whether SSIS releases the handle before the next file-system operation on that file.

    😎

  • I was unable to determine with certainty when the package lets the files go. Either way, the fact that they were being locked was indisputable, I guess.

    Adding any one of the several creative methods to "wait" did no good either. The files were locked until the package execution was finished, it seemed.

    I eventually solved my problem like this:

    1. In the Foreach loop, after processing the file into the SQL table, I simply COPIED the file... that worked, locks notwithstanding.

    2. In the same loop, execute a stored procedure to insert a row into a table recording that the file was Processed but not yet Deleted.

    3. Create a new package that runs independently of the first one (but slightly after it each morning). It opens a data flow (ADO source for the files not yet deleted, Recordset destination), loops through the results, updating a variable, and uses a simple Script task to delete the file and execute a stored procedure to say the file is now deleted. (Yes, there is the File System delete-file task, but it's so buggy in terms of Source and Variables that it's easier to just write a three-line script task; a sketch follows below.)
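
    A rough sketch of that delete Script task in VB, assuming a package variable named User::FileToDelete (listed in the task's ReadOnlyVariables) that holds the full path of the file to remove; the variable name is illustrative, not the actual one:

        Public Sub Main()
            ' User::FileToDelete is an assumed variable holding the full path of
            ' the file recorded as Processed but not yet Deleted.
            Dim filePath As String = Dts.Variables("User::FileToDelete").Value.ToString()

            If IO.File.Exists(filePath) Then
                IO.File.Delete(filePath)
            End If

            Dts.TaskResult = ScriptResults.Success
        End Sub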

    Works like a champ; I guess I'm OK with this for now. I've researched this SSIS file-locking thing to no end, and none of the posted solutions have worked for me (waits, separating tasks/steps, separating Foreach containers, etc.). But there is a group of people who still seem to have this issue with no solution; I guess I'm in there with them.

  • Good stuff, glad you got this resolved.

    This is less of a problem on 2K8 R2 and later; I'm not certain why, it's just my experience. As a rule of thumb, I try to use Script tasks rather than File System tasks for all non-data operations: one can explicitly close the file before further operations, without having to rely on the SSIS behaviour.
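
    As an illustration of closing the file yourself, a Script task that reads a file (for row counting, say) can wrap the reader in a Using block so the handle is released the moment reading finishes, rather than whenever SSIS decides to let go. The path below is a placeholder:

        Public Sub Main()
            ' Placeholder path for illustration only.
            Dim filePath As String = "\\server\share\incoming\MyDataFile.txt"
            Dim lineCount As Integer = 0

            ' The Using block guarantees the handle is closed as soon as the block
            ' ends, so a later rename/move of the same file won't hit a lock here.
            Using reader As New IO.StreamReader(filePath)
                While Not reader.EndOfStream
                    reader.ReadLine()
                    lineCount += 1
                End While
            End Using

            Dts.TaskResult = ScriptResults.Success
        End Sub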

    😎

  • From what I'm learning in SSIS (I'm pretty new), I strongly agree about Script tasks being preferable for some things - most notably, so far, file-system operations. More detailed control, it seems?

    Also, thanks to both of you for the various tips, including the SysInternals one. I now have Handle.exe, very handy. 🙂
