The process cannot access the file because it is being used by another process - small files are locked, but a bigger one works fine

  • SQL Server 2008R2 SP2.

    I have three levels of SSIS packages :

    master

    staging

    fileprocessing

    main

The staging package calls the fileprocessing package for each file found in a directory and archives the files after processing. Currently I have three smaller files (<1 MB) and one bigger one (141 MB). All four files are processed correctly using StreamReader, but the smaller files are kept open and cannot be archived by the staging package.

I have tried try-catch-finally with an explicit StreamReader.Close() in the finally block, as well as the using construct, which should also close the file. I stepped through both options with the debugger without problems, yet neither method effectively unlocks the file. Adding another script task to the staging package shows that the file really is locked. It's not a temporary issue: that script task loops until the file becomes available, but it seems to loop indefinitely.
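For readers wanting to reproduce the availability check described above, here is a minimal sketch of the kind of probe such a script task can use: try to open the file exclusively and treat a sharing violation as "still locked". The class and method names, retry count, and delay are illustrative, not the actual package code.

```csharp
using System;
using System.IO;
using System.Threading;

static class FileLockProbe
{
    // Attempt an exclusive open; an IOException (sharing violation)
    // means another process still holds a handle to the file.
    public static bool IsLocked(string path)
    {
        try
        {
            using (new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None))
            {
                return false; // exclusive open succeeded
            }
        }
        catch (IOException)
        {
            return true; // still in use by another process
        }
    }

    // Poll until the file is free, or give up after maxAttempts tries.
    public static bool WaitUntilFree(string path, int maxAttempts, int delayMs)
    {
        for (int i = 0; i < maxAttempts; i++)
        {
            if (!IsLocked(path))
                return true;
            Thread.Sleep(delayMs);
        }
        return false; // never became available
    }
}
```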

    Process Explorer also shows that DtsDebugHost.exe still has a handle to the file.

The same problem happens after deployment on the server, so it doesn't seem to be related to DtsDebugHost.exe.

    Any ideas?

    Hans

  • The fact that the larger files do not have the problem suggests that this is a timing issue rather than a logic/coding issue.

You could try calling .Dispose() rather than .Close(). You could also try adding a Thread.Sleep(nnnn) to test whether it comes down to timing.

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

The StreamReader.Close() method calls Dispose(), passing a value of true (the using construct does the same thing). I have tried calling Dispose(), and while it executed successfully, it didn't solve the issue.

The Thread.Sleep is implemented in the staging package, but even after half an hour the file remains locked by DtsDebugHost.exe.

Below you'll find two simplified versions (the actual processing of the lines has been taken out to keep it short):

private void ProcessTextFile(string FileName)
{
    var file = new System.IO.StreamReader(FileName);
    int row = 0;
    string line;
    try
    {
        while ((line = file.ReadLine()) != null)
        {
            if (!string.IsNullOrEmpty(line))
            {
                // line processing & insert into db is done here
            }
        }
        // final db insert
    }
    catch (Exception e)
    {
        throw e;
    }
    finally
    {
        // also tried file.Dispose(); here
        file.Close();
    }
}

private void ProcessTextFile(string FileName)
{
    int row = 0;
    string line;
    using (var file = new System.IO.StreamReader(FileName))
    {
        try
        {
            while ((line = file.ReadLine()) != null)
            {
                if (!string.IsNullOrEmpty(line))
                {
                    // line processing & insert into db is done here
                }
            }
            // final db insert
        }
        catch (Exception e)
        {
            throw e;
        }
    }
}
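As noted above, the using construct compiles down to a try/finally that calls Dispose(), which is also what Close() does. A minimal sketch of that expansion, with a hypothetical line counter standing in for the real processing:

```csharp
using System;
using System.IO;

static class UsingExpansion
{
    // Hypothetical stand-in for ProcessTextFile: counts non-empty lines.
    // The body shows roughly what `using (var file = new StreamReader(...))`
    // compiles to -- a try/finally that always calls Dispose().
    public static int CountNonEmptyLines(string fileName)
    {
        int count = 0;
        var file = new StreamReader(fileName);
        try
        {
            string line;
            while ((line = file.ReadLine()) != null)
            {
                if (!string.IsNullOrEmpty(line))
                    count++;
            }
        }
        finally
        {
            // Equivalent to file.Close(): releases the underlying handle.
            ((IDisposable)file).Dispose();
        }
        return count;
    }
}
```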

The StreamReader.Close() method calls Dispose(), passing a value of true (the using construct does the same thing). I have tried calling Dispose(), and while it executed successfully, it didn't solve the issue.

    Yeah, I knew that. But worth a try.


Just a quick aside - may I ask why you are using StreamReaders rather than standard SSIS data flows to do file imports?


About the StreamReaders: I inherited this project, and apparently they are used here because the import is done using SqlBulkCopy, which should be faster than a data flow component, especially for bigger files. The files I'm working with right now are smaller test files.
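For context, a minimal sketch of the SqlBulkCopy pattern such a loader typically follows: buffer the parsed lines into a DataTable, then push the batch to SQL Server in one operation. The class, table, and column names here are hypothetical, not taken from the inherited project.

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

static class BulkLoader
{
    // Parse the file into an in-memory DataTable (column name is made up).
    public static DataTable BuildTable(string fileName)
    {
        var table = new DataTable();
        table.Columns.Add("LineText", typeof(string));

        using (var reader = new StreamReader(fileName))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                if (!string.IsNullOrEmpty(line))
                    table.Rows.Add(line);
            }
        }
        return table;
    }

    // Push the buffered rows to SQL Server in a single bulk operation.
    public static void Load(DataTable table, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            connection.Open();
            bulkCopy.DestinationTableName = "dbo.StagingLines"; // hypothetical
            bulkCopy.WriteToServer(table);
        }
    }
}
```

Buffering into a DataTable is simplest for small files; for very large files, SqlBulkCopy can also consume an IDataReader so the file is streamed rather than held in memory.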

  • eylenh (6/11/2013)


About the StreamReaders: I inherited this project, and apparently they are used here because the import is done using SqlBulkCopy, which should be faster than a data flow component, especially for bigger files. The files I'm working with right now are smaller test files.

    OK - I assume that appropriate testing was performed to verify that, because it surprises me.

    Have you made any progress with your file locking issue?

    If not, and you have the time/inclination to knock up a stripped-down sample package or two which I can run to reproduce the issue, I should be able to take a look later.

