SQLServerCentral is supported by Red Gate Software Ltd.

The process cannot access the file because it is being used by another process - small files are locked, a bigger one works fine
Posted Monday, June 10, 2013 7:37 AM
Valued Member


Group: General Forum Members
Last Login: Tuesday, April 08, 2014 7:16 AM
Points: 50, Visits: 616
SQL Server 2008R2 SP2.

I have three levels of SSIS packages:

master
staging
fileprocessing
main

The staging package calls the fileprocessing package for each file found in a directory, and archives each file after processing. Currently I have three smaller files (<1 MB) and one bigger one (141 MB). All four files are processed correctly using StreamReader, but the smaller files are kept open and cannot be archived by the staging package.

I have tried try-catch-finally with an explicit StreamReader.Close() in the finally block, as well as the using construct, which should also close the file. I stepped through both options in the debugger without problems, yet neither method actually unlocks the file. Another script task in the staging package confirms that the file really is locked: it loops until the file becomes available, and it appears to loop indefinitely, so this is not a temporary issue.

Process Explorer also shows that DtsDebugHost.exe still has a handle to the file.

The same problem happens after deployment on the server, so it doesn't seem to be specific to DtsDebugHost.exe.

Any ideas?

Hans






Post #1461553
Posted Monday, June 10, 2013 8:27 AM


SSCarpal Tunnel


Group: General Forum Members
Last Login: Today @ 4:41 AM
Points: 4,828, Visits: 11,180
The fact that the larger files do not have the problem suggests that this is a timing issue rather than a logic/coding issue.

You could try calling .Dispose() rather than .Close(). You could also try adding a Thread.Sleep(nnnn) to test the possibility of it being down to timing.
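A bounded wait-and-retry probe makes that timing test concrete. A minimal sketch (the attempt count and delay are illustrative, not values from the thread):

```csharp
using System.IO;
using System.Threading;

class FileLockProbe
{
    // Returns true once the file can be opened exclusively, i.e. no other
    // process is holding a handle; gives up after maxAttempts tries.
    public static bool WaitForFile(string path, int maxAttempts, int delayMs)
    {
        for (int attempt = 0; attempt < maxAttempts; attempt++)
        {
            try
            {
                using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
                {
                    return true; // exclusive open succeeded: file is free
                }
            }
            catch (IOException)
            {
                Thread.Sleep(delayMs); // still locked: wait and retry
            }
        }
        return false;
    }
}
```

If the handle really is leaked rather than released late, this loop will time out instead of eventually succeeding, which distinguishes a timing problem from a leaked handle.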



Help us to help you. For better, quicker and more-focused answers to your questions, consider following the advice in this link.

When you ask a question (and please do ask a question: "My T-SQL does not work" just doesn't cut it), please provide enough information for us to understand its context.
Post #1461581
Posted Monday, June 10, 2013 12:53 PM
StreamReader.Close() calls Dispose(), passing a value of true (the using construct does the same thing). I have tried calling Dispose() directly, and while it executed successfully, it didn't solve the issue.
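For readers less familiar with that equivalence: a using block compiles down to roughly the following try/finally (a sketch, not the exact compiler output):

```csharp
using System.IO;

class UsingExpansion
{
    public static void ReadFirstLine(string path)
    {
        // using (var file = new StreamReader(path)) { file.ReadLine(); }
        // is roughly equivalent to:
        StreamReader file = new StreamReader(path);
        try
        {
            file.ReadLine();
        }
        finally
        {
            if (file != null)
            {
                ((System.IDisposable)file).Dispose(); // same path Close() takes
            }
        }
    }
}
```

So whichever variant runs, the handle should be released at the same point; that makes a leak elsewhere in the process the more likely suspect.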

The Thread.Sleep is implemented in the staging package, but even after half an hour the file remains locked by DtsDebugHost.exe.

Below you'll find two simplified versions (the actual processing of the lines is taken out to keep it short):

private void ProcessTextFile(string FileName)
{
    var file = new System.IO.StreamReader(FileName);
    int row = 0; // row counter used in the elided processing
    string line;

    try
    {
        while ((line = file.ReadLine()) != null)
        {
            if (!string.IsNullOrEmpty(line))
            {
                // line processing & insert into db is done here
            }
        }

        // final db insert
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
    finally
    {
        // also tried file.Dispose(); here
        file.Close();
    }
}


private void ProcessTextFile(string FileName)
{
    int row = 0; // row counter used in the elided processing
    string line;

    using (var file = new System.IO.StreamReader(FileName)) // open the file passed in
    {
        try
        {
            while ((line = file.ReadLine()) != null)
            {
                if (!string.IsNullOrEmpty(line))
                {
                    // line processing & insert into db is done here
                }
            }

            // final db insert
        }
        catch (Exception)
        {
            throw; // rethrow without resetting the stack trace
        }
    }
}



Post #1461724
Posted Monday, June 10, 2013 1:32 PM


SSCarpal Tunnel

SSCarpal TunnelSSCarpal TunnelSSCarpal TunnelSSCarpal TunnelSSCarpal TunnelSSCarpal TunnelSSCarpal TunnelSSCarpal TunnelSSCarpal Tunnel

Group: General Forum Members
Last Login: Today @ 4:41 AM
Points: 4,828, Visits: 11,180
StreamReader.Close() calls Dispose(), passing a value of true (the using construct does the same thing). I have tried calling Dispose() directly, and while it executed successfully, it didn't solve the issue.


Yeah, I knew that. But worth a try.


Just a quick aside - may I ask why you are using StreamReaders rather than standard SSIS data flows to do the file imports?



Post #1461753
Posted Tuesday, June 11, 2013 12:10 AM
About the StreamReaders: I inherited this project, and apparently they are used here because the import is done using SqlBulkCopy, which should be faster than a data flow component, especially for bigger files. The files I'm working with right now are smaller test files.
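For context, the inherited pattern presumably looks something like the sketch below: lines are read into a DataTable via a StreamReader and then pushed with SqlBulkCopy. The column name, destination table, and connection string are illustrative assumptions, not the actual project code:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkLoader
{
    // Reads every non-empty line of the file into a one-column DataTable.
    public static DataTable ReadLinesIntoTable(string fileName)
    {
        var table = new DataTable();
        table.Columns.Add("RawLine", typeof(string)); // assumed column

        using (var reader = new StreamReader(fileName))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                if (!string.IsNullOrEmpty(line))
                    table.Rows.Add(line);
            }
        } // the StreamReader (and its file handle) is disposed here

        return table;
    }

    // Pushes the rows to SQL Server; destination name is an assumption.
    public static void BulkInsert(DataTable table, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var bulk = new SqlBulkCopy(connection))
        {
            connection.Open();
            bulk.DestinationTableName = "dbo.StagingTable"; // assumed name
            bulk.WriteToServer(table);
        }
    }
}
```

Worth noting for the locking question: SqlBulkCopy never touches the file itself, so any leaked handle in a pattern like this would come from the reading side.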




Post #1461871
Posted Tuesday, June 11, 2013 12:27 AM


eylenh (6/11/2013)
About the StreamReaders: I inherited this project, and apparently they are used here because the import is done using SqlBulkCopy, which should be faster than a data flow component, especially for bigger files. The files I'm working with right now are smaller test files.



OK - I assume that appropriate testing was performed to verify that, because it surprises me.

Have you made any progress with your file locking issue?

If not, and you have the time/inclination to knock up a stripped-down sample package or two which I can run to reproduce the issue, I should be able to take a look later.



Post #1461880