File Delete Not Working Once Deployed

  • The following script, embedded in a package, works in the IDE, but when the package is deployed and run from an Agent Job, no files get deleted. The same parameter values are used in both cases, and the Agent Job owner has Full Control rights on the folder.

    The following parameter values are used:
    pathTargetDirectory = "E:\Test Files\Processed"
    fileFilter = "InvoicedFOC*.csv"
    retentionPeriod = 30

    // Requires "using System.IO;" among the ScriptMain namespace imports.
    public void Main()
    {
        string pathTargetDirectory = Dts.Variables["$Package::pathTargetDirectory"].Value.ToString();
        string fileFilter = Dts.Variables["$Package::fileFilter"].Value.ToString();
        int retentionPeriod = int.Parse(Dts.Variables["$Package::retentionPeriod"].Value.ToString());
        DateTime today = DateTime.Today;

        // Delete matching files whose creation date is older than the retention period.
        DirectoryInfo dirInfo = new DirectoryInfo(pathTargetDirectory);
        foreach (FileInfo f in dirInfo.GetFiles(fileFilter, SearchOption.TopDirectoryOnly))
        {
            DateTime fileCreatedDate = f.CreationTime.Date;
            DateTime retentionDate = fileCreatedDate.AddDays(retentionPeriod).Date;

            if (today > retentionDate)
            {
                try
                {
                    f.Delete();
                }
                catch (Exception ex)
                {
                    // Swallow the error so the remaining files are still processed.
                    // Note: Console output is not visible when run from an Agent Job.
                    Console.WriteLine(ex);
                }
            }
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }

  • The Agent job owner is not the right place to look. Instead, look at the context in which the job is being run – the SQL Agent service user, unless you have set up a proxy.

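    A quick way to confirm which account the script actually runs under is to log the Windows identity from inside the Script Task itself. The following is a minimal sketch, assuming the standard SSIS Script Task boilerplate (the Dts object and ScriptResults enum); WindowsIdentity comes from the System.Security.Principal namespace in the base class library.

    // Sketch: log the identity the Script Task runs under, so the job
    // history shows which account needs permissions on the folder.
    using System.Security.Principal;

    public void Main()
    {
        string runningAs = WindowsIdentity.GetCurrent().Name;

        // FireInformation writes to the SSIS log / job history,
        // unlike Console.WriteLine, which is invisible under an Agent Job.
        bool fireAgain = true;
        Dts.Events.FireInformation(0, "IdentityCheck",
            "Script Task is running as: " + runningAs,
            string.Empty, 0, ref fireAgain);

        Dts.TaskResult = (int)ScriptResults.Success;
    }

    Run from the IDE this will report your own login; run from the Agent Job it will report the SQL Agent service account (or the proxy account, if one is configured).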

  • Phil Parkin - Thursday, August 23, 2018 5:11 AM

    The Agent job owner is not the right place to look. Instead, look at the context in which the job is being run – the SQL Agent service user, unless you have set up a proxy.

    It was permissions. I had created the original test folder some weeks ago and created another yesterday. My original tests in the IDE were using the folder created some weeks ago, where permissions were set correctly; I had missed setting them on the new folder. DOH! I had added the Catch block to allow the process to continue if an error occurred because of a file lock, as I did not want such an error to stop the process for the remaining files. All is good now. I will just go lie down for a while now and contemplate life 🙂

  • tim.ffitch 25252 - Thursday, August 23, 2018 5:28 AM

    It was permissions. I had created the original test folder some weeks ago and created another yesterday. My original tests in the IDE were using the folder created some weeks ago, where permissions were set correctly; I had missed setting them on the new folder. DOH! I had added the Catch block to allow the process to continue if an error occurred because of a file lock, as I did not want such an error to stop the process for the remaining files. All is good now. I will just go lie down for a while now and contemplate life 🙂

    I'm familiar with that feeling 🙂 Glad you got it sorted.

    You could easily refine your CATCH block to specifically trap the case where the file is locked and fail the job in all other cases, should you choose to do so.

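    For example, here is a minimal sketch of that refinement. It assumes a Script Task targeting C# 6 or later (for the exception filter) and .NET 4.5 or later (for Exception.HResult); the HResult constants are the Win32 ERROR_SHARING_VIOLATION (32) and ERROR_LOCK_VIOLATION (33) codes, and the subcomponent name is illustrative.

    // Swallow only file-in-use errors; let anything else fail the task
    // (and therefore the Agent Job step).
    const int ERROR_SHARING_VIOLATION = unchecked((int)0x80070020);
    const int ERROR_LOCK_VIOLATION = unchecked((int)0x80070021);

    try
    {
        f.Delete();
    }
    catch (IOException ex) when (ex.HResult == ERROR_SHARING_VIOLATION
                              || ex.HResult == ERROR_LOCK_VIOLATION)
    {
        // File is locked by another process: log a warning and move on.
        Dts.Events.FireWarning(0, "FileCleanup",
            "Skipped locked file: " + f.FullName, string.Empty, 0);
    }

    On older SSDT versions without exception filters, the same effect can be had by catching IOException, checking HResult, and rethrowing when it does not match.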
