Importing The Latest File In A Folder Via DTS

  • Comments posted to this topic are about the content posted at
    "I don't know what I don't know."

  • Nice, but I have to admit it seems a very complicated solution. Circumstances are probably different, but I used a bit of T-SQL to do a very similar thing: in my case, a job that automatically identifies and then restores the latest backup of a database to a beta server. It relies on xp_cmdshell and hard-coded folder locations, which isn't great, but it's fast and short.

    I've included the complete code below; at the very least it might provide someone with an alternative approach that is easily integrated into a maintenance job or stored proc.

    -- output a command shell directory listing to a text file, sorted by date (newest first)
    EXEC master..xp_cmdshell 'dir "\\share\FolderOfInterest\*.bak" /b /O-D > d:\output.txt', no_output

    -- create a temporary table to store the directory listing
    CREATE TABLE #tmpDirList (strFileName varchar(100))

    -- bulk insert the directory listing into the temporary table
    BULK INSERT #tmpDirList FROM 'd:\output.txt' WITH (DATAFILETYPE = 'char', ROWTERMINATOR = '\n')

    -- create a variable to store the latest file name in
    -- (TOP 1 without an ORDER BY relies on the insert order of the date-sorted listing)
    DECLARE @strFileSource varchar(100)
    SET @strFileSource = (SELECT TOP 1 strFileName FROM #tmpDirList)
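    The same "pick the newest file" idea can be sketched outside SQL Server as well. Below is a minimal Python equivalent (the `.bak` suffix and file names are just illustrative assumptions), which avoids the intermediate text file by comparing each file's modification time directly:

    ```python
    import os
    import tempfile
    import time

    def latest_file(folder, suffix=".bak"):
        """Return the full path of the most recently modified file in
        `folder` whose name ends with `suffix`, or None if none match."""
        candidates = [
            os.path.join(folder, name)
            for name in os.listdir(folder)
            if name.lower().endswith(suffix)
        ]
        if not candidates:
            return None
        # Same ordering 'dir /O-D' produces, newest first; max() takes the top.
        return max(candidates, key=os.path.getmtime)

    # Tiny demonstration with throwaway files and explicit timestamps.
    with tempfile.TemporaryDirectory() as folder:
        base = time.time()
        for i, name in enumerate(["db_mon.bak", "db_tue.bak", "notes.txt"]):
            path = os.path.join(folder, name)
            with open(path, "w") as f:
                f.write("x")
            os.utime(path, (base + i, base + i))  # make mtimes strictly increasing
        result = latest_file(folder)  # db_tue.bak: newest file matching .bak
    ```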




  • I agree, it is a complex solution, but each to their own.

    I restore a db each day using a similar trick, again using VB Script, probably because that's where my background is from.  Today (!) a workmate asked me how to do something similar in T-SQL, and I was able to pass along your code as a good example.  Thanks.

    I'll have to write up the VB Script referred to above (runs on a File Print Server with no SQL installed) as my next article.

    Dave J
    "I don't know what I don't know."

  • Hi Guys,


    I am a newbie, so please forgive any ignorance on this subject. I am trying to import the latest file from a separate server. I have tried to follow the process described and it works up to a point.

    Each day a flat file is FTP'd into a folder. Each file contains eight days' worth of records, which allows for any changes made after the data was originally created. I think I will need to add a SQL script to handle this.

    Do you have any ideas? I have only just started to learn SQL and am afraid this may be out of my depth at the moment.

    Thanks in Advance.




  • Not knowing much more about your scenario, I would simply suggest: if it's an FTP location, then use a DTS package; its FTP task is very easy to work with for retrieving files. If there's a server share instead, then the T-SQL script may be slightly easier. If the flat file is just rows of records, then running a BULK INSERT statement from either a DTS package or a T-SQL script should be a piece of cake.


  • Hi Newbie,


    Thanks for the information. I decided to use a DTS package, and it is working except for one issue: I only want it to pick up the file for the latest date, i.e. today's date. Right now it picks up the file with the latest date on it even when that file isn't from today.

    I am by no means a VB expert, but I think it has something to do with the code below.


    If DTSGlobalVariables("FileName").Value <> "" Then
        Main = DTSTaskExecResult_Success
    Else
        Main = DTSTaskExecResult_Failure
    End If


    Any suggestions.


  • Replace the code in the ActiveX Script with this:

    '  Visual Basic ActiveX Script
    Function Main()
        Dim filePreFix, folderName, fileName
        folderName = DTSGlobalVariables("FileFolder").Value
        filePreFix = DTSGlobalVariables("FilePrefix").Value
        'Call the function, passing in the two Global Variables obtained above.
        fileName = TodaysFile(folderName, filePreFix)
        DTSGlobalVariables("FileName").Value = fileName
        If DTSGlobalVariables("FileName").Value <> "" Then
            Main = DTSTaskExecResult_Success
        Else
            Main = DTSTaskExecResult_Failure
        End If
    End Function

    Function TodaysFile(MyFolderName, MyFilePrefix)
        Dim myFSO, HighestDate, MyResultName, myFolder, file
        Set myFSO = CreateObject("Scripting.FileSystemObject")
        Set myFolder = myFSO.GetFolder(MyFolderName)
        MyResultName = ""
        'One second before midnight this morning, so only files
        'modified today can beat it.
        HighestDate = DateAdd("s", -1, Date & " 00:00:00")
        For Each file In myFolder.Files
            'Check to make sure the file starts with the right prefix
            If UCase(Left(file.Name, Len(MyFilePrefix))) = UCase(MyFilePrefix) Then
                'Check last modified date
                If file.DateLastModified > HighestDate Then
                    MyResultName = file.Path
                    HighestDate = file.DateLastModified
                End If
            End If
        Next
        TodaysFile = MyResultName
        Set myFSO = Nothing
    End Function
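    For anyone more comfortable outside VBScript, the same "prefix match plus modified today" filter can be sketched in Python. The folder contents and the `export` prefix below are invented purely for the demonstration:

    ```python
    import os
    import tempfile
    import time
    from datetime import date, datetime, time as dtime

    def todays_file(folder, prefix):
        """Return the path of the newest file in `folder` whose name starts
        with `prefix` (case-insensitive) and was modified today, else ''."""
        # Seed with midnight this morning, mirroring the HighestDate seed
        # in the VBScript: files modified before today never qualify.
        best_path = ""
        best_mtime = datetime.combine(date.today(), dtime.min).timestamp()
        for name in os.listdir(folder):
            if not name.lower().startswith(prefix.lower()):
                continue
            path = os.path.join(folder, name)
            mtime = os.path.getmtime(path)
            if mtime > best_mtime:
                best_path, best_mtime = path, mtime
        return best_path

    # Demonstration: one file from today, one stale file, one wrong prefix.
    with tempfile.TemporaryDirectory() as folder:
        midnight = datetime.combine(date.today(), dtime.min).timestamp()
        files = {
            "export_today.csv": time.time(),    # modified today -> should win
            "export_old.csv": midnight - 3600,  # yesterday -> ignored
            "other_today.csv": time.time(),     # wrong prefix -> ignored
        }
        for name, mtime in files.items():
            path = os.path.join(folder, name)
            open(path, "w").close()
            os.utime(path, (mtime, mtime))  # set each file's mtime explicitly
        picked = todays_file(folder, "export")
    ```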


    Dave Jackson
    "I don't know what I don't know."

  • Hi, this is a great article, thank you!!!

    But is it at all possible for you to do something with your screenshots so they can be viewed bigger? I would really like to take a look at all the tools you are using there.

    Thank you,

    Vicky 😀

    Just to add, in case anyone is interested, (hint hint ;)), what I am trying to do is import data from a table into a file, which is easily done in DTS. My problem is that this transfer is supposed to take place every day and the file is supposed to be placed into a new folder generated each day. That is where I am stuck - creating that new folder in the same location each day and giving it a name increased by one from the already existing folder in that location. Ex.: fldrName1, next day - fldrName2, etc.

    I have tried creating a Global Variable saved in the package properties, but from there I am having a lot of trouble. If anyone can and is willing to help me out on this issue, I would greatly appreciate it.

    Thanks again!

  • Regarding the larger images, they used to work but the site upgrade appears to have killed them off. They are just standard DTS screenshots however. You can see more at my site where I reproduced the article.

    Regarding naming a new folder every day, the following should help


    -- create some test folders
    exec xp_cmdshell 'md C:\TestDTS\Folder0001'
    exec xp_cmdshell 'md C:\TestDTS\Folder0002'
    exec xp_cmdshell 'md C:\TestDTS\Folder0003'
    exec xp_cmdshell 'md C:\TestDTS\Folder0004'
    exec xp_cmdshell 'md C:\TestDTS\Folder0005'

    -- try some later ones
    exec xp_cmdshell 'md C:\TestDTS\Folder0023'
    exec xp_cmdshell 'md C:\TestDTS\Folder0024'
    exec xp_cmdshell 'md C:\TestDTS\Folder0025'

    -- and again
    exec xp_cmdshell 'md C:\TestDTS\Folder0323'
    exec xp_cmdshell 'md C:\TestDTS\Folder1024'
    exec xp_cmdshell 'md C:\TestDTS\Folder5425'

    If Object_id('tempdb..#t1') Is Not Null
        Drop Table #t1

    Create Table #t1 (Outp Varchar(255))

    Declare @path Varchar(255),
            @folderPrefix Varchar(255),
            @cmd Varchar(512)

    Select @path = 'C:\TestDTS\'
    Select @folderPrefix = 'Folder'
    Select @cmd = 'dir ' + @path + @folderPrefix + '* /O:n /b' -- order by name, only return names

    Insert #t1
    Exec xp_cmdshell @cmd

    Delete From #t1
    Where Outp Is Null -- get rid of the blank row

    Declare @num Varchar(4)

    Select * From #t1 -- show the listing

    Select @num = Right(Outp, 4)
    From #t1 -- recover last (highest) number

    Select @num As [last known folder name] -- show it

    Declare @next Varchar(4)
    Select @num = @num + 1 -- implicit conversion to int and back
    Select @next = Right('000' + @num, 4)
    Select @next As [next known folder name] -- show it

    Select @cmd = 'md ' + @path + @folderPrefix + @next
    Select @cmd As [cmd to run]
    Exec xp_cmdshell @cmd

    -- You can then do something like
    Declare @outputFile Varchar(255)
    Select @outputFile = @path + 'OutFile.csv'
    Select @cmd = 'Move ' + @path + 'OutFile.csv ' + @path + @folderPrefix + @next
    Select @cmd
    --exec xp_cmdshell @cmd

    This assumes the root folder is known, 'C:\TestDTS\' in this example, and that you have created a file in that folder named 'OutFile.csv'. The final (commented out) xp_cmdshell call should move it into the newly created numbered folder for you.

    Paste all of the above into a SQL Task called after you have created the file, edit to suit your environment and you should be good to go.

    Note that:

    - I haven't tested this fully.

    - The numbers are left-padded with zeroes to sort properly.

    - This should last, at one folder per day, for just over 27 years. 🙂
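    The numbering step itself is easy to test in isolation. Here is a small Python sketch of the same "find the highest suffix, add one, left-pad" logic; the `Folder` prefix and four-digit width match the T-SQL example above, but the function itself is my own illustration:

    ```python
    import re

    def next_folder_name(existing, prefix="Folder", width=4):
        """Given existing folder names like 'Folder0005', return the next
        name in sequence, zero-padded so plain name sorting stays correct."""
        pattern = re.compile(re.escape(prefix) + r"(\d+)$")
        numbers = [int(m.group(1))
                   for name in existing
                   if (m := pattern.match(name))]
        highest = max(numbers) if numbers else 0
        return f"{prefix}{highest + 1:0{width}d}"
    ```

    So `next_folder_name(["Folder0001", "Folder0023", "Folder5425"])` returns `"Folder5426"`, and an empty folder list starts the sequence at `"Folder0001"`.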


    Dave J
    "I don't know what I don't know."

  • Hi! Very good solution and easy to follow! 🙂

    In fact, I was looking for something like this, but the difference is that I want to import ALL the files in that folder, not only the latest. Also, I want to execute the DTS every hour, and I will import the files only if they have changed. Is it necessary to create a log file to manage this? How can I make this DTS recursive? :ermm:

    ie: foldertest\file1.txt


    These two files are going to be updated automatically by another solution. Sometimes there may be just one file in 'foldertest', other times two. But I guess this is not that important, since we read all the files in the folder, right?

    Any help is welcome! Thanks a lot!

  • I would do this by iterating through the Files collection of the Folder object, and moving each file after processing into a 'Processed' folder. Is that possible?

    If none of the above makes sense, reply and I'll see if I can code it up for you.

    Dave J
    "I don't know what I don't know."

  • I guess if you write an example it will be more understandable for me or anyone else who reads this forum.

    thanks a lot, have a good day!

  • First, sorry this has taken a while to get back to you and post.

    Second, this is a function taken from a vbs script, but it should work in a DTS package ActiveX step, though I haven't tested it. (Any references to WScript will need replacing, use Global Variables in the DTS Package).

    Thirdly, it's not complete, but hopefully will be clear enough so as to be adaptable to do what you want to do.

    Lastly, it is a recursive function. Its power is that it calls itself.

    Dim oShell, oFS, sScript, sScriptPath, folderPath, FileToLogTo

    'Create a Shell and FileSystem Object - always useful :)
    Set oShell = WScript.CreateObject("WScript.Shell")
    Set oFS = CreateObject("Scripting.FileSystemObject")

    sScript = WScript.ScriptFullName 'Name, including the path, of this script
    sScriptPath = oFS.GetAbsolutePathName(sScript & "\..") 'Path to this script

    folderPath = "C:\TestTiffFiles" 'will need changing (Global Variable?)
    FileToLogTo = "ProcessedFiles.log" 'as will this

    LogAction "Started processing from " & folderPath 'call to a sub routine, see below

    If (oFS.FolderExists(folderPath)) Then
        ForEachFolder folderPath, True 'call to the recursive function
    End If
    ' End of script

    Private Sub ForEachFolder(ByVal Source_Folder, doSubFolders)
        'The second argument lets you switch off the recursion.
        Dim f, file, fileCollection, pos
        Dim oFolder, aFolders, FolderCollection

        'Will not do any files in any folder whose path contains the word Processed
        pos = InStr(1, Source_Folder, "Processed", 1) 'case-insensitive match
        If pos > 0 Then
            'LogAction "NOT processing from " & Source_Folder
            Exit Sub
        End If

        Set aFolders = oFS.GetFolder(Source_Folder)
        Set FolderCollection = aFolders.SubFolders

        Set f = oFS.GetFolder(Source_Folder)
        Set fileCollection = f.Files
        For Each file In fileCollection
            'Do something with each file here by passing the file object to a sub
            ProcessFile file
        Next

        If doSubFolders Then 'This bit calls this sub for each folder in the root folder
            For Each oFolder In FolderCollection
                Call ForEachFolder(oFS.GetAbsolutePathName(Source_Folder & "\" & oFolder.Name & "\"), doSubFolders)
            Next
        End If

        Set aFolders = Nothing
        Set FolderCollection = Nothing
    End Sub

    Sub ProcessFile(file)
        'Placeholder sub; put your processing for each file here.
        'After processing, use the oFS (FileSystem) object to move the file
        'somewhere you won't process it again (a Processed folder).
    End Sub

    Sub LogAction(Entry) 'Helper Sub
        Dim f, LogFile
        'msgbox Entry
        Const ForReading = 1, ForWriting = 2, ForAppending = 8
        LogFile = ExecutingFrom & FileToLogTo
        On Error Resume Next
        Set f = oFS.OpenTextFile(LogFile, ForAppending, True)
        f.WriteLine "[ " & Now & " ] - " & Entry
        On Error Goto 0
        Set f = Nothing
    End Sub

    Function ExecutingFrom() 'Helper Function
        Dim ScriptPath
        ScriptPath = Left(WScript.ScriptFullName, _
            Len(WScript.ScriptFullName) - Len(WScript.ScriptName)) 'will need changing to a global variable
        If Right(ScriptPath, 1) <> "\" Then
            ScriptPath = ScriptPath & "\"
        End If
        ExecutingFrom = ScriptPath
    End Function
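    The same walk-the-tree-but-skip-'Processed' pattern translates almost line for line to Python. This sketch (the folder and file names are invented for the demo) visits every file under a root, recursing into subfolders but skipping any path containing 'Processed':

    ```python
    import os
    import tempfile

    def for_each_folder(root, process, do_subfolders=True):
        """Call process(path) for every file under `root`, recursing into
        subfolders, but skip any folder whose path contains 'Processed'
        (case-insensitive), like the InStr guard in the script above."""
        if "processed" in root.lower():
            return
        for entry in sorted(os.listdir(root)):
            path = os.path.join(root, entry)
            if os.path.isfile(path):
                process(path)
            elif do_subfolders and os.path.isdir(path):
                for_each_folder(path, process, do_subfolders)

    # Demonstration tree: files inside 'Processed' must not be visited.
    seen = []
    with tempfile.TemporaryDirectory() as root:
        os.makedirs(os.path.join(root, "Sub"))
        os.makedirs(os.path.join(root, "Processed"))
        for rel in ("a.txt", os.path.join("Sub", "b.txt"),
                    os.path.join("Processed", "c.txt")):
            open(os.path.join(root, rel), "w").close()
        for_each_folder(root, lambda p: seen.append(os.path.basename(p)))
    ```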


    Dave J
    "I don't know what I don't know."

  • Good suggestion! I already did that! Thanks a lot! Have a nice day! :)

  • Anyone got a way to get the latest Excel file using SSIS? I currently have a job that gets the latest file in a folder, like:

    file003 dated 09/03/2008

    file002 dated 09/02/2008

    file001 dated 09/01/2008

    I need file003 to update a table in the DB.

    I used this in SQL 2000 as a DTS package and tried to convert it, but I didn't like the way the conversion came out and would like to do it cleaner in SSIS.

