Load text files in a directory into a table from SQL 2000

  • I am trying to load all of the text files in a directory (C:\data\textfiles.txt, textfiles2.txt, etc.) into a table in SQL Server 2000. I know I can set up a DTS package to import a text file, but I'm not sure how to do it with multiple text files. Does anyone have sample code to perform this task?

    Any help and suggestions would be appreciated.

    Tammy

  • Are all of the files in the same format and will they be loaded into the same table?

    If so, you could create one data transformation task, then use an ActiveX Script task with the FileSystemObject to loop through each file in the folder, updating the source file location and executing the transformation.

    Function Main()
        Dim objPackage
        Dim objFSO
        Dim objFolder
        Dim objFile
        Dim objSourceFile
        Dim objTransformation

        Set objFSO = CreateObject("Scripting.FileSystemObject")
        Set objPackage = DTSGlobalVariables.Parent
        Set objSourceFile = objPackage.Connections("The name of your source file object")
        Set objTransformation = objPackage.Steps("The name of your data pump task")
        Set objFolder = objFSO.GetFolder("Path to folder containing files")

        For Each objFile In objFolder.Files
            If objFile.Size > 0 Then
                objSourceFile.DataSource = objFile.Path
                objTransformation.Execute
            End If
        Next

        Set objTransformation = Nothing
        Set objSourceFile = Nothing
        Set objFile = Nothing
        Set objFolder = Nothing
        Set objFSO = Nothing
        Set objPackage = Nothing

        Main = DTSTaskExecResult_Success
    End Function

  • Thanks very much, Erik.

    Yes, all of the files are in the same format.

    I would like to merge data from all of the files into one table. Do you have any suggestions or a sample? Please advise ... I am trying to work through this problem ... :blush:

    Erik Kutzler (11/2/2007)


    Are all of the files in the same format and will they be loaded into the same table?

  • What if the text files have different formats? Do I have to create a separate task for each? TIA

  • If the files are in different formats, yes you'll have to create a separate transformation for each.

  • I used to load multiple text files with different formats into different tables in a database.

    I created a table called TextFileInfo with the text file name and path, the format file name and path, and the destination table name.

    I created a procedure to read this table and run the DTS package with the text file name, format file name, and table name as the parameters. In the DTS package those were global variables, and I used an ActiveX script much like the one you posted to handle the changes.
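    A rough sketch of what Loner describes might look like the following. All of the names here (the columns, the server, the package name) are guesses for illustration; the original table and procedure weren't posted. It relies on xp_cmdshell to invoke the DTSRun utility, passing each row's values as package global variables with the /A switch:

    ```sql
    -- Control table: one row per file to load (illustrative column names).
    CREATE TABLE dbo.TextFileInfo
    (
        TextFileName   VARCHAR(300),  -- text file name, including path
        FormatFileName VARCHAR(300),  -- BCP format file, including path
        TableName      SYSNAME        -- destination table
    )

    -- The procedure cursors through the control table and launches the DTS
    -- package once per file. "MyServer" and "LoadTextFile" are placeholders;
    -- ":8" marks each global variable as a string.
    DECLARE @Cmd VARCHAR(1000), @File VARCHAR(300), @Fmt VARCHAR(300), @Tbl SYSNAME

    DECLARE cFiles CURSOR FOR
        SELECT TextFileName, FormatFileName, TableName FROM dbo.TextFileInfo
    OPEN cFiles
    FETCH NEXT FROM cFiles INTO @File, @Fmt, @Tbl
    WHILE @@FETCH_STATUS = 0
      BEGIN
        SET @Cmd = 'DTSRun /S MyServer /E /N LoadTextFile'
                 + ' /A FileName:8=' + @File
                 + ' /A FormatFile:8=' + @Fmt
                 + ' /A TableName:8=' + @Tbl
        EXEC master.dbo.xp_cmdshell @Cmd
        FETCH NEXT FROM cFiles INTO @File, @Fmt, @Tbl
      END
    CLOSE cFiles
    DEALLOCATE cFiles
    ```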

  • Yes, all of the files are in the same format ..

    I would like to merge data from all of the files into one table. Do you have any suggestions or a sample?

    Shoot... instead of messing around with "merge" code, it's easier and, many times, faster just to throw it all into a nice big staging table and delete the dupes...
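    In SQL 2000 that staging-then-dedupe step could be sketched like this. The table and column names are made up for illustration; the real layout would come from the text files:

    ```sql
    -- Hypothetical staging and target tables; Col1/Col2 stand in for the
    -- real columns. Bulk-load every file into Staging first, then move
    -- only distinct rows that are not already present in the target:
    INSERT INTO dbo.TargetTable (Col1, Col2)
    SELECT DISTINCT s.Col1, s.Col2
      FROM dbo.Staging s
     WHERE NOT EXISTS (SELECT *
                         FROM dbo.TargetTable t
                        WHERE t.Col1 = s.Col1
                          AND t.Col2 = s.Col2)

    TRUNCATE TABLE dbo.Staging  -- empty the staging table for the next run
    ```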

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Loner (11/6/2007)


    I used to load multiple text files with different format into different tables in a database.

    Nicely done, Loner...

    --Jeff Moden



  • Just a thought, folks... Much like Loner's process, I do a lot of ETL using Bulk Insert with format files. The difference is that I keep all the files with the same format in the same sub-folder... if I have 5 different formats, I'll have 5 different sub-folders. Those sub-folders are also where I keep the BCP format file for each (when one is required).

    Like Loner's example, I have a "control table" that has the path name for the files to be imported and the name of the format file, too.

    How do I get the file names? Despite being an "undocumented feature", this works in 2k, 2k5, and (reportedly) 2k8...

    DECLARE @Path VARCHAR(300)
    SET @Path = 'C:\'

    CREATE TABLE #DirListing
    (
        ListingName VARCHAR(300),
        Depth       INT,
        IsFile      INT
    )

    INSERT INTO #DirListing (ListingName, Depth, IsFile)
    EXEC Master.dbo.xp_DirTree @Path, 1, 1

    SELECT RowNum    = IDENTITY(INT,1,1),
           FileName  = ListingName,
           Processed = 'N'
      INTO #FileInfo
      FROM #DirListing
     WHERE IsFile = 1
     ORDER BY ListingName

    SELECT * FROM #FileInfo

    ---- add a nice tight little WHILE LOOP here to Bulk Insert the files ----
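    One possible shape for that loop, picking up @Path and #FileInfo from the snippet above. The staging table and format file names are assumptions, not from the post:

    ```sql
    -- Walk #FileInfo one row at a time and BULK INSERT each unprocessed file.
    -- dbo.Staging and MyFormat.fmt are illustrative names only.
    DECLARE @RowNum INT, @FileName VARCHAR(300), @SQL VARCHAR(1000)

    SELECT @RowNum = MIN(RowNum) FROM #FileInfo WHERE Processed = 'N'
    WHILE @RowNum IS NOT NULL
      BEGIN
        SELECT @FileName = FileName FROM #FileInfo WHERE RowNum = @RowNum
        SET @SQL = 'BULK INSERT dbo.Staging FROM ''' + @Path + @FileName + ''''
                 + ' WITH (FORMATFILE = ''' + @Path + 'MyFormat.fmt'')'
        EXEC (@SQL)
        UPDATE #FileInfo SET Processed = 'Y' WHERE RowNum = @RowNum
        SELECT @RowNum = MIN(RowNum) FROM #FileInfo WHERE Processed = 'N'
      END
    ```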

    Of course, there's other processing that you may want to do... for example, you may want to use some CmdShell directives to move the processed files to a "Processed" folder and the like.

    Or, you can do it in DTS... I just don't because of the size of the files I import... I need all the speed I can get out of an import, and nothing is faster than Bulk Insert.

    --Jeff Moden


