Importing CSV files without knowing destination schema

  • Hi,

Thank you for the above script; I used it and it worked just fine. My question is about this part:

--===== Query one of the files by using a four-part name.
SELECT *
  FROM TxtSvr...Sample#txt

What if I have more than one file in TxtSvr; how would we load all of the files in one go? In the above example we can do it one file at a time. Is there a way to load all of the files listed under TABLE_NAME in TxtSvr into one table?

I apologize if I am asking questions about very simple stuff; this is my second week of trying to do this.

• That's actually a good question.

    It kind of depends on your data...

I'm still playing with Jeff Moden's awesome example, but with the OPENROWSET function, I think you need to know the actual file names... or use a text server and a cursor to get the file names, and then OPENROWSET to open each file.

If each file in the text server is different, or some are the same and others are not, you'll have to write something specific for each file or file type... like inserting 4 specific columns from TxtSvr...Inventory#txt, but 7 columns from TxtSvr...Accessories#txt.

If you know ALL the files have the same structure, it's not too difficult, but the only way I see to do it is with a cursor to loop through all the file names. That's just a bit different from the original poster's question, since you'd be importing into a table that already exists and just need the "new" data from multiple files.
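In case it helps, here's roughly what adding the text server looks like, based on the Books Online pattern for sp_AddLinkedServer. The directory path is only an example; point it at wherever your files live.

--===== Create a linked "text server" that points at a directory
--      of text files (the path is just an example)
EXEC dbo.sp_AddLinkedServer
        @server     = 'TxtSvr',
        @srvproduct = 'Jet 4.0',
        @provider   = 'Microsoft.Jet.OLEDB.4.0',
        @datasrc    = 'C:\DataImports',
        @provstr    = 'Text'

--===== List the "tables" (files) the text server can see
EXEC dbo.sp_Tables_Ex TxtSvr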

After you've added the text server, something like this would work:

--===== Need a temp table with all the file names
--      for a cursor to fiddle with
CREATE TABLE #TMP
        (
        TABLE_CAT   VARCHAR(100),
        TABLE_SCHEM VARCHAR(100),
        TABLE_NAME  VARCHAR(100),
        TABLE_TYPE  VARCHAR(100),
        REMARKS     VARCHAR(100)
        )

--===== Load the table using the proc sp_Tables_Ex
 INSERT INTO #TMP
  EXEC dbo.sp_Tables_Ex TxtSvr

DECLARE @textTable VARCHAR(64),
        @sql       VARCHAR(1000)

--===== Filter for just the files that meet my criteria, e.g. files
--      that have "EXP" in their names so I know they came from an
--      export process
DECLARE c1 CURSOR FOR
 SELECT TABLE_NAME
   FROM #TMP
  --WHERE CHARINDEX('EXP', TABLE_NAME) > 0

OPEN c1
FETCH NEXT FROM c1 INTO @textTable

WHILE @@FETCH_STATUS <> -1
BEGIN
        --===== Create the destination table if it does not exist;
        --      WHERE 1 = 2 copies the structure but inserts no rows
        IF NOT EXISTS (SELECT *
                         FROM sys.objects
                        WHERE object_id = OBJECT_ID(N'[dbo].[myDestinationTable]')
                          AND type IN (N'U'))
        BEGIN
                SET @sql = ' SELECT * INTO myDestinationTable FROM TxtSvr...'
                         + @textTable + ' WHERE 1 = 2 '
                PRINT @sql
                EXEC (@sql)
        END

        --===== Now insert the data. If this file doesn't match the
        --      schema of our destination table, the command will fail.
        SET @sql = 'INSERT INTO myDestinationTable '
        SET @sql = @sql + ' SELECT * FROM TxtSvr...' + @textTable
        PRINT @sql
        EXEC (@sql)

        FETCH NEXT FROM c1 INTO @textTable
END

CLOSE c1
DEALLOCATE c1
DROP TABLE #TMP
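If you'd rather use OPENROWSET inside the loop instead of the four-part name, the INSERT step above could be built like this instead. This is just a sketch: the directory is an example, and OPENROWSET needs the 'Ad Hoc Distributed Queries' server option enabled.

--===== Build the INSERT with OPENROWSET instead of a four-part name
--      (@textTable would come from the cursor above; the directory
--      is just an example)
DECLARE @textTable VARCHAR(64),
        @sql       VARCHAR(1000)
SET @textTable = 'Sample#txt'

SET @sql = 'INSERT INTO myDestinationTable '
         + 'SELECT * FROM OPENROWSET(''Microsoft.Jet.OLEDB.4.0'','
         + '''Text;Database=C:\DataImports'','
         + '''SELECT * FROM ' + @textTable + ''')'
PRINT @sql
EXEC (@sql)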


    Lowell


--Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement for that table to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

• rohanverma (9/3/2008)


    What if I have more than one file in TxtSvr, how would we load all of the files in one go? Is there a way to load all of the files in TxtSvr into one table?

With a little dynamic SQL on your part, if all the tables have the same column structure, you could end up with something like...

SELECT * FROM TxtSvr...Sample01#txt UNION ALL
SELECT * FROM TxtSvr...Sample02#txt UNION ALL
SELECT * FROM TxtSvr...Sample03#txt UNION ALL
SELECT * FROM TxtSvr...Sample04#txt UNION ALL
SELECT * FROM TxtSvr...Sample05#txt UNION ALL
SELECT * FROM TxtSvr...Sample06#txt
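If you still have the #TMP table of file names from the earlier post, the dynamic SQL to build that could be as simple as this. Just a sketch, assuming every file really does have the same structure:

--===== Build one big UNION ALL from the file names in #TMP
--      (assumes the #TMP table from the earlier post is populated)
DECLARE @sql VARCHAR(MAX)

 SELECT @sql = ISNULL(@sql + ' UNION ALL ', '')
             + 'SELECT * FROM TxtSvr...' + TABLE_NAME
   FROM #TMP

PRINT @sql --always eyeball dynamic SQL before running it
EXEC (@sql)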

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

• What if the file is either comma or pipe delimited, with quotes on fields that have an embedded comma or pipe? Also, the file names will be in a logging table, which will be used to load the files into the SQL database one by one.... I was able to do it using LogParser 2.2 by passing the file name and table name from variables in the procedure, but I'm not sure how robust that solution is... the text qualifier messes up the BULK INSERT I was thinking of using.
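From what I've read, the Jet text provider used earlier in this thread honors the double quote as a text qualifier, so embedded commas should survive a read like the one below. Just a sketch: the directory and file name are examples, and a pipe-delimited file would likely need a schema.ini in the data directory with Format=Delimited(|).

--===== The Text ISAM treats " as a text qualifier, so embedded
--      delimiters inside quoted fields are kept as data
--      (directory and file name are just examples)
SELECT *
  FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                  'Text;Database=C:\DataImports',
                  'SELECT * FROM QuotedSample#txt')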
