• unas_sasing - Tuesday, October 17, 2017 2:42 AM

    Thom A - Tuesday, October 17, 2017 2:27 AM

Providing a screenshot of the file names isn't really helpful; it tells us nothing about the files' contents. Do all the files have the same format: same columns, same data types, etc.? Can you provide sample file(s)?

    Do you have to do this in T-SQL? Would alternatives, such as SSIS be an option?

    Hi Thom,
All the files have different formats, for example:

    XLcardproduct

    xlcardatm

For SSIS, I want to process those files in a Foreach Loop, insert them into a temporary table, and then finally insert into the real staging table.

    Pictures are also not helpful I'm afraid, we can't interact with your data.

Will each file always be the same? So, for example, will the file xlcardatm always contain the same columns and data types? As each file is different, I don't think you need to loop through them. Looping is generally done when you have the same task to complete for different items; you're doing different tasks (as they have different definitions) for different files, thus each iteration would be different.

Will the files be stored in the same folder each time, overwriting the previous files, or is it a different directory each time?

Why do you want to store them in a temporary table and then a staging table? I'm not sure that'll work with SSIS, as the temporary table will be dropped once the session is closed; SSIS would load the data, disconnect, and your temporary table would be dropped. Load the data directly into your staging table, and then from there into your production table (doing any additional transformations).
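If you did end up doing the load in T-SQL rather than SSIS, the same principle applies: load into a permanent staging table, not a temporary one. A minimal sketch with BULK INSERT (the table name, columns, delimiters, and file path are all made up for illustration; match them to your actual file):

```sql
-- Hypothetical staging table for the xlcardatm file; adjust columns to the real layout.
CREATE TABLE dbo.Staging_xlcardatm
(
    CardNumber      varchar(20),
    ATMID           varchar(10),
    TransactionDate varchar(10)  -- load as text; convert when moving to production
);

-- Load the flat file straight into the permanent staging table,
-- avoiding a temporary table that would vanish when the session closes.
BULK INSERT dbo.Staging_xlcardatm
FROM 'C:\Imports\xlcardatm.csv'
WITH (FIRSTROW = 2,           -- skip the header row
      FIELDTERMINATOR = ',',  -- column delimiter
      ROWTERMINATOR = '\n');
```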

    For SSIS, you'll need to have x flat file connections, where x is the number of different files you have. Then, you can either build a different data flow for each file, or use one data flow, and have x Sources and Destinations (which do not interact with each other). After your data flow task(s) you'll need to use a T-SQL Task to complete the movement from staging to production.
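The statement inside that T-SQL task might look something like this (table and column names are hypothetical; the date conversion is just an example of a transformation you might apply):

```sql
-- Move rows from staging into production, applying any transformations,
-- then clear the staging table ready for the next load.
BEGIN TRANSACTION;

INSERT INTO dbo.Cardatm (CardNumber, ATMID, TransactionDate)
SELECT CardNumber,
       ATMID,
       TRY_CONVERT(date, TransactionDate)  -- example transformation
FROM dbo.Staging_xlcardatm;

TRUNCATE TABLE dbo.Staging_xlcardatm;

COMMIT TRANSACTION;
```

Wrapping the move and the truncate in one transaction means a failure part-way through leaves the staging data intact for a retry.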

Most likely you'll then want to use a File System Task to archive the files. How you archive depends on your setup. For example, if you want the whole process to fail if one load fails, then a single Data Flow Task would be better, and then you can use a Foreach Loop on your File System Task. If you want each file to be run and treated separately, independently of the others, you'd be better off having a separate task and data flow for each file; thus if one ETL process for a file fails, the others are unaffected and will still archive.

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk