• SQLCurious (4/24/2014)


    I have about 1200 SQL files in one of my folders. Almost all of these files do data inserts and updates, so they should be run only once. As and when required, I have manually run around 150 of them already. Whenever I run one of these scripts, I log its file name into a log table on my SQL Server, along with the execution time. Since running 1000+ more files takes a lot of time, I want to automate running these files through a batch file. But I also want to filter out the files that have already been run.
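    The filtering step described above can be sketched in a few lines. This is a minimal Python example, not a complete solution: the `executed` set is simulated in memory here, but in practice it would be loaded from the log table (e.g. via a query like `SELECT FileName FROM dbo.ScriptExecutionLog`, where the table name is hypothetical), and each pending file would then be passed to `sqlcmd` or similar.

    ```python
    from pathlib import Path

    def pending_scripts(folder, executed):
        """Return the .sql files in `folder` whose names are not yet logged.

        `executed` is a set of lower-cased file names already run; comparing
        case-insensitively matches SQL Server's usual collation behavior.
        """
        return sorted(
            p for p in Path(folder).glob("*.sql")
            if p.name.lower() not in executed
        )

    # Simulated log contents; in practice, load these names from the
    # log table on the server (e.g. with pyodbc).
    executed = {"001_insert_customers.sql", "002_update_orders.sql"}
    ```

    Each script that `pending_scripts` returns would be executed and then inserted into the log table in the same loop, so a re-run of the batch picks up exactly where the previous one stopped.
    
    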

    I am just wondering whether this is the right approach; it looks to me both error-prone and very time-consuming. Have you looked into implementing this in an SSIS package/framework? If the content of the files is loaded into a "command" table, a single package could run all the scripts and utilize the logging framework in SSIS.

    😎