SQLServerCentral is supported by Red Gate Software Ltd.
 

Create batch file to selectively run SQL files
Posted Thursday, April 24, 2014 2:33 AM
SSC Journeyman


Group: General Forum Members
Last Login: Wednesday, July 23, 2014 4:30 PM
Points: 76, Visits: 290
I have about 1200 SQL files in one of my folders. Almost all of these files do data inserts and updates, so they should be run only once. As and when required, I have manually run around 150 of them already. Whenever I run one of these scripts, I log the file name into a log table in my SQL Server, along with the execution time. Since running 1000+ more files takes a lot of time, I want to automate running these files through a batch file. But I also want to filter out the files that have already been run.

My file list looks as follows.

InsertToOrderTypes.sql
UpdateClientAddress.sql
DeleteDuplicateOrders.sql
InsertToEmailAddress.sql
ConsolidateBrokerData.sql
UpdateInventory.sql
EliminateInvalidOfficeLocations.sql

My log table in the database looks like this.

select * from sqlfileexecutionlog

FileName                   RunTime                  Result
--------                   -------                  ------
DeleteDuplicateOrders.sql  03/12/2014 14:23:45:091  Success
UpdateInventory.sql        04/06/2014 08:44:17:176  Success

Now I want to create a batch file to run the remaining files from my directory against my SQL Server. I also want to wrap each of these SQL file executions in a transaction and log success/failure, along with the runtime and file name, into the sqlfileexecutionlog table. As I add new SQL files to this directory, I should be able to run the same batch file and have it execute only the SQL files that have not been run.

It would be great if someone could give me pointers or sample code to achieve this via the regular command shell as well as PowerShell.
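For what it's worth, the filter-then-run flow can be sketched in Python; this is only an illustration of the logic (the server/database parameters, the -E trusted-connection switch, and the log insert left as a comment are assumptions, and the same idea ports to a batch or PowerShell script):

```python
import subprocess
from pathlib import Path

def files_to_run(folder_files, logged_files):
    """Return the .sql files that do not yet appear in the log table, in name order."""
    logged = {name.lower() for name in logged_files}
    return sorted(f for f in folder_files if f.lower() not in logged)

def run_pending(folder, logged_files, server, database):
    """Run each pending script with sqlcmd; -b makes sqlcmd abort the script and
    return a non-zero exit code on error, so success/failure can be logged."""
    for name in files_to_run([p.name for p in Path(folder).glob("*.sql")],
                             logged_files):
        result = subprocess.run(
            ["sqlcmd", "-S", server, "-d", database, "-E", "-b",
             "-i", str(Path(folder) / name)])
        status = "Success" if result.returncode == 0 else "Failure"
        # At this point one would INSERT name, GETDATE() and status into
        # sqlfileexecutionlog (for example via another sqlcmd -Q call).
        print(name, status)
```

The already-run names would come from a query such as `SELECT FileName FROM sqlfileexecutionlog`, exported once at the start of the run.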
Post #1564580
Posted Thursday, April 24, 2014 4:32 AM
SSCommitted


Group: General Forum Members
Last Login: Today @ 6:07 AM
Points: 1,778, Visits: 4,663
SQLCurious (4/24/2014)
I have about 1200 SQL files in one of my folders. Almost all of these files do data inserts and updates, so they should be run only once. As and when required, I have manually run around 150 of them already. Whenever I run one of these scripts, I log the file name into a log table in my SQL Server, along with the execution time. Since running 1000+ more files takes a lot of time, I want to automate running these files through a batch file. But I also want to filter out the files that have already been run.


I am just wondering whether this is the right approach; it looks to me both error prone and very time consuming. Have you looked into implementing this in an SSIS package/framework? If the content of the files is loaded into a "command" table, a single package could run all the scripts and utilize the logging framework in SSIS.
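For illustration, such a "command" table could look something like this (the table and column names here are made up, not from the post; the Result/RunTime columns mirror the existing sqlfileexecutionlog):

```sql
-- Hypothetical command table: one row per script file, loaded by the SSIS package
CREATE TABLE dbo.SqlFileCommand
(
    FileName   NVARCHAR(260) NOT NULL PRIMARY KEY,
    ScriptText NVARCHAR(MAX) NOT NULL,
    Executed   BIT           NOT NULL DEFAULT (0),
    RunTime    DATETIME      NULL,
    Result     VARCHAR(20)   NULL
);
```

A Foreach Loop with an Execute SQL Task could then run each row where Executed = 0 inside a transaction and update Executed, RunTime and Result afterwards.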
Post #1564599
Posted Thursday, April 24, 2014 10:20 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Thursday, August 14, 2014 6:19 AM
Points: 100, Visits: 563
This should work if you run it from the folder where your files are; quick and dirty:

for %%R in (*.sql) do sqlcmd -i "%%R" -S {server_name} -U {user} -P {pwd} -d {db_name}

(Use a single % instead of %% if you type it directly at the prompt rather than putting it in a .bat file.)

This won't stop on any errors, but you can add some handling; for example, sqlcmd's -b option makes it abort a script on error and return a non-zero errorlevel that the batch file can check.
Post #1564755
Posted Friday, April 25, 2014 5:22 AM
SSC Veteran


Group: General Forum Members
Last Login: Friday, June 27, 2014 6:25 AM
Points: 249, Visits: 529
Well, try this one. I am not sure exactly what you want to do, but here is my best attempt. Create a DOS batch file using the following batch script (%~dp0 expands to the folder the batch file lives in):

@echo off
ECHO :ON ERROR EXIT>"All in one Script.sql"
ECHO PRINT 'Start: ' + CONVERT(VARCHAR, GETDATE(), 108)>>"All in one Script.sql"
ECHO :setvar path "%~dp0">>"All in one Script.sql"
for %%f in (*.sql) do call :ACTION "%%~nxf"
goto END

:ACTION
if /i "%~1"=="All in one Script.sql" exit /b
echo.>>"All in one Script.sql"
echo :r "$(path)%~1">>"All in one Script.sql"
echo GO>>"All in one Script.sql"
echo PRINT '[%~1] Completed Successfully: ' + CONVERT(VARCHAR, GETDATE(), 108)>>"All in one Script.sql"
echo GO>>"All in one Script.sql"
exit /b

:END

The above code loops through all the .sql files; if you want to change what is done for each file, edit the :ACTION block.

Just paste the batch file inside the folder and run it.

It only prepares a SQLCMD-mode script file; you need to verify that file before you run it.
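For reference, the generated "All in one Script.sql" is a SQLCMD-mode file that would look roughly like this (the folder name is illustrative):

```sql
:ON ERROR EXIT
PRINT 'Start: ' + CONVERT(VARCHAR, GETDATE(), 108)
:setvar path "C:\Scripts\"

:r "$(path)InsertToOrderTypes.sql"
GO
PRINT '[InsertToOrderTypes.sql] Completed Successfully: ' + CONVERT(VARCHAR, GETDATE(), 108)
GO
```

You would then run it with sqlcmd -i "All in one Script.sql", and the :ON ERROR EXIT directive stops it at the first failing script.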




Every rule in a world of bits and bytes can be bent or eventually broken
MyBlog About Common dialog control
A Visualizer for viewing SqlCommand object script
Post #1565027
Posted Monday, April 28, 2014 10:47 AM
SSC Journeyman


Group: General Forum Members
Last Login: Wednesday, July 23, 2014 4:30 PM
Points: 76, Visits: 290
Thanks for all your inputs and suggestions. I will work towards implementing this in SSIS, which seems to give better control. I appreciate everyone's time and effort in providing valuable suggestions.
Post #1565655
Posted Monday, April 28, 2014 11:41 AM
SSCommitted


Group: General Forum Members
Last Login: Today @ 6:07 AM
Points: 1,778, Visits: 4,663
SQLCurious (4/28/2014)
Thanks for all your inputs and suggestions. I will work towards implementing this in SSIS, which seems to give better control. I appreciate everyone's time and effort in providing valuable suggestions.

Let us know if you need help with the SSIS implementation, there are quite a few samples available.
Post #1565666
Posted Monday, April 28, 2014 12:15 PM
SSC Journeyman


Group: General Forum Members
Last Login: Wednesday, July 23, 2014 4:30 PM
Points: 76, Visits: 290
It would be great if you could provide me some :). I am also looking for a framework that can be used generally to log the execution results (time taken, success, failure, row counts, etc.) for each of the packages executed. Once it is set up, I would like it to be reusable for any additional packages I may add in the future. I would greatly appreciate it if you could provide me with such examples.
Post #1565672
Posted Monday, April 28, 2014 12:42 PM
SSCommitted


Group: General Forum Members
Last Login: Today @ 6:07 AM
Points: 1,778, Visits: 4,663
I'll dig into it and let you know. I'm certain I have something, and probably others have something similar; let's hope they share.
Post #1565678
Posted Tuesday, April 29, 2014 7:59 PM


SSCrazy


Group: General Forum Members
Last Login: Friday, August 22, 2014 12:24 PM
Points: 2,829, Visits: 8,477
Why are you running code from files instead of having the code in stored procedures and then executing the stored procedures? Is there an advantage to files?


Post #1566191
Posted Monday, May 5, 2014 5:20 PM
SSC Journeyman


Group: General Forum Members
Last Login: Wednesday, July 23, 2014 4:30 PM
Points: 76, Visits: 290
These are mostly for one-time deployment. I wouldn't be reusing them at all.
Post #1567742