Executing a script from SQL Server

I was recently asked how a script could be scheduled within SQL Server. This is actually a fairly easy thing to do and something I use quite often, so I was surprised at the question. I asked a few colleagues how they would do it and found that a number of people did not know how this could be done. So here I am, writing a quick tidbit on scheduling scripts. This information can be used to schedule a script or just to set up a repeatable process that you execute on demand.

There are two basic methods for scheduling a script that I will show here and discuss below: using xp_cmdshell and transforming the script into a stored procedure.

xp_cmdshell – A Quick Look

xp_cmdshell essentially creates a command prompt (or MS-DOS prompt) and then runs your commands as though they were typed in that window. The syntax is:

Exec master..xp_cmdshell '<command>'

where <command> is the command you want to execute.

Here is an example that will execute the script c:\listtables.sql.

exec master..xp_cmdshell 'isql -Sdev_sjones -E -i"c:\listtables.sql" -diqd'

which produces this output:

name
--------------------------------------------------
Cat
Product
ProdCat
ProductSKU
ProductBundle 
Member
dtproperties 
(7 rows affected)

(I removed some of the spacing for brevity)

See the detailed discussion below for more information about this process and how to develop a solution using this technique.

Transforming The Script

Any script can be transformed into a stored procedure with the addition of the following code:

Create procedure <object_name>
as
<include the script here>

Here is a quick example: suppose you had a script to add a new user to the server, like the following:

exec sp_addlogin 'steve', 'longpassword', 'dev'
exec sp_adduser 'steve', 'steve', 'public'
go

You could easily turn this into a procedure that looks like this:

Create procedure dbsp_adduser
as
   exec sp_addlogin 'steve', 'longpassword', 'dev'
   exec sp_adduser 'steve', 'steve', 'public'
return

Admittedly, this is not a very useful procedure for the future, but it turns the script into a stored procedure that can be executed by another procedure, the SQL Agent, or any front-end tool of choice. You can even send parameters into the stored procedure to allow the script to change to meet changing needs (a sketch of this follows below). This method starts to turn your scripting into true development, but it gives a great deal of added flexibility. Almost all scripts can easily be turned into stored procedures, and if you have any difficulty, please post your script in our forums or use the "Your Opinion" button below and I will be more than happy to assist you in transforming your script.
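For instance, a parameterized version of the procedure above might look something like the following rough sketch. The procedure name and parameters here are placeholders chosen for illustration; adjust them to your own standards.

Create procedure dbsp_adduser_param
    @login sysname,                 -- login name to create
    @password sysname,              -- password for the new login
    @defdb sysname = 'dev'          -- default database for the login
as
    -- create the login at the server level
    exec sp_addlogin @login, @password, @defdb
    -- add the login as a user of the current database, in the public group
    exec sp_adduser @login, @login, 'public'
return

It could then be called with something like exec dbsp_adduser_param 'steve', 'longpassword', 'dev' from a job, another procedure, or a front-end tool.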

xp_cmdshell – Detailed Discussion

xp_cmdshell is a very powerful tool, one that I use quite often for a variety of purposes. I had expected that most DBAs would use it as well, but when I asked, very few did. An example of how I use this would be to get a folder listing of the root of the C: drive. At a command prompt, you would type "dir c:" and receive something like this:

C:\>dir
 Volume in drive C has no label.
 Volume Serial Number is 07D0-0419

 Directory of C:\

07/20/2000  02:20p       <DIR>          Inetpub
07/20/2000  04:31p                  735 STATS.LOG
07/21/2000  02:03p       <DIR>          download
04/25/2000  01:56a       <DIR>          BACKUP
07/21/2000  02:17p       <DIR>          My Download Files
04/25/2000  02:00a       <DIR>          I386
07/31/2000  10:10a       <DIR>          Windows Update Setup Files
04/25/2000  02:05a       <DIR>          WINNT
04/25/2000  02:08a       <DIR>          Documents and Settings
04/25/2000  02:08a       <DIR>          Program Files
04/27/2000  05:11p            4,569,727 drwtsn32.log
               2 File(s)      4,570,462 bytes
              19 Dir(s)  11,774,164,992 bytes free

I deleted a few items from the listing, but this is basically what would be returned. To run this from within SQL Server, you would type:

exec master..xp_cmdshell 'dir c:'

and you would receive a single-column result set with the same output as above. When would I use this query? When creating a database and not sitting at the console of the server. This gives me a quick glance at the disk space available on each drive and which files are there.
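Because xp_cmdshell returns its output as a single-column result set, you can also capture that output into a table and filter it with T-SQL. Here is a rough sketch of the idea; the temporary table #dirout and its column name are simply names chosen for this example.

-- capture the directory listing so it can be filtered with T-SQL
create table #dirout( line varchar( 255) null)

insert #dirout( line)
exec master..xp_cmdshell 'dir c:'

-- for example, show only the line reporting free space
select line
from #dirout
where line like '%bytes free%'

drop table #dirout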

I also use this for items like DBCC where I want to schedule the command, but do not want it to run from my console and need to capture the output. I also use it to check network connectivity. With some knowledge of NT/2000 command-line utilities, this can be an extremely powerful tool for a DBA. Even though the need to do this has diminished in SQL v7.x, to demonstrate this utility, let's look at the DBCC example.

Scheduled DBCC

One thing that I like to run to check on tables is DBCC SHOWCONTIG, which gives me information on the fragmentation of a table. However, I like to schedule things to run by themselves periodically and then notify me when they are done rather than trusting myself to remember to run them.

The DBCC command that I most often run is:

DBCC SHOWCONTIG(962102468)   

where the number is the object_id of one of my tables. This command (prior to SQL 2000) requires the object_id rather than the name of the table, so scripting this makes it much easier for me to get the data I want about the table. The script I use to generate this query is below.

Declare
    @cmd char( 200)
    , @I int
select @I = object_id('member')
Select @cmd = 'dbcc showcontig( ' + rtrim( convert(varchar( 30), @I)) + ')'  

This will give me the DBCC command stored in the @cmd variable. If you print this variable at this point, the output is the valid SQL command shown above. I now place this in a file with the extension .sql. This file must be plain text, not Word or Rich Text format. I called my file showcontig.sql. Its contents are a single line of text:

DBCC SHOWCONTIG (962102468)   

Now I can create more statements for other tables and add them to the showcontig.sql file as well until my file looks like this:

DBCC SHOWCONTIG (962102468)
DBCC SHOWCONTIG (773577794)
DBCC SHOWCONTIG (565577053)

This is a valid script that I could open in Query Analyzer and execute, but now I create an xp_cmdshell command to run this script:

Exec master..xp_cmdshell 'isql -Sdev_sjones -Usa -E -i "c:\showcontig.sql" >showcontig.txt'

The statement that will be executed at the command line is a valid command-line statement that will connect to my server (dev_sjones) and execute the query (DBCC SHOWCONTIG) against the default database. I usually have a few more options on this isql statement, but for security reasons I have omitted them. It is important to note that I have specified the path for the showcontig.sql file that will be executed. You should specify the full path to the file on the SERVER, not on your local workstation. This will actually be run on the SQL Server as if you were sitting at the console.

Of course, this is not a very maintainable or efficient way to run my DBCC commands, so now I will look at the method I actually use to schedule these commands.

Transforming Scripts – Detailed Discussion

In the example above, I used a short script to generate T-SQL commands that I placed in an .sql file for execution using ISQL.EXE. Now I will take this a step further, generate the ISQL commands dynamically, and save the results. I will present the script first and then explain it.

Declare
    @cmd char( 200)
    , @I int
    , @qry varchar( 250)

select @I = object_id('member')
Select @cmd = 'dbcc showcontig( ' + rtrim( convert( varchar( 30), @I)) + ')'
Select @qry = 'isql -Sdev_sjones -Usa -E -Q "' + rtrim( @cmd) + '" >showcontig.txt'
Exec master..xp_cmdshell @qry

select @I = object_id('order')
Select @cmd = 'dbcc showcontig( ' + rtrim( convert( varchar( 30), @I)) + ')'
Select @qry = 'isql -Sdev_sjones -Usa -E -Q "' + rtrim( @cmd) + '" >>showcontig.txt'
Exec master..xp_cmdshell @qry

select @I = object_id('lineitem')
Select @cmd = 'dbcc showcontig( ' + rtrim( convert( varchar( 30), @I)) + ')'
Select @qry = 'isql -Sdev_sjones -Usa -E -Q "' + rtrim( @cmd) + '" >>showcontig.txt'
Exec master..xp_cmdshell @qry

This script is very similar to the previous one, except now I store the entire ISQL.EXE command in a variable @qry. If you replace the exec master..xp_cmdshell with the print command, then you will see the following output:

isql -Sdev_sjones -Usa -E -Q "dbcc showcontig( 962102468)" > showcontig.txt
isql -Sdev_sjones -Usa -E -Q "dbcc showcontig( 565577053)" >> showcontig.txt
isql -Sdev_sjones -Usa -E -Q "dbcc showcontig( 789577851)" >> showcontig.txt

Notice that after the first statement I change the redirector from > to >> so the output will be appended to the file rather than overwriting it. If I type this output into a command window and run each line, I will get the output from the DBCC commands in a single file (showcontig.txt) that I can then examine.
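If I do not want to leave Query Analyzer to look at the results, the same xp_cmdshell technique will display the file for me. This is a minimal sketch, assuming the file was written to the default directory used by the redirects above:

exec master..xp_cmdshell 'type showcontig.txt'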

To transform this script, I add the following line to the top of the script:

Create procedure dbsp_dbcc_showcontig as

And a return at the bottom. This gives me a stored procedure that I can schedule using SQL Agent to run on a periodic basis. The actual procedure that I use has a few more tweaks (like a cursor to get all table names), and my scheduled job has a second step to email the showcontig.txt file to the DBA group after each execution.
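To give an idea of what such a procedure might look like, here is a rough sketch of a cursor-based version. Treat it as a starting point rather than the finished procedure described above: the cursor simply walks the user tables in sysobjects and reuses the isql/xp_cmdshell pattern, with dev_sjones and showcontig.txt as the same placeholders used earlier.

Create procedure dbsp_dbcc_showcontig
as
Declare
    @cmd varchar( 200)
    , @qry varchar( 250)
    , @I int
    , @first bit

select @first = 1

-- walk every user table in the current database
Declare tbls cursor fast_forward for
    select id from sysobjects where type = 'U'

open tbls
fetch next from tbls into @I

while @@fetch_status = 0
begin
    select @cmd = 'dbcc showcontig( ' + rtrim( convert( varchar( 30), @I)) + ')'

    -- overwrite the file for the first table, append for the rest
    if @first = 1
        select @qry = 'isql -Sdev_sjones -Usa -E -Q "' + @cmd + '" >showcontig.txt'
    else
        select @qry = 'isql -Sdev_sjones -Usa -E -Q "' + @cmd + '" >>showcontig.txt'

    exec master..xp_cmdshell @qry

    select @first = 0
    fetch next from tbls into @I
end

close tbls
deallocate tbls
return

A procedure like this can then be wrapped in a SQL Agent job (built through Enterprise Manager or msdb's sp_add_job, sp_add_jobstep, and sp_add_jobschedule) with a second step that mails out the showcontig.txt file.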

Conclusion

These two techniques can be used in a myriad of ways to make administering your SQL Server easier. One could take any batch that is used by a DBA and schedule it or transform it into a repeatable process. The key to becoming a more efficient DBA (or IT professional in general) is to find those processes that you perform over and over and make them easier, or make them easy enough to push out to a power user who can perform them without requiring the full knowledge of the DBA.

Once you can create repeatable processes, you will make your job easier, reduce mistakes and typographical errors, and create more time to be proactive in administering your databases. I hope these two techniques will help you in achieving your objectives and as always, please let me know how you use these techniques in your daily work.

As always, I welcome feedback using the "Your Opinion" button below. Please use this for questions or comments as opposed to emailing me.

Steve Jones

©dkRanch.net September 2001
