March 11, 2004 at 8:40 am
Currently we have about 10 SQL Servers in prod, and I am using SQL Server Agent jobs to run backups, integrity checks, etc. We recently purchased BMC's Control-M, a platform-independent scheduling package, to schedule jobs across mainframe, AIX, Windows, etc. My boss was asking me if it makes sense to switch the SQL Server jobs to this or leave them alone.
Does anyone use anything other than SQL Server Agent Jobs to run stuff?
Any good reasons to switch? I can't think of any. Reasons not to switch?
I cannot figure out HOW I would script a DTS package out to a file so that this scheduling package could run it. I also don't really want to script out all of the jobs to files for it to run, either...
Any ideas?!
March 12, 2004 at 10:42 am
The best I can offer is - I personally LIKE using SQL Agent to schedule SQL jobs. It's designed for that purpose, and it works very well (as long as Agent isn't brought down without your knowledge). I don't have experience with 3rd party schedulers, but I have had bad experiences with 3rd party backup software. It was enough to eliminate pressures to migrate away from the SQL provided tools. My take - If it ain't broke, don't fix it!
Steve
March 12, 2004 at 11:28 am
Hi, my friend. I work at a Venezuelan steel company, and we have many SQL Servers in prod. We use an external scheduler to schedule SQL, Windows, and mainframe jobs, and it works very well together. My take is: try it and use it. Remember that IT people must learn everything they can...
P.S. Please excuse my English...
March 12, 2004 at 1:50 pm
So did you write scripts to back up all your SQL Server DBs and transaction logs then? How do you have it automatically delete old ones after x days?
March 15, 2004 at 2:10 pm
I used to work in an environment which used three different scheduling systems.
I did not convert my SQL jobs to run either on the mainframe or through a 3rd party network scheduling system.
My SQL jobs contain "alerts" and e-mail notifications.
I suggest you go with whatever seems to make the most sense.
I don't know for certain that the following is true, but you can consider it.
If you script all your SQL Server jobs for other job schedulers, then SQL Server service packs or version upgrades could break those jobs. I have found that when applying service packs, etc., SQL Server applies the necessary maintenance to keep Agent jobs from breaking.
SOooo, what works now does not require manual intervention after the SP or upgrade to continue working.
Anyone else care to chime in on their experiences?
GaryA
March 15, 2004 at 2:14 pm
Oh, I never thought of that.... and SQL Server upgrades too... wow, that is big.....
Thanks !
March 15, 2004 at 5:50 pm
Markus,
I'm sorry that I haven't responded in a more timely manner. I just checked my e-mail and saw that there were several new responses. In my situation, I do have stored procedures to handle the backups, and also one that handles deleting the old backup files, and these are scheduled through SQL Agent. I'm not at work right now, but I'll try to post these tomorrow.
Steve
March 16, 2004 at 7:24 am
Markus,
Here's the script to create the full backup stored procedure that I use. With only slight modification, it can be used for differential and transaction log backups as well.
Steve
create proc af_db_backup_full
@path1 varchar(50),
@dbname varchar(50),
@retention int
as
/*************************************************************/
--
-- FILENAME FORMAT-- dbname_type_ret_datetime.bak
--
-- Module Name: af_db_backup_full
--
-- Description: Receives 3 parms.
-- @path1 : where the backup is to be written
-- @dbname: database to backup
-- @retention: # of days to retain backup
--
-- Written By: Mike Taylor & Steve Phelps
--
-- Date: May 11, 1999 (rewritten 10/18/01 by Steve Phelps)
--
-- Modified : Removed all e-mail logic and renamed.
-- Date: Aug 19, 2002
--
-- Modified : added retention logic.
-- Date: Sep 24, 2002
--
-- Modified : modified date/time and retention logic.
-- removed @date parm and made local variable.
-- Date: Jan 13, 2003
--
-- USAGE:
-- Declare @dbname varchar(50),
-- @path varchar(50),
-- @retention int
--
-- select @dbname = 'PUBS'
-- select @path = 'e:\SQL_Backups\'
-- select @retention = 7 -- use 99 for permanent retention
--
-- exec af_db_backup_full
-- @path,
-- @dbname,
-- @retention
/*************************************************************/
-- the following declare must be removed. its here for testing.
-- declare
-- @path1 varchar(50),
-- @dbname varchar(50),
-- @retention int
-- select @dbname = 'PUBS', @path1 = 'D:\db_backup\', @retention = 12
Declare @path varchar(100),
@date datetime,
@date2 varchar(50),
@tim varchar(10),
@query_string varchar(200),
@ret_char char(2)
select @date = getdate()
select @ret_char =
CASE
WHEN @retention IS NULL THEN '00'
WHEN @retention < 0 THEN '00'
WHEN @retention < 10 THEN '0' + ltrim(convert(char(2), @retention))
WHEN @retention > 99 THEN '99'
ELSE convert(char(2), @retention)
END
-- format the date/time stamp for the filename.
select @date2 = convert(char(8),@date,112)
select @tim = convert(char(10),@date,108)
select @tim = replace(@tim,':','')
select @date2 = @date2 + rtrim(@tim)
select @path = @path1 + @dbname + '_full_' + @ret_char + '_' + @date2 + '.bak'
select @query_string = 'BACKUP DATABASE ' + @dbname + ' to Disk = ''' + @path + ''' with init'
--following select for testing only
--select @query_string
exec ( @query_string )
-- BACKUP DATABASE PUBS to Disk = 'F:\db_backup\PUBS_full_07_20011018114047.bak' with init
GO
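For anyone wanting to wire this into SQL Agent, here's a minimal sketch of a job step that calls the proc above. The job name, step name, and path are made up for illustration, and a real job would also need sp_add_jobschedule and sp_add_jobserver calls to schedule and target it.

```sql
-- minimal sketch only; job name, step name, and path are assumptions.
-- A real job also needs sp_add_jobschedule / sp_add_jobserver calls.
USE msdb
GO
EXEC sp_add_job @job_name = 'Nightly full backup - PUBS'
EXEC sp_add_jobstep
    @job_name      = 'Nightly full backup - PUBS',
    @step_name     = 'Run af_db_backup_full',
    @subsystem     = 'TSQL',
    @database_name = 'master',
    @command       = 'exec af_db_backup_full ''e:\SQL_Backups\'', ''PUBS'', 7'
```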
March 16, 2004 at 7:28 am
Here's my script for deleting the old backups (Note that both of these were written by a former colleague of mine, and extensively modified by me.)
Create Procedure af_Delete_Old_Backups (@path varchar(255))
as
/********************************************************************
*****
***** FILENAME FORMAT-- dbname_type_ret_datetime.bak
*****
***** Use p_delete_old_backups if the filename has the
***** old format (dbname_type.datetime)
*****
***** Module Name: af_Delete_Old_Backups
*****
***** Description: Receives 1 parm which contains
***** the path to the backup files. The
***** Procedure will go out to disk and delete dumps
***** that are older than the number of retain days.
*****
***** Written By: Mike Taylor
***** Date: May 11, 1999
*****
***** Modified : Steve Phelps
***** Date: 9/08/99 removed logic to skip sys dbs.
***** 9/28/99 added logic to check date format.
***** 5/01/01 changed format of filename.
***** 5/07/01 modified for additional date format,
***** and changed variable names for readability.
***** 9/30/02 removed retain days parm, now included
***** in the backup filename itself.
***** 1/23/03 changed the @hold_days/@retain_days comparison
***** to >= instead of >. Also added logic that if
***** @retain_days = 99, retention is permanent.
***** USAGE: Exec af_Delete_Old_Backups 'D:\DB_Backup'
********************************************************************/
Declare @todays_date datetime,
@command varchar(128),
@dot_pos int,
@date_pos int,
@date_len int,
@nextfile char(100),
@file_date_txt varchar(10),
@file_date_dt datetime,
@hold_days int,
@date_format_ck int,
@result int,
@ret_pos int,
@retain_days char(2)
/*
--declare for testing
-- declare @path varchar(255)
-- select @path = 'd:\db_backup\'
*/
/* temporary table to hold file names to be deleted */
Create Table #hold_file_name
( #file_name varchar(100) null)
select @todays_date = getdate()
/* build/execute command to populate temporary table */
select @command = 'master..xp_cmdshell ' + '"' + 'dir ' + @path + '/b /A:-D' + '"'
insert #hold_file_name
exec (@command)
/* loop thru all files in temporary table */
select @nextfile = ' '
While @nextfile is not NULL
Begin
start:
Select @nextfile = MIN(#file_name)
From #hold_file_name
Where #file_name > @nextfile
/*
-- 9/8/99 removed this logic...don't wanna hold these forever....sbp
-- --DO NOT DELETE system database dumps (Master and MSDB)
-- if @nextfile like 'master%' or
-- @nextfile like 'msdb%'
-- begin
-- goto start
-- end
*/
If @nextfile is NULL
Begin
Break
END
/* find the . between the file name and extension */
select @dot_pos = charindex('.',@nextfile)
if @dot_pos = 0
Begin
/* no '.' in file name, go get next file */
GOTO start
End
/* assume 12 char datetime for 1st try, if not, try 14 */
select @date_len = 12
extract_date:
/* modified next line for new filename format SBP 5/1/01 */
select @date_pos = @dot_pos - @date_len
/* get 8 character date that the file was created */
select @file_date_txt = substring(@nextfile,@date_pos,8)
/* the following format checking logic added by steve phelps, 9/28/99 */
/* verify that @file_date_txt is in date format. if not, get next file. */
select @date_format_ck = isdate(@file_date_txt)
if @date_format_ck = 0
Begin
if @date_len = 12
begin
select @date_len = 14
GOTO extract_date
end
/* execute next line only if failed for 14 char date */
GOTO start
End
/* find the location of the retention period in the file name */
select @ret_pos = @date_pos - 3
/* verify that the retention period value is numeric */
select @result = isnumeric(substring(@nextfile,@ret_pos, 2))
if @result = 0
Begin
GOTO start
End
/* extract the retention period from the file name */
select @retain_days = substring(@nextfile, @ret_pos, 2)
/* added 1/23/03 - if @retain_days = 99, retention is permanent. */
if @retain_days = '99'
goto start
/* @file_date_txt is in date format, convert to datetime format */
select @file_date_dt = convert(datetime,@file_date_txt)
/* determine number of days between current date and date file was created */
select @hold_days = datediff(dd,@file_date_dt,@todays_date)
/* modified 1/23/03 - changed from > to >= (sbp) */
if @hold_days >= cast(@retain_days as int)
/* file is older than we want to keep, delete it */
Begin
select @command = 'master..xp_cmdshell ' + '"' + 'del ' + @path + ltrim(rtrim(@nextfile)) + '"'
exec (@command)
End
End
/* cleanup */
drop table #hold_file_name
GO
-- edited this post because of an inadvertent smiley face showing up in the code. Steve
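To make the substring arithmetic above easier to follow, here's a small sketch that walks one assumed filename (in the dbname_type_ret_datetime.bak format the backup proc produces) through the same parsing logic:

```sql
-- sketch only; the filename is an assumed example.
declare @f varchar(100), @dot_pos int, @date_pos int
select @f = 'PUBS_full_07_20011018114047.bak'
select @dot_pos  = charindex('.', @f)        -- position of '.bak'
select @date_pos = @dot_pos - 14             -- 14-char datetime stamp
select substring(@f, @date_pos, 8)           -- '20011018' = backup date
select substring(@f, @date_pos - 3, 2)       -- '07' = retention in days
```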
April 6, 2004 at 10:10 am
Same issue here... We use Autosys for job scheduling. Since all the jobs are being controlled via Autosys and each job has dependencies associated with it, SQL backups should be no different. If the SQL backup fails for any reason, we are notified via the job monitors. Once the SQL backup is complete, other jobs in the batch are executed by virtue of the SQL backup finishing with a success.
I set up the SQL backups to run via DTS, and Autosys uses the DTSRUN command line utility. This works for us.
Russell
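For what it's worth, here's a hedged sketch of the kind of DTSRUN invocation Russell describes. The server and package names are made up; the scheduler would run DTSRUN straight from the command line, and it's only wrapped in xp_cmdshell here so it can be tried from Query Analyzer.

```sql
-- sketch only; MYSERVER and the package name are assumptions.
-- /S = server, /E = trusted connection, /N = package name
exec master..xp_cmdshell 'DTSRUN /S MYSERVER /E /N "Nightly SQL Backups"'
```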