To add to the reasons not to use undocumented (or, sometimes, even documented) MS-provided code...
The developers of SQL Server occasionally make significant performance mistakes in their code, just like the rest of us.
I absolutely agree that msdb.dbo.sysjobschedules will never have enough rows in it to become a real performance concern. I am, however, concerned that someone may copy the code from the msdb.dbo.agent_datetime() function that MS provided and use it for much larger things. If someone were to use that function to convert million-row inputs multiple times per day, they could be stressing the server just because of the way the function was built. To summarize, and as Michael suggested, conversions to character-based datatypes can take a real toll on performance.
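For reference, the character-based approach builds the DATETIME by converting both integers to strings, concatenating the pieces, and parsing the result back into a DATETIME. The sketch below is a paraphrase of that idea for illustration, not MS's exact function text (check the actual definition in msdb if you're curious); the sample values are assumed:

```sql
-- Character-based approach, similar in spirit to msdb.dbo.agent_datetime.
-- Each call does several integer-to-string conversions, string
-- concatenations, and a final string-to-DATETIME parse.
DECLARE @date INT = 20120315,  -- yyyymmdd (example value)
        @time INT = 143005;    -- hhmmss   (example value)

 SELECT CONVERT(DATETIME,
        CONVERT(VARCHAR(4), @date / 10000)        + '-'  -- year
      + CONVERT(VARCHAR(2), @date % 10000 / 100)  + '-'  -- month
      + CONVERT(VARCHAR(2), @date % 100)          + ' '  -- day
      + CONVERT(VARCHAR(2), @time / 10000)        + ':'  -- hour
      + CONVERT(VARCHAR(2), @time % 10000 / 100)  + ':'  -- minute
      + CONVERT(VARCHAR(2), @time % 100))                -- second
;
```

Compare that with Michael's method further down, which stays in the integer domain the whole way with DATEADD and never touches a string.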
Of course, such statements on performance of code aren't worth a hoot without a little evidence. 🙂
Here's the typical million row test table...
--=====================================================================================================================
-- Create and populate the test table.
-- Nothing in this section is a part of the solutions being tested.
-- We're just building the test data here.
--=====================================================================================================================
--===== Conditionally drop the test table to make reruns easier in SSMS
IF OBJECT_ID('tempdb..#TestTable','U') IS NOT NULL
DROP TABLE #TestTable
;
--===== Create and populate the test table on the fly
 WITH
cteGenDates AS
(
 SELECT TOP (1000000)
        SomeDateTime = RAND(CHECKSUM(NEWID()))*DATEDIFF(dd,'2012','2013')+CAST('2012' AS DATETIME)
   FROM sys.all_columns ac1,
        sys.all_columns ac2
)
 SELECT next_run_date = CAST(CONVERT(CHAR(8),SomeDateTime,112) AS INT),
        next_run_time = CAST(REPLACE(CONVERT(CHAR(8),SomeDateTime,108),':','') AS INT)
   INTO #TestTable
   FROM cteGenDates
;
GO
And here's the test harness. You'll need to set up SQL Profiler to measure this one because SET STATISTICS TIME ON unfairly slows the MS code down by a large amount!
--=====================================================================================================================
-- Do the test using the solutions found so far for converting Integer-based Dates and Times to DATETIME values.
-- To take display times out of the picture, all results are dumped to a "bit-bucket" variable.
-- RUN THIS TEST WITH SQL PROFILER RUNNING TO SEE THE PERFORMANCE DIFFERENCES.
-- Don't use SET STATISTICS TIME ON for this test because it really makes the MS code suffer.
--=====================================================================================================================
GO
--===== Michael's Solution ============================================================================================
--===== Declare the "bit-bucket" variable
DECLARE @Bitbucket DATETIME;
--===== Run the test
 select @Bitbucket =
        -- convert date
        dateadd(dd,((next_run_date)%100)-1,
        dateadd(mm,((next_run_date)/100%100)-1,
        dateadd(yy,(nullif(next_run_date,0)/10000)-1900,0)))+
        -- convert time
        dateadd(ss,next_run_time%100,
        dateadd(mi,(next_run_time/100)%100,
        dateadd(hh,nullif(next_run_time,0)/10000,0)))
   from #TestTable
;
GO
--===== msdb.dbo.agent_datetime Function ==============================================================================
--===== Declare the "bit-bucket" variable
DECLARE @Bitbucket DATETIME;
--===== Run the test
SELECT @Bitbucket = msdb.dbo.agent_datetime(next_run_date,next_run_time)
FROM #TestTable
;
GO
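Before trusting the timings, it's worth a quick sanity check that the two methods actually agree. This little snippet (the sample values are assumed, and it requires access to msdb) runs both conversions side by side on a single pair of integers:

```sql
-- Sanity check: both conversions should return the same DATETIME.
DECLARE @next_run_date INT = 20120315,  -- yyyymmdd (example value)
        @next_run_time INT = 143005;    -- hhmmss   (example value)

 SELECT MichaelsResult = dateadd(dd,((@next_run_date)%100)-1,
                         dateadd(mm,((@next_run_date)/100%100)-1,
                         dateadd(yy,(nullif(@next_run_date,0)/10000)-1900,0)))+
                         dateadd(ss,@next_run_time%100,
                         dateadd(mi,(@next_run_time/100)%100,
                         dateadd(hh,nullif(@next_run_time,0)/10000,0))),
        MsResult       = msdb.dbo.agent_datetime(@next_run_date,@next_run_time)
;
-- Both columns should come back as 2012-03-15 14:30:05.000
```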
Here are the results on my 9-year-old, single-CPU war horse...
Heh... Michael must be slipping... his code is "only" 13 times faster. 😛
--Jeff Moden
Change is inevitable... Change for the better is not.