Better Practices for writing this query?

  • I'm trying to put this basic code below into a parameterized view and need to simplify it where possible... any ideas? For example, declaring a variable to reduce the number of times dbo.fnGetSeconds(CALL_TIME) is written:

    SELECT [CALL_ID], [DATE],
    dbo.fnGetSeconds(CALL_TIME) AS [CALL TIME (secs)],
    dbo.fnGetSeconds(TALK_TIME) AS [TALK TIME (secs)],
    dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) AS [diff],

    CASE
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) <= 9 then '1'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 10 AND 59 then '2'
    end AS [ANALYSIS]

    FROM [dbo].[CallLogCommon]

    WHERE [DATE] >= '2021-03-01' AND [DATE] <= '2021-03-25'
  • My preferred approach is to use CROSS APPLY to assign an alias to the computation. That alias can then be used in any other clause of the query, including the WHERE, a JOIN, or the GROUP BY (a small example of that follows the query below).

    SELECT [CALL_ID], [DATE],
    CALL_TIME_SECS AS [CALL TIME (secs)],
    TALK_TIME_SECS AS [TALK TIME (secs)],
    CALL_TIME_SECS - TALK_TIME_SECS AS [diff],

    CASE
    WHEN CALL_TIME_SECS - TALK_TIME_SECS <= 9 then '1'
    WHEN CALL_TIME_SECS - TALK_TIME_SECS BETWEEN 10 AND 59 then '2'
    end AS [ANALYSIS]

    FROM [dbo].[CallLogCommon]
    CROSS APPLY (
    SELECT dbo.fnGetSeconds(CALL_TIME) AS CALL_TIME_SECS,
    dbo.fnGetSeconds(TALK_TIME) AS TALK_TIME_SECS
    ) AS alias1

    WHERE [DATE] >= '2021-03-01' AND [DATE] <= '2021-03-25'
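    To illustrate the point about other clauses, the aliases can be referenced in the WHERE as well (the extra 60-second filter below is purely a hypothetical example, not part of the original requirement):

    SELECT [CALL_ID], [DATE],
    CALL_TIME_SECS - TALK_TIME_SECS AS [diff]
    FROM [dbo].[CallLogCommon]
    CROSS APPLY (
    SELECT dbo.fnGetSeconds(CALL_TIME) AS CALL_TIME_SECS,
    dbo.fnGetSeconds(TALK_TIME) AS TALK_TIME_SECS
    ) AS alias1
    WHERE [DATE] >= '2021-03-01' AND [DATE] <= '2021-03-25'
    AND CALL_TIME_SECS - TALK_TIME_SECS < 60 -- hypothetical extra filter using the aliases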

    SQL DBA, SQL Server MVP (07, 08, 09)

  • Excellent tool! If I wanted to avoid hard-coding the dates, how could I declare and pass them in as variables?

  • DECLARE @date1 date
    DECLARE @date2 date

    SELECT ...
    ...
    CROSS APPLY (
    SELECT dbo.fnGetSeconds(@date1) AS date1_secs,
    dbo.fnGetSeconds(@date2) AS date2_secs
    ) AS alias1

    ...
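    Presumably the elided portions above are where the variables replace the hard-coded dates; a minimal sketch of how that would look (note that a view itself cannot take parameters, which is why a stored procedure comes up later in this thread):

    DECLARE @date1 date = '2021-03-01'
    DECLARE @date2 date = '2021-03-25'

    SELECT [CALL_ID], [DATE],
    CALL_TIME_SECS AS [CALL TIME (secs)],
    TALK_TIME_SECS AS [TALK TIME (secs)]
    FROM [dbo].[CallLogCommon]
    CROSS APPLY (
    SELECT dbo.fnGetSeconds(CALL_TIME) AS CALL_TIME_SECS,
    dbo.fnGetSeconds(TALK_TIME) AS TALK_TIME_SECS
    ) AS alias1
    WHERE [DATE] >= @date1 AND [DATE] <= @date2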

    SQL DBA, SQL Server MVP (07, 08, 09)

  • Same as Scott's, but you can use BETWEEN instead of the two ANDed comparisons, and you can perform the subtraction in another CROSS APPLY.

    Also, it is good practice to always include an ELSE in a CASE expression.

    DECLARE @date1 date = '2021-03-01'
    DECLARE @date2 date = '2021-03-25'

    SELECT [CALL_ID],
    [DATE],
    x.CALL_TIME_SECS AS [CALL TIME (secs)],
    x.TALK_TIME_SECS AS [TALK TIME (secs)],
    y.CALL_MINUS_TALK_SECS AS [diff],
    CASE WHEN y.CALL_MINUS_TALK_SECS <= 9 THEN '1'
    WHEN y.CALL_MINUS_TALK_SECS BETWEEN 10 AND 59 THEN '2'
    ELSE '?'
    END AS [ANALYSIS]
    FROM [dbo].[CallLogCommon] c
    CROSS APPLY (VALUES (dbo.fnGetSeconds(c.CALL_TIME), dbo.fnGetSeconds(c.TALK_TIME))) x(CALL_TIME_SECS, TALK_TIME_SECS )
    CROSS APPLY (VALUES (x.CALL_TIME_SECS - x.TALK_TIME_SECS)) y(CALL_MINUS_TALK_SECS )
    WHERE [DATE] BETWEEN @date1 AND @date2;
  • My apologies, and I need to back up here, as I over-simplified my base challenge and its code. I have a query that will be used as the data source for a report, hence the need for the two date variables (which, you will notice, are hard-coded for testing). The code works except that I can't yet pass in the dates, which will be supplied at the Microsoft Report Builder level:


    SELECT [CALL_ID], [DATE],
    dbo.fnGetSeconds(CALL_TIME) AS [CALL TIME (secs)],
    dbo.fnGetSeconds(TALK_TIME) AS [TALK TIME (secs)],
    dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) AS [diff],

    CASE
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) <= 9 then '1'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 10 AND 59 then '2'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 60 AND 179 then '3'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 180 AND 299 then '4'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 300 AND 599 then '5'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) >= 600 then '6' else 0 end AS [ANALYSIS],

    ------------------------ <9 seconds-------------------------
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) <= 9
    then COUNT(DISTINCT [DISPOSITION]) else '' end as [< 9 secs],

    -------------------------- < 1 min ------------------------------
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 10 AND 59
    then COUNT(DISTINCT [DISPOSITION]) else '' end as [< 1 Min],

    ---------------------------- 1 - 3 minutes -------------------------
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 60 AND 179
    then count(DISTINCT [DISPOSITION]) else '' end as [1-3 MINUTES],

    ------------------------ 3 - 5 minutes -------------------------------------
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 180 AND 299
    then count(DISTINCT [DISPOSITION]) else '' end as [3 - 5 MINUTES],

    ---------------------------- 5 - 10 minutes ------------------------------------------
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 300 AND 599
    then count(DISTINCT [DISPOSITION]) else '' end as [5 - 10 MINUTES],

    ------------------------------------- 10+ minutes ----------------------------------------
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) >= 600
    then count(DISTINCT [DISPOSITION]) else '' end as [10+ MINUTES],

    [AGENT], [DISPOSITION], com.[CAMPAIGN]
    FROM [dbo].[CallLogCommon] com with (NOLOCK)
    JOIN [dbo].[Campaigns] ud with (NOLOCK)
    ON com.[campaign] = ud.[campaign]

    WHERE (ud.[Client] like 'Better%') AND [DATE] >= '2021-03-01' AND [DATE] <= '2021-03-25'
    and ([disposition] like 'Live Transfer%' OR [DISPOSITION] LIKE 'Transfer Made%'
    OR [DISPOSITION] = 'Transferred to 3rd Party') AND [DOMAIN] = '05'

    group by [agent], [DISPOSITION], [CALL_ID], [date], [call_time], [TALK_TIME], com.[CAMPAIGN]
    order by [< 9 secs], [< 1 Min], [1-3 MINUTES], [3 - 5 MINUTES], [5 - 10 MINUTES], [10+ MINUTES] desc
  • Judging by the way it's being used, the dbo.fnGetSeconds function appears to be a scalar function. That's really not a good thing. If you would post the complete code for that function, we could show you a way to make your current problem virtually disappear and bring some remarkable performance into play.

    As a bit of a sidebar, I cut my teeth on SQL Server more than 20 years ago writing a world-wide long-distance call accounting system... done 100% in T-SQL, and I loved doing it. That was working with some really fun data. I could do on a desktop server (a 33 MHz 486 DX with SCSI disks at the time) in four hours, with just one person (me) in the "Information Technology Group", what our competition would take a month to do with 100 people in their IT group.

    With that, I'll say I'm a bit jealous of you because of the work you're doing. 😀

    --Jeff Moden

  • Here is the function:

    CREATE FUNCTION [dbo].[fnGetSeconds]
    (
    @TimeFormatted varchar(10)
    )
    RETURNS decimal(10, 2)
    AS
    BEGIN
    RETURN
    (SELECT (LEFT(@TimeFormatted,2)*3600) +
    ROUND(DATEDIFF(MS, 0, '00' + RIGHT(@TimeFormatted,LEN(@TimeFormatted)-2))/1000.0,0)
    AS 'TimeSeconds')
    END
  • You should convert your 'script' to a stored procedure with parameters. Then, in Report Builder, you use the stored procedure as the source for your dataset and build report parameters tied to the stored procedure's parameters.
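    For example, one way to wire that up (a sketch only; the procedure and parameter names match what Dave posts later in this thread) is to make the Report Builder dataset call the procedure and map each report parameter to the matching procedure parameter:

    EXEC dbo.CallLogPerformanceReport
        @BeginDate = @BeginDate,
        @EndDate   = @EndDate,
        @client    = @client;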

    Your function can easily be converted to an inline table-valued function. But it can probably be optimized even further if you show an example of the data that will be passed into the function.

    Jeffrey Williams

  • Also, don't sort the data in SQL Server, and depending on the report(s) you want, consider just returning detail data from the stored procedure without any grouping or ordering. You can then create separate reports: a detail report and a summary report (totals), and even a drill-down from summary to detail in a single report.

    Jeffrey Williams

  • The data going into the function is in a form like 17:03:26. Here is my stored procedure, but it probably needs some work:

    USE [a2wh]
    GO

    SET ANSI_NULLS ON
    GO

    SET QUOTED_IDENTIFIER ON
    GO

    CREATE PROCEDURE [dbo].[CallLogPerformanceReport]

    @BeginDate date,
    @EndDate date,
    @client nvarchar(100)

    AS
    BEGIN

    SET NOCOUNT ON;


    SELECT [CALL_ID], [DATE],
    dbo.fnGetSeconds(CALL_TIME) AS [CALL TIME (secs)],
    dbo.fnGetSeconds(TALK_TIME) AS [TALK TIME (secs)],
    dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) AS [diff],

    CASE
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) <= 9 then '1'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 10 AND 59 then '2'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 60 AND 179 then '3'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 180 AND 299 then '4'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 300 AND 599 then '5'
    WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) >= 600 then '6' else 0 end AS [ANALYSIS],


    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) <= 9
    then COUNT(DISTINCT [DISPOSITION]) else '' end as [< 9 secs],
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 10 AND 59
    then COUNT(DISTINCT [DISPOSITION]) else '' end as [< 1 Min],
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 60 AND 179
    then count(DISTINCT [DISPOSITION]) else '' end as [1-3 MINUTES],
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 180 AND 299
    then count(DISTINCT [DISPOSITION]) else '' end as [3 - 5 MINUTES],
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 300 AND 599
    then count(DISTINCT [DISPOSITION]) else '' end as [5 - 10 MINUTES],
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) >= 600
    then count(DISTINCT [DISPOSITION]) else '' end as [10+ MINUTES],

    [AGENT], [DISPOSITION], com.[CAMPAIGN]
    FROM a2wh..CallLogCommon com with (NOLOCK)
    JOIN a2wh..Campaigns ud with (NOLOCK)
    ON com.[campaign] = ud.[campaign]

    WHERE (ud.[Client] = @Client) AND com.DATE >= @BeginDate AND com.DATE <= @EndDate
    and ([disposition] like 'Live Transfer%' OR [DISPOSITION] LIKE 'Transfer Made%'
    OR [DISPOSITION] = 'Transferred to 3rd Party') AND [DOMAIN] = '05'

    group by [agent], [DISPOSITION], [CALL_ID], [date], [call_time], [TALK_TIME], com.[CAMPAIGN]
    order by [< 9 secs], [< 1 Min], [1-3 MINUTES], [3 - 5 MINUTES], [5 - 10 MINUTES], [10+ MINUTES] desc

    END

    GO

     

  • You could change your function to this:

    CREATE FUNCTION [dbo].[fnGetSeconds]
    (
    @TimeFormatted varchar(10)
    )
    RETURNS Table
    WITH schemabinding
    AS
    RETURN
    SELECT (LEFT(@TimeFormatted,2)*3600) +
    ROUND(DATEDIFF(MS, 0, '00' + RIGHT(@TimeFormatted,LEN(@TimeFormatted)-2))/1000.0,0)
    AS 'TimeSeconds';
    go

    There are other ways to write this - yours assumes no more than 2 digits for hours, for a max of 99 hours. If your formatted time will always be less than 24 hours, you can simplify the above to: cast(datediff(second, '00:00:00.000', @TimeFormatted) as decimal(10,2))
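    As a quick sanity check (a sketch using the 17:03:26 sample value mentioned above), both expressions return the same number of seconds:

    DECLARE @TimeFormatted varchar(10) = '17:03:26';

    SELECT original   = (LEFT(@TimeFormatted,2)*3600) +
                        ROUND(DATEDIFF(MS, 0, '00' + RIGHT(@TimeFormatted,LEN(@TimeFormatted)-2))/1000.0,0)
          ,simplified = CAST(DATEDIFF(SECOND, '00:00:00.000', @TimeFormatted) AS decimal(10,2));
    -- both return 61406 (17*3600 + 3*60 + 26) for this sample value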

    For your stored procedure:

    CREATE PROCEDURE [dbo].[CallLogPerformanceReport]

    @BeginDate date,
    @EndDate date,
    @client nvarchar(100)

    AS
    BEGIN

    SET NOCOUNT ON;


    SELECT [CALL_ID], [DATE],
    ct.TimeSeconds AS [CALL TIME (secs)],
    tt.TimeSeconds AS [TALK TIME (secs)],
    ct.TimeSeconds - tt.TimeSeconds AS [diff],

    CASE
    WHEN ct.TimeSeconds - tt.TimeSeconds <= 9 then '1'
    WHEN ct.TimeSeconds - tt.TimeSeconds BETWEEN 10 AND 59 then '2'
    WHEN ct.TimeSeconds - tt.TimeSeconds BETWEEN 60 AND 179 then '3'
    WHEN ct.TimeSeconds - tt.TimeSeconds BETWEEN 180 AND 299 then '4'
    WHEN ct.TimeSeconds - tt.TimeSeconds BETWEEN 300 AND 599 then '5'
    WHEN ct.TimeSeconds - tt.TimeSeconds >= 600 then '6' else 0 end AS [ANALYSIS],


    CASE WHEN ct.TimeSeconds - tt.TimeSeconds <= 9
    then COUNT(DISTINCT [DISPOSITION]) else '' end as [< 9 secs],
    CASE WHEN ct.TimeSeconds - tt.TimeSeconds BETWEEN 10 AND 59
    then COUNT(DISTINCT [DISPOSITION]) else '' end as [< 1 Min],
    CASE WHEN ct.TimeSeconds - tt.TimeSeconds BETWEEN 60 AND 179
    then count(DISTINCT [DISPOSITION]) else '' end as [1-3 MINUTES],
    CASE WHEN ct.TimeSeconds - tt.TimeSeconds BETWEEN 180 AND 299
    then count(DISTINCT [DISPOSITION]) else '' end as [3 - 5 MINUTES],
    CASE WHEN ct.TimeSeconds - tt.TimeSeconds BETWEEN 300 AND 599
    then count(DISTINCT [DISPOSITION]) else '' end as [5 - 10 MINUTES],
    CASE WHEN ct.TimeSeconds - tt.TimeSeconds >= 600
    then count(DISTINCT [DISPOSITION]) else '' end as [10+ MINUTES],

    [AGENT], [DISPOSITION], com.[CAMPAIGN]
    FROM a2wh..CallLogCommon com with (NOLOCK)
    JOIN a2wh..Campaigns ud with (NOLOCK)
    ON com.[campaign] = ud.[campaign]

    CROSS APPLY dbo.fnGetSeconds(com.CALL_TIME) As ct --assuming call_time comes from 'com' table
    CROSS APPLY dbo.fnGetSeconds(com.TALK_TIME) As tt --assuming talk_time comes from 'com' table

    WHERE (ud.[Client] = @Client) AND com.DATE >= @BeginDate AND com.DATE <= @EndDate
    and ([disposition] like 'Live Transfer%' OR [DISPOSITION] LIKE 'Transfer Made%'
    OR [DISPOSITION] = 'Transferred to 3rd Party') AND [DOMAIN] = '05'

    group by [agent], [DISPOSITION], [CALL_ID], [date], [call_time], [TALK_TIME], com.[CAMPAIGN]
    order by [< 9 secs], [< 1 Min], [1-3 MINUTES], [3 - 5 MINUTES], [5 - 10 MINUTES], [10+ MINUTES] desc

    END

    GO

    You should always reference the schema; the .. syntax isn't recommended. You should also use the table alias on every column, so you know which table each column comes from and the query is easier to maintain.

    Since this is going to be the source for a report, I would recommend not using column names with spaces or special characters. You can easily modify column headers in the report definition, so instead of [CALL TIME (secs)] use call_time_secs and change it as needed in the report.
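    Roughly, the same FROM/SELECT fragment with those conventions applied would look like this (a sketch only, not the full procedure):

    SELECT com.CALL_ID,
    com.[DATE],
    ct.TimeSeconds AS call_time_secs, -- instead of [CALL TIME (secs)]
    tt.TimeSeconds AS talk_time_secs  -- instead of [TALK TIME (secs)]
    FROM a2wh.dbo.CallLogCommon AS com -- schema-qualified, no ".." shorthand
    JOIN a2wh.dbo.Campaigns AS ud
    ON com.[campaign] = ud.[campaign]
    CROSS APPLY dbo.fnGetSeconds(com.CALL_TIME) AS ct
    CROSS APPLY dbo.fnGetSeconds(com.TALK_TIME) AS tt;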

    Jeffrey Williams

  • Dave,

    I'm a bit confused by the functionality you express in your code.  You're subtracting Talk_Time from the Call_Time.  Perhaps things have changed a bit but, IIRC, Talk_Time is the duration of the call and Call_Time is the time of day when the call was placed.

    With that, the subtraction of Talk_Time from Call_Time is making no sense to me.  Can you clarify my misunderstanding so I can take a shot at this from a slightly different angle?  What are the definitions of Talk_Time and Call_Time?

    EDIT: Ah... from the looks of it, Call_Time might be the full duration of the call, including things like time on hold, whereas Talk_Time is the actual amount of time the two parties were talking to each other. So you're using Call_Time - Talk_Time to calculate the "time on hold" (kind of)?


    --Jeff Moden

  • Ok... the premise of this thread is that we want to use "Better Practices". 😀 We not only want good code, we want it to be simple, and a "Better Practice" is to document the code as you write it. Another "Better Practice" is to worry about performance no matter how small you think a table will remain. If it doesn't work quickly on at least a million rows, you probably haven't done the job right.

    Step 1 is as I originally suggested, and as others have already written code for: rewrite the scalar function. Now, if you're going to use a prefix on the name of the function, we might as well do a good job of it. There are three basically different kinds of user-defined functions, so "fn" doesn't actually tell you much. Instead, let's use "itvf" as the prefix, because it IS an "inline Table Valued Function".

    Also, let's think outside the box a bit. Parsing something in the hh:mi:ss format isn't just a pain in the butt; it also usually slows things down, even though it may still run like greased lightning. If we compare what is properly formatted as a time to time 0 (1900-01-01 00:00:00.000, which is "day 0" in SQL Server), it's super easy to do a DATEDIFF() to get the seconds with no parsing. We also end up with something that's "twice as fast as greased lightning". 😀 And, yeah... milliseconds always matter, and it usually takes less code to get them. 😉

    So, here's the iTVF I wrote to replace your scalar function. The code is so small that it's almost a shame to write a function for it, but it does help others use the functionality in a consistently high-performance and reliable manner.

    GO
    DROP FUNCTION IF EXISTS dbo.itvfGetSeconds;
    GO
    CREATE FUNCTION dbo.itvfGetSeconds
    /*******************************************************************************
    Purpose:
    Given a 24 hour duration in the format of hh:mi:ss, return the number of whole
    seconds in that duration.
    --------------------------------------------------------------------------------
    Programmer Notes:
    1. This function is a high performance iTVF (inline Table Valued Function) and
    must be used in either a FROM clause or an APPLY operator in a FROM clause.
    2. If the duration or the format of the duration is not valid, the function
    will return a NULL.
    3. Minimum input duration allowed = 00:00:00
    Maximum input duration allowed = 23:59:59
    4. This function is generally twice as fast as those that parse the input.
    --------------------------------------------------------------------------------
    Example Recommended Usage:
    --===== Do the conversion for a whole column in a given table.
    SELECT sec.DurSec,<<st.ColumnList>>
    FROM <<SomeQualifiedTableName>> st
    CROSS APPLY dbo.itvfGetSeconds(<<SomeDurationColumn>>) sec
    ;
    --------------------------------------------------------------------------------
    Revision History:
    Rev 00 - 29 Mar 2021 - Jeff Moden
    - Initial creation and unit test.
    - Ref: https://www.sqlservercentral.com/forums/topic/better-practices-for-writing-this-query
    *******************************************************************************/
    --===== Function Parameters
    (@pFmtDur CHAR(8)) --Must be in the hh:mi:ss format.
    RETURNS TABLE WITH SCHEMABINDING AS
    RETURN SELECT DurSec = DATEDIFF(ss,0,TRY_CONVERT(DATETIME,@pFmtDur))
    ;
    GO

    Up next... it's difficult to guess what a good recommendation would be without some test data... LOTS of test data.  This isn't yet complete but it'll help you see what I'm thinking.  The following creates 10 million rows of test data and adds indexes, etc, in about 30 seconds on my machine.  Of course, the time will vary from machine to machine.

    --=====================================================================================================================
    -- Create the test table, the Clustered Index, and the NONclustered PK.
    -- THIS IS NOT A PART OF THE SOLUTION! We're just creating test data to work with and test against.
    --=====================================================================================================================
    DROP TABLE IF EXISTS #TestTable;
    GO
    WITH cteBaseData AS
    (--==== The [Date] is random from 2020-01-01 'till now.
    SELECT TOP 10000000 --Yeah... might as well go for the gusto. This whole thing takes about 30 seconds on my box
    [Date] = CONVERT(DATE,DATEADD(dd,ABS(CHECKSUM(NEWID())%DATEDIFF(dd,'2020',GETDATE())),'2020'))
    ,Talk_Time = CONVERT(CHAR(8),DATEADD(ss,ABS(CHECKSUM(NEWID())%(86400-800)),0),108)
    FROM sys.all_columns ac1
    CROSS JOIN sys.all_columns ac2
    )
    SELECT Call_ID = IDENTITY(INT,1,1)
    ,[Date] = ISNULL([Date],'00010101') --ISNULL makes the resulting column NOT NULL
    ,Agent = ABS(CHECKSUM(NEWID())%10)+1 --Agent numbers are from 1 to 10
    ,Client = CONVERT(NVARCHAR(100),CONCAT('Client#',ABS(CHECKSUM(NEWID())%10)+1))
    ,Talk_Time
    ,Call_Time = CONVERT(CHAR(8),DATEADD(ss,ABS(CHECKSUM(NEWID())%800),Talk_Time),108)
    INTO #TestTable
    FROM cteBaseData
    ORDER BY [Date]
    ;
    --===== Since most queries will be done by [Date], we'll put the Clustered Index on the [Date] column
    -- and make it UNIQUE by adding the Call_ID (SQL Server LOVES unique clustered indexes)
    ALTER TABLE #TestTable ADD CONSTRAINT CI_By_Date_Call_ID UNIQUE CLUSTERED ([Date],[Call_ID])
    ;
    --===== We still need for the Call_ID column to be UNIQUE and it IS the PK for this table.
    -- Note that the PK does NOT have to be the Clustered Index.
    ALTER TABLE #TestTable ADD CONSTRAINT PK_#TestTable PRIMARY KEY NONCLUSTERED (Call_ID)
    ;
    --===== Let's see a small sample of what we built.
    SELECT TOP 100 * FROM #TestTable ORDER BY Call_ID
    ;

    Now... in order for us to help you make your code "Better", we need to know your data.  For example, I'm just guessing that CALL_ID is unique in this particular table.  You need to post the CREATE TABLE statement and all related indexes/key constraints for both of the tables in your posted code so far.

    You also say that you've successfully generated a report... can you post a graphic of what a page of that report actually looks like, please?

    In the meantime, here's a small demonstration of how to DRY out your code to make it simpler. Obviously, we have a way to go to get to the final product, but it's a small demo of where we're headed. Note how I've eliminated the repetitive CALL_TIME - TALK_TIME calculations all over the place. I've also greatly simplified the construction of the "Analysis" column simply by reversing the logic in the CASE expression, which always processes from the top down. (And, yeah... I hate the fact that this forum almost double-spaces every bit of code... it makes the code look bloody HUGE.)

    --===== These are the parameters that we'd use in a stored procedure (without the preassignments).
    DECLARE @BeginDate DATE = '20200601'
    ,@EndDate DATE = '20200630'
    ,@client NVARCHAR(100) = 'Client#5' --Just an example for testing.
    ;
    WITH cteDry AS
    (--==== Do all the previously repetitive calculations that produced the "diff" in a CTE so we
    -- "Don't Repeat Yourself". This KISSes the code (Keep It Super Simple).
    -- This also seriously reduces the number of rows we have to worry about early.
    SELECT Call_ID
    ,[Date]
    ,Agent
    ,Client
    ,CallTimeSec = ctsec.DurSec
    ,TalkTimeSec = ttsec.DurSec
    ,DiffSec = ctsec.DurSec-ttsec.DurSec
    FROM #TestTable
    CROSS APPLY dbo.itvfGetSeconds(Call_Time) ctsec
    CROSS APPLY dbo.itvfGetSeconds(Talk_Time) ttsec
    WHERE [Date] >= @BeginDate AND [Date] <= @EndDate
    AND Client = @Client
    )
    SELECT *
    ,Analysis = CASE --CASE conditions execute in the order given by the WHEN's
    --so doing this in reverse-order greatly simplifies the code.
    WHEN DiffSec >= 600 THEN 6
    WHEN DiffSec >= 300 THEN 5
    WHEN DiffSec >= 180 THEN 4
    WHEN DiffSec >= 60 THEN 3
    WHEN DiffSec >= 10 THEN 2
    WHEN DiffSec >= 1 THEN 1
    ELSE 0 --This will probably never happen
    END
    FROM cteDry
    ;

    Something else we need to know... what does the following section of code do for you other than pivot the data? Why are you counting only distinct dispositions here? The reason I ask is that there's an opportunity for "Pre-Aggregation" prior to the pivot (see the rough sketch after the quoted code below). It makes the code a bit longer but nearly twice as fast. Of course, DRYing out the code like we have will also make it a whole lot simpler to understand and maintain.

    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) <= 9 
    then COUNT(DISTINCT [DISPOSITION]) else '' end as [< 9 secs],
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 10 AND 59
    then COUNT(DISTINCT [DISPOSITION]) else '' end as [< 1 Min],
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 60 AND 179
    then count(DISTINCT [DISPOSITION]) else '' end as [1-3 MINUTES],
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 180 AND 299
    then count(DISTINCT [DISPOSITION]) else '' end as [3 - 5 MINUTES],
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) BETWEEN 300 AND 599
    then count(DISTINCT [DISPOSITION]) else '' end as [5 - 10 MINUTES],
    CASE WHEN dbo.fnGetSeconds(CALL_TIME) - dbo.fnGetSeconds(TALK_TIME) >= 600
    then count(DISTINCT [DISPOSITION]) else '' end as [10+ MINUTES],
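    A rough sketch of the pre-aggregation idea (my reading of it, not necessarily the final shape Jeff has in mind): count the distinct dispositions once per agent and bucket first, then pivot that much smaller set with conditional aggregation. It assumes the itvfGetSeconds function above and, purely for illustration, groups only by AGENT:

    DECLARE @BeginDate DATE = '2021-03-01'
           ,@EndDate   DATE = '2021-03-25'
    ;
    WITH cteDry AS
    (--==== Same DRY pattern as above, but keeping DISPOSITION for the distinct counts.
     SELECT com.AGENT
           ,com.DISPOSITION
           ,Analysis = CASE WHEN ctsec.DurSec - ttsec.DurSec >= 600 THEN 6
                            WHEN ctsec.DurSec - ttsec.DurSec >= 300 THEN 5
                            WHEN ctsec.DurSec - ttsec.DurSec >= 180 THEN 4
                            WHEN ctsec.DurSec - ttsec.DurSec >=  60 THEN 3
                            WHEN ctsec.DurSec - ttsec.DurSec >=  10 THEN 2
                            ELSE 1
                        END
       FROM a2wh.dbo.CallLogCommon com
      CROSS APPLY dbo.itvfGetSeconds(com.CALL_TIME) ctsec
      CROSS APPLY dbo.itvfGetSeconds(com.TALK_TIME) ttsec
      WHERE com.[DATE] >= @BeginDate AND com.[DATE] <= @EndDate
    )
    ,ctePreAgg AS
    (--==== "Pre-Aggregation": one distinct count per agent and bucket BEFORE the pivot.
     SELECT AGENT, Analysis, Dispositions = COUNT(DISTINCT DISPOSITION)
       FROM cteDry
      GROUP BY AGENT, Analysis
    )
    SELECT AGENT --==== Conditional aggregation pivots the small, pre-aggregated set.
          ,[< 9 secs]       = SUM(CASE WHEN Analysis = 1 THEN Dispositions ELSE 0 END)
          ,[< 1 Min]        = SUM(CASE WHEN Analysis = 2 THEN Dispositions ELSE 0 END)
          ,[1-3 MINUTES]    = SUM(CASE WHEN Analysis = 3 THEN Dispositions ELSE 0 END)
          ,[3 - 5 MINUTES]  = SUM(CASE WHEN Analysis = 4 THEN Dispositions ELSE 0 END)
          ,[5 - 10 MINUTES] = SUM(CASE WHEN Analysis = 5 THEN Dispositions ELSE 0 END)
          ,[10+ MINUTES]    = SUM(CASE WHEN Analysis = 6 THEN Dispositions ELSE 0 END)
      FROM ctePreAgg
     GROUP BY AGENT
    ;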

    Sounds like a lot of questions but it takes some questions to get to "Better Practices".  Be patient... it'll be worth it. 😀

    --Jeff Moden

  • How do I integrate your

    CROSS APPLY dbo.itvfGetSeconds(Call_Time) ctsec
    CROSS APPLY dbo.itvfGetSeconds(Talk_Time) ttsec

    code with the needed code below?

    FROM a2wh.dbo.CallLogCommon com with (NOLOCK)
    JOIN a2wh.dbo.Campaigns ud with (NOLOCK)
    ON com.[campaign] = ud.[campaign]
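    For what it's worth, one way to put those pieces together (a sketch based on the pattern Jeffrey showed earlier, not a confirmed answer from this thread): the two APPLY operators simply follow the JOIN and reference the com columns, and ctsec.DurSec / ttsec.DurSec then replace the scalar-function calls in the SELECT list and CASE expressions:

    FROM a2wh.dbo.CallLogCommon com with (NOLOCK)
    JOIN a2wh.dbo.Campaigns ud with (NOLOCK)
    ON com.[campaign] = ud.[campaign]
    CROSS APPLY dbo.itvfGetSeconds(com.CALL_TIME) ctsec
    CROSS APPLY dbo.itvfGetSeconds(com.TALK_TIME) ttsec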
