SQLServerCentral is supported by Red Gate Software Ltd.

Query times inconsistent
Posted Monday, November 12, 2007 3:43 PM
Forum Newbie

Group: General Forum Members
Last Login: Thursday, November 15, 2007 6:46 AM
Points: 3, Visits: 7
Hi - I'm having some performance issues with an application I've recently inherited. I was able to improve the query by replacing some subselects with joins, looking at indexes, etc. The procedure now executes in 6 seconds or so with a smaller date range - say 15 days. However, the users still can't run the report from the web app, as it always times out. With a six-second query, this timeout should not occur.

Also - I'll execute the query over and over in Query Analyzer and get a response of six seconds or so, then randomly get 40 seconds or more again, which makes it seem like it's recompiling. I can't figure out why it would randomly recompile (if that is what it's doing). I've read that cached plans can get pushed out, but this happens really quickly. How do I verify that the procedure is cached, and then check whether it's being pushed out by other processing?
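For reference, one way to verify whether the plan is cached is to query the plan cache directly. This is a sketch that assumes SQL Server 2005's DMVs; on SQL Server 2000 the rough equivalent is querying master..syscacheobjects:

```sql
-- Sketch: look for the proc's plan in the cache (SQL Server 2005+).
-- usecounts climbing between runs means the plan is being reused;
-- the row disappearing means the plan was flushed or invalidated.
SELECT cp.usecounts,
       cp.cacheobjtype,    -- e.g. 'Compiled Plan'
       cp.objtype,         -- 'Proc' for stored procedures
       st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%mmds_rpt_test_results_detail%'
```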

This report is accessing a history table that is constantly being written to by the application, but I've written the query with NOLOCK, so I was hoping that the writers would not block this reader and vice versa. I'm attaching the proc body. It's long because it has two optional parameters that I have to check for; the queries are all the same except for the date filters (last lines).

Any input would be fantastic!!

CREATE PROCEDURE dbo.mmds_rpt_test_results_detail
@StartDate varchar(10),
@EndDate varchar(10),
@CustomerID int,
@ReturnError varchar(200) OUTPUT

AS
DECLARE @ErrorNum int

BEGIN

SET NOCOUNT ON

declare @startDt as datetime
declare @endDt as datetime

if @StartDate <> ''
begin
set @startDt = convert(datetime,@StartDate)

if @EndDate <> ''
begin
set @endDt = convert(datetime,@EndDate)

SELECT T_HISTORY.USER_NM,
U.LNAME_NM + ', ' + U.FNAME_NM AS DISPLAY_NM,
EC.EVENT_DESC AS ACTION,
P.LNAME + ', ' + P.FNAME AS PATIENT,
B.INDEXED_ITEM_ID AS PATIENT_ID,
Q.DATA_DESC AS TEST_RESULT,
T_HISTORY.ORIG_DT
FROM dbo.T_HISTORY WITH(NOLOCK)
JOIN dbo.T_INDEX_DOCUMENTS B WITH(NOLOCK) ON B.WORK_ITEM_ID = T_HISTORY.ITEM_ID
AND B.INDEX_TYPE_ID = 1
JOIN dbo.MMDS_CHECKLIST_DATA C WITH(NOLOCK) ON C.WORK_ITEM_ID = T_HISTORY.ITEM_ID
JOIN DBO.MMDS_PATIENT P WITH(NOLOCK) ON P.PATIENT_ID = B.INDEXED_ITEM_ID
JOIN DBO.T_LOOKUP Q ON Q.DATA_CD = C.TEST_RESULTS
AND Q.CUSTOMER_ID = @CustomerID
AND Q.LOOKUP_TABLE_CD = 'TEST_RESULTS'
AND Q.SITE_ID = 0
JOIN dbo.T_USER U ON U.USER_NM = T_HISTORY.USER_NM
AND U.CUSTOMER_ID = @CustomerID
JOIN dbo.T_HISTORY_EVENT_CODES EC ON EC.EVENT_CD = T_HISTORY.EVENT_CD
WHERE T_HISTORY.CUSTOMER_ID=@CustomerID
AND T_HISTORY.STATE_CD = 'RESULTS'
AND T_HISTORY.USER_NM <> 'RULE'
AND T_HISTORY.EVENT_CD IN ('WI_CPLT','WI_RET_TO_QUE')
AND T_HISTORY.ORIG_DT >= @startDt
AND T_HISTORY.ORIG_DT <= @endDt + 1
end
else
begin
SELECT T_HISTORY.USER_NM,
U.LNAME_NM + ', ' + U.FNAME_NM AS DISPLAY_NM,
EC.EVENT_DESC AS ACTION,
P.LNAME + ', ' + P.FNAME AS PATIENT,
B.INDEXED_ITEM_ID AS PATIENT_ID,
Q.DATA_DESC AS TEST_RESULT,
T_HISTORY.ORIG_DT
FROM dbo.T_HISTORY WITH(NOLOCK)
JOIN dbo.T_INDEX_DOCUMENTS B WITH(NOLOCK) ON B.WORK_ITEM_ID = T_HISTORY.ITEM_ID
AND B.INDEX_TYPE_ID = 1
JOIN dbo.MMDS_CHECKLIST_DATA C WITH(NOLOCK) ON C.WORK_ITEM_ID = T_HISTORY.ITEM_ID
JOIN DBO.MMDS_PATIENT P WITH(NOLOCK) ON P.PATIENT_ID = B.INDEXED_ITEM_ID
JOIN DBO.T_LOOKUP Q ON Q.DATA_CD = C.TEST_RESULTS
AND Q.CUSTOMER_ID = @CustomerID
AND Q.LOOKUP_TABLE_CD = 'TEST_RESULTS'
AND Q.SITE_ID = 0
JOIN dbo.T_USER U ON U.USER_NM = T_HISTORY.USER_NM
AND U.CUSTOMER_ID = @CustomerID
JOIN dbo.T_HISTORY_EVENT_CODES EC ON EC.EVENT_CD = T_HISTORY.EVENT_CD
WHERE T_HISTORY.CUSTOMER_ID=@CustomerID
AND T_HISTORY.STATE_CD = 'RESULTS'
AND T_HISTORY.USER_NM <> 'RULE'
AND T_HISTORY.EVENT_CD IN ('WI_CPLT','WI_RET_TO_QUE')
AND T_HISTORY.ORIG_DT >= @startDt
end

end
else
begin
if @EndDate <> ''
begin
set @endDt = convert(datetime,@EndDate)

SELECT T_HISTORY.USER_NM,
U.LNAME_NM + ', ' + U.FNAME_NM AS DISPLAY_NM,
EC.EVENT_DESC AS ACTION,
P.LNAME + ', ' + P.FNAME AS PATIENT,
B.INDEXED_ITEM_ID AS PATIENT_ID,
Q.DATA_DESC AS TEST_RESULT,
T_HISTORY.ORIG_DT
FROM dbo.T_HISTORY WITH(NOLOCK)
JOIN dbo.T_INDEX_DOCUMENTS B WITH(NOLOCK) ON B.WORK_ITEM_ID = T_HISTORY.ITEM_ID
AND B.INDEX_TYPE_ID = 1
JOIN dbo.MMDS_CHECKLIST_DATA C WITH(NOLOCK) ON C.WORK_ITEM_ID = T_HISTORY.ITEM_ID
JOIN DBO.MMDS_PATIENT P WITH(NOLOCK) ON P.PATIENT_ID = B.INDEXED_ITEM_ID
JOIN DBO.T_LOOKUP Q ON Q.DATA_CD = C.TEST_RESULTS
AND Q.CUSTOMER_ID = @CustomerID
AND Q.LOOKUP_TABLE_CD = 'TEST_RESULTS'
AND Q.SITE_ID = 0
JOIN dbo.T_USER U ON U.USER_NM = T_HISTORY.USER_NM
AND U.CUSTOMER_ID = @CustomerID
JOIN dbo.T_HISTORY_EVENT_CODES EC ON EC.EVENT_CD = T_HISTORY.EVENT_CD
WHERE T_HISTORY.CUSTOMER_ID=@CustomerID
AND T_HISTORY.STATE_CD = 'RESULTS'
AND T_HISTORY.USER_NM <> 'RULE'
AND T_HISTORY.EVENT_CD IN ('WI_CPLT','WI_RET_TO_QUE')
AND T_HISTORY.ORIG_DT <= @endDt + 1
end
else
begin
SELECT T_HISTORY.USER_NM,
U.LNAME_NM + ', ' + U.FNAME_NM AS DISPLAY_NM,
EC.EVENT_DESC AS ACTION,
P.LNAME + ', ' + P.FNAME AS PATIENT,
B.INDEXED_ITEM_ID AS PATIENT_ID,
Q.DATA_DESC AS TEST_RESULT,
T_HISTORY.ORIG_DT
FROM dbo.T_HISTORY WITH(NOLOCK)
JOIN dbo.T_INDEX_DOCUMENTS B WITH(NOLOCK) ON B.WORK_ITEM_ID = T_HISTORY.ITEM_ID
AND B.INDEX_TYPE_ID = 1
JOIN dbo.MMDS_CHECKLIST_DATA C WITH(NOLOCK) ON C.WORK_ITEM_ID = T_HISTORY.ITEM_ID
JOIN DBO.MMDS_PATIENT P WITH(NOLOCK) ON P.PATIENT_ID = B.INDEXED_ITEM_ID
JOIN DBO.T_LOOKUP Q ON Q.DATA_CD = C.TEST_RESULTS
AND Q.CUSTOMER_ID = @CustomerID
AND Q.LOOKUP_TABLE_CD = 'TEST_RESULTS'
AND Q.SITE_ID = 0
JOIN dbo.T_USER U ON U.USER_NM = T_HISTORY.USER_NM
AND U.CUSTOMER_ID = @CustomerID
JOIN dbo.T_HISTORY_EVENT_CODES EC ON EC.EVENT_CD = T_HISTORY.EVENT_CD
WHERE T_HISTORY.CUSTOMER_ID=@CustomerID
AND T_HISTORY.STATE_CD = 'RESULTS'
AND T_HISTORY.USER_NM <> 'RULE'
AND T_HISTORY.EVENT_CD IN ('WI_CPLT','WI_RET_TO_QUE')

end

end


SELECT @ErrorNum = @@ERROR

IF @ErrorNum <> 0
BEGIN
RAISERROR('Error in mmds_rpt_test_results_detail', 16,@ErrorNum)
RETURN
END

RETURN 0
END
Post #421248
Posted Monday, November 12, 2007 6:14 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 12:09 AM
Points: 35,348, Visits: 31,886
This report is accessing a history table that is constantly being written to by the application,


That would be part of the inconsistency... procs recompile when certain amounts of data change... and it doesn't take much.

Dunno about your timeout problem...


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #421281
Posted Monday, November 12, 2007 6:46 PM
Forum Newbie


Group: General Forum Members
Last Login: Thursday, November 15, 2007 6:46 AM
Points: 3, Visits: 7
It doesn't take much, huh? I guess that's why it's recompiling... the recompile seems to take quite a bit of time too. In the end I'm not sure what I can do to get around this, other than setting the timeout of the DB call pretty high. Is there any way to set something so it recompiles less often?

I'll have to look around a bit more for details on how much data change triggers a recompile.

Any other input/thoughts/suggestions would be great!
Post #421290
Posted Monday, November 12, 2007 11:31 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 12:09 AM
Points: 35,348, Visits: 31,886
Of course it's going to take some time... your sproc has four relatively large SELECTs, and they ALL must be recompiled even if only one is going to be used...

The other thing you may be running into is a thing that happens with variables (I'll be damned if I can remember what they call it)... basically, you need to assign each input parameter to an explicitly named local variable... it can greatly improve the speed if the problem is occurring (what the heck do they call that problem? Variable something or other...)
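The workaround being described here looks roughly like this - a sketch with an illustrative proc name and column list, not the actual report proc:

```sql
-- Sketch: copy each input parameter into a local variable and filter on
-- the locals, so the optimizer estimates from average statistics rather
-- than the sniffed parameter values. Names below are illustrative.
CREATE PROCEDURE dbo.usp_history_by_customer
    @CustomerID int
AS
BEGIN
    DECLARE @LocalCustomerID int
    SET @LocalCustomerID = @CustomerID

    SELECT USER_NM, ORIG_DT
    FROM dbo.T_HISTORY
    WHERE CUSTOMER_ID = @LocalCustomerID
END
```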

Believe it or not, another thing you can try is adding the WITH RECOMPILE option to a stored procedure... sometimes it helps even on something like this.


--Jeff Moden
Post #421336
Posted Monday, November 12, 2007 11:36 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 12:09 AM
Points: 35,348, Visits: 31,886
Now I remember... it's called "Parameter Sniffing" and, apparently, it can crush otherwise efficient code. Lots of good info on the Web about it... Google it...

--Jeff Moden
Post #421338
Posted Tuesday, November 13, 2007 1:28 AM


SSC-Forever


Group: General Forum Members
Last Login: Today @ 7:26 AM
Points: 40,174, Visits: 36,572
The other problem with that is that when the proc compiles, all of the selects compile based on the parameter values for that call - even selects that can't be reached with those parameters. That behaviour can get you some really erratic performance: sometimes the query with one set of params runs fine and with a second runs slow; sometimes the second set of params is the fast one and the first the slow.

I normally recommend that a proc of that form gets split up, so that you call other procs based on the values of the params.
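Split up, the structure might look something like this - a sketch with made-up sub-proc names, where each sub-proc contains exactly one of the four SELECTs and therefore gets its own plan:

```sql
-- Sketch: a thin wrapper that branches to one single-SELECT proc per case.
-- Sub-proc names are illustrative, not from the original code.
CREATE PROCEDURE dbo.mmds_rpt_detail_wrapper
    @StartDate varchar(10),
    @EndDate varchar(10),
    @CustomerID int
AS
BEGIN
    IF @StartDate <> '' AND @EndDate <> ''
        EXEC dbo.mmds_rpt_detail_both_dates @StartDate, @EndDate, @CustomerID
    ELSE IF @StartDate <> ''
        EXEC dbo.mmds_rpt_detail_from_start @StartDate, @CustomerID
    ELSE IF @EndDate <> ''
        EXEC dbo.mmds_rpt_detail_to_end @EndDate, @CustomerID
    ELSE
        EXEC dbo.mmds_rpt_detail_all_dates @CustomerID
END
```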



Gail Shaw
Microsoft Certified Master: SQL Server 2008, MVP
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

We walk in the dark places no others will enter
We stand on the bridge and no one may pass

Post #421388
Posted Tuesday, November 13, 2007 7:45 AM
Forum Newbie


Group: General Forum Members
Last Login: Thursday, November 15, 2007 6:46 AM
Points: 3, Visits: 7
We are trying to avoid opening the code, but it looks like we may have to.

I will check into the parameter sniffing - previously the query was dynamic SQL, and I could flip it back to that quickly to see how it affects the compile time.

Thank you for all the input - if I find any great improvements I'll reply back.

:)
donna
Post #421597
Posted Tuesday, November 13, 2007 8:44 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 6:39 AM
Points: 7,105, Visits: 15,434
Or, you might care to FORCE the recompile each time it's run, so that it will optimize for the branch that will actually run...

That's done by adding WITH RECOMPILE to the SP definition, as I recall.
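For the proc in this thread, that would look like the following sketch - the option goes between the parameter list and AS:

```sql
-- Sketch: WITH RECOMPILE forces a fresh plan on every execution,
-- trading compile cost for a plan suited to the branch actually taken.
CREATE PROCEDURE dbo.mmds_rpt_test_results_detail
    @StartDate varchar(10),
    @EndDate varchar(10),
    @CustomerID int,
    @ReturnError varchar(200) OUTPUT
WITH RECOMPILE
AS
BEGIN
    SET NOCOUNT ON
    -- ...body unchanged...
END
```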


----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
Post #421635
Posted Tuesday, November 13, 2007 9:20 AM
Ten Centuries


Group: General Forum Members
Last Login: Saturday, March 8, 2014 4:25 PM
Points: 1,415, Visits: 1,228
Looks to me like all 4 SELECTs are exactly the same, with just a different filter on T_HISTORY.ORIG_DT based on the input parameters.

Do you really need IF..ELSE blocks with 4 separate branches of execution?
Couldn't you just set the local date variables to very low/high values when they are passed as blank, and have 1 SELECT?

eg:
-- Convert a blank start date to a date earlier than the earliest possible
-- date in your data
Select @startDt = Cast(Case When @StartDate = '' Then '01 Jan 1900' Else @StartDate End As DateTime)

Why try to preempt the optimizer with IF statements? Give it both dates in 1 SELECT and let it optimize based on data distribution and indexes.
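Extended to both parameters, that approach might look like this sketch (the high-end sentinel is kept a day below the datetime maximum so the proc's existing "+ 1" adjustment doesn't overflow):

```sql
-- Sketch: derive both bounds up front, then run one shared SELECT.
DECLARE @startDt datetime, @endDt datetime

SET @startDt = CASE WHEN @StartDate = '' THEN '19000101'
                    ELSE CONVERT(datetime, @StartDate) END
SET @endDt   = CASE WHEN @EndDate = '' THEN '99991230'
                    ELSE CONVERT(datetime, @EndDate) END

-- ...the one shared SELECT, with the common joins, ending in:
-- WHERE T_HISTORY.ORIG_DT >= @startDt
--   AND T_HISTORY.ORIG_DT <= @endDt + 1
```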

Post #421659
Posted Wednesday, November 14, 2007 3:00 PM
SSCommitted


Group: General Forum Members
Last Login: Today @ 7:27 AM
Points: 1,738, Visits: 3,400
You can wrap the following around the SQL, and it will show you the parse and compile time for each execution. It will also show you the IO - whether logical (in cache) or physical (from disk). If you have cache pressure, it could also be that the data is getting flushed. I think this will at least help point you to where the slowdown is happening - sp recompile, or data flushed from cache.


set statistics io on
set statistics time on
set statistics profile on
go

--QUERY GOES HERE

go
set statistics io off
set statistics time off
set statistics profile off
go


jg
Post #422340