
Parameters vs. No Parameters
Posted Thursday, April 11, 2013 9:53 AM


SSC-Forever
Yes, absolutely.

I suspect you have stale stats. With the version that uses getdate, the optimiser can sniff the values and use them to get an estimate from the row distribution in the statistics, whereas with variables it can't see the values and hence can only produce a generic row estimate. If your stats are off (very common on a table with an ascending datetime column), the version with the function could get a completely incorrect (way low) estimate and generate an exec plan that's optimal for very few rows. When the query runs and gets lots of rows, the plan is very, very, very bad indeed.

I've seen a query go from 30 minutes to not finishing after 4 hours because of that problem. The exec plan, if you look only at the estimated plan, looks fine: seeks, key/RID lookups, nested loop joins and low row estimates (like the one row estimated in the plan you attached). It's only with the actual plan that you can see the extreme mis-estimate on the row count.

http://sqlinthewild.co.za/index.php/2011/03/22/statistics-row-estimations-and-the-ascending-date-column/
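
To illustrate the difference, a minimal sketch only - the table dbo.Transactions and column TransactionDate are made-up names, not taken from this thread:

-- Version A: local variables (hypothetical names). The optimiser can't see the
-- values at compile time, so it falls back on a generic, density-based estimate.
DECLARE @StartDate datetime = DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()) - 3, 0),
        @EndDate   datetime = GETDATE();

SELECT COUNT(*)
FROM dbo.Transactions
WHERE TransactionDate >= @StartDate
  AND TransactionDate <  @EndDate;

-- Version B: the expressions inline. The optimiser can sniff these values and read
-- the histogram in the column's statistics, so the estimate is only as good as the
-- stats are - stale stats on an ascending datetime column can push the estimate
-- down to almost no rows.
SELECT COUNT(*)
FROM dbo.Transactions
WHERE TransactionDate >= DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()) - 3, 0)
  AND TransactionDate <  GETDATE();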



Gail Shaw
Microsoft Certified Master: SQL Server 2008, MVP
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

We walk in the dark places no others will enter
We stand on the bridge and no one may pass

Post #1441352
Posted Friday, April 12, 2013 2:38 AM
Valued Member

Very interesting indeed.
I checked my stats; they were almost 48 hours out of date when I was running the query yesterday (and they still haven't been updated). About 40,000 rows are added to this table daily, and it has over 13 million rows in total.
The thing is, the filter on my query looks at the last 3 months, starting from the 1st of the month, so it's not concerned with the last 11 days.
I'm still confused...
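
For reference, one way to check how old the statistics on a table are (the table name here is just a placeholder, not from this thread):

-- Last update date for every statistics object on a (hypothetical) table
SELECT s.name AS stats_name,
       STATS_DATE(s.object_id, s.stats_id) AS last_updated
FROM sys.stats AS s
WHERE s.object_id = OBJECT_ID('dbo.Transactions')
ORDER BY last_updated;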
Post #1441607
Posted Friday, April 12, 2013 2:41 AM


SSC-Forever
Try updating stats with a full scan and then run the query with the getdates again.

To diagnose the problem, I really need to see the actual plan of that query.
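
Something along these lines (the table name is a placeholder - substitute your own):

-- Full-scan statistics update on the (hypothetical) table, then re-run the query
-- and capture the actual (post-execution) plan, which is the one that shows
-- estimated vs. actual row counts.
UPDATE STATISTICS dbo.Transactions WITH FULLSCAN;

SET STATISTICS XML ON;

SELECT COUNT(*)
FROM dbo.Transactions
WHERE TransactionDate >= DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()) - 3, 0);

SET STATISTICS XML OFF;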



Gail Shaw
Microsoft Certified Master: SQL Server 2008, MVP
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

We walk in the dark places no others will enter
We stand on the bridge and no one may pass

Post #1441609
Posted Friday, April 12, 2013 8:30 AM
Valued Member

I updated the statistics on each of the 4 tables in the query, one by one.
The first 3 updates had no impact, but when I updated the 4th table stats, it ran in 40 seconds!
But I still don't understand how this made such a difference. The actual query plan is attached...


Post Attachments:
sqlplan_actual_without_variable.sqlplan (2 views, 283.09 KB)
Post #1441741
Posted Friday, April 12, 2013 8:37 AM


SSC-Forever
I might be able to explain, if I knew which table the '4th table' was.

Is that the actual plan from before the stats update or after?



Gail Shaw
Microsoft Certified Master: SQL Server 2008, MVP
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

We walk in the dark places no others will enter
We stand on the bridge and no one may pass

Post #1441744
Posted Friday, April 12, 2013 8:47 AM
Valued Member

Oops
That would be the transactions table.
That's the actual plan after the stats update.
Now that I think of it, I didn't specify fullscan. But I think that's the default, right?
Post #1441748
Posted Friday, April 12, 2013 8:52 AM


SSC-Forever
johnmcp (4/12/2013):
Oops
That would be the transactions table.

I'll take a look a bit later if I find the time.

johnmcp (4/12/2013):
Now that I think of it, I didn't specify fullscan. But I think that's the default, right?

Depends on the size of the table; for large tables the default is a sampled scan.
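
If you want to see how much of the table the last update actually sampled (the table and index names below are placeholders):

-- STAT_HEADER shows "Rows" vs. "Rows Sampled" for a (hypothetical) statistics object.
-- A default, sampled update on a large table reports Rows Sampled well below Rows,
-- whereas FULLSCAN reads them all.
DBCC SHOW_STATISTICS ('dbo.Transactions', IX_Transactions_TransactionDate)
    WITH STAT_HEADER;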



Gail Shaw
Microsoft Certified Master: SQL Server 2008, MVP
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

We walk in the dark places no others will enter
We stand on the bridge and no one may pass

Post #1441753