SQLServerCentral is supported by Red Gate Software Ltd.
 
The Need For Speed - Upgrading Your Servers
Posted Thursday, March 25, 2004 5:46 PM


SSC-Dedicated


Group: Administrators
Last Login: Today @ 4:29 PM
Points: 32,819, Visits: 14,965
Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnis






Follow me on Twitter:
@way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #108337
Posted Monday, March 29, 2004 10:11 PM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Wednesday, January 02, 2013 2:35 PM
Points: 194, Visits: 86
Thanks for the article, Steve. It left me wondering, though.

As you say, your metrics looked good except at certain peak times for certain time zones/geographies. So what can one do to alleviate the burden of those peak times without upgrading the whole box in all dimensions (or the ones peaking)?

Perhaps the developers could identify the jobs that are sucking the most resources at those peak times and scheme up some sort of pre-processing that could be done in the time leading up to the peak.

I guess I approach this from a developer's point of view. The first thing I'd do is an over-the-shoulder analysis of the work performed by users at this peak time; it could turn out that round trips to the db could be reduced simply by small changes to application design. I have seen apps where the user ends up having to load a data-heavy screen simply to access a link or a button, so the db is doing a bunch of work to deliver data that the user doesn't actually care about. Multiply that by 5,000 seats at tax time and I guess it would save cycles simply to offer them a direct link to the info they want.

I guess there are dozens of schemes that developers can employ: pre-processing, distributed processing, or other design changes. But in the final analysis, weighing developer time versus a shiny new box, the box might be a lot cheaper.


Dave

Trainmark.com IT Training B2B Marketplace
(Jobs for IT Instructors)
Post #108913
Posted Tuesday, March 30, 2004 5:58 AM
SSCrazy


Group: General Forum Members
Last Login: Saturday, April 19, 2014 3:49 PM
Points: 2,866, Visits: 1,708

I would second the motion to investigate the peaks.

I work on a large web CMS that has a complicated caching mechanism.  If I get the CMS caching right, the database load drops, as most content gets retrieved from the cache.
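The idea can be sketched as a simple read-through cache with expiry: look in the cache first, and only go to the database on a miss. This is an illustrative Python sketch, not the CMS's actual mechanism; `fake_db_query` is a hypothetical stand-in for a real database call.

```python
import time

class ReadThroughCache:
    """Minimal read-through cache with TTL expiry (illustrative only)."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry_time)

    def get(self, key, loader):
        """Return the cached value, or call loader(key) on a miss."""
        entry = self._store.get(key)
        if entry is not None and entry[1] > time.monotonic():
            return entry[0]  # cache hit: no database round trip
        value = loader(key)  # cache miss: hit the database once
        self._store[key] = (value, time.monotonic() + self.ttl)
        return value

calls = []
def fake_db_query(key):
    calls.append(key)  # track how often the "database" is actually hit
    return key.upper()

cache = ReadThroughCache(ttl_seconds=60)
cache.get("page1", fake_db_query)
cache.get("page1", fake_db_query)  # second call is served from the cache
```

After the two `get` calls, the backing query has run only once; every repeat within the TTL is absorbed by the cache, which is where the database load reduction comes from.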

User perception is also a problem.  I remember thinking an IBM AT was fast!!

Today the performance wows them, tomorrow exactly the same performance is taken for granted.



LinkedIn Profile
Newbie on www.simple-talk.com
Post #108971
Posted Tuesday, March 30, 2004 6:49 AM
Old Hand


Group: General Forum Members
Last Login: Wednesday, April 27, 2005 2:37 AM
Points: 311, Visits: 1

Yesterday, I went to a Microsoft seminar where Mogens Nørgaard was speaking about obtaining the best performance from a DB system. The key points were:

1) Always investigate on a job/session level. Do not use overall counters.

2) Find out where precisely the time is spent.

Have a look at http://www.baarf.com/ on why not to use RAID 5.

He was hoping that there would be better "system metrics" built into SQL 2005, so that you would be able to better pinpoint where time is spent.

Regards,

Henrik Staun Poulsen

Denmark

 




Post #108983
Posted Tuesday, March 30, 2004 4:03 PM


SSC-Dedicated


Group: Administrators
Last Login: Today @ 4:29 PM
Points: 32,819, Visits: 14,965

As far as this system is concerned, it's a DW backend for Microstrategy, so many of the peaks are not tunable. End users pick a few things and it generates 50 lines of SQL (or more). So we've built it for 80-90% of the load. We just can't get the last bit without a huge upgrade, and even then, who knows.

I agree that most of your tuning is in the application and possibly indexes. I've run a couple of seminars where we really ran Profiler over a few days, capturing SQL and then grouping and sorting to find the top 10 worst performers and focusing there.  We also look at the frequency of these queries and use that to gauge whether to tune them. I might have a 2-hour query, but if it's run once a year I might not tune it. But a query that takes 2 minutes and is run 100 times a day might be something to focus on.
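The frequency-weighting argument above is simple arithmetic: what matters is total time consumed (duration × executions), not per-run time. A quick sketch using the numbers from the post (the query names are made up for illustration):

```python
# Rank queries by total annual cost (duration x executions), not per-run time.
queries = [
    {"name": "yearly report", "minutes_per_run": 120, "runs_per_year": 1},
    {"name": "lookup screen", "minutes_per_run": 2,   "runs_per_year": 100 * 365},
]

for q in queries:
    q["total_minutes"] = q["minutes_per_run"] * q["runs_per_year"]

# The 2-minute query run 100 times a day dwarfs the 2-hour once-a-year one:
# 2 * 36,500 = 73,000 minutes/year versus 120 minutes/year.
ranked = sorted(queries, key=lambda q: q["total_minutes"], reverse=True)
print(ranked[0]["name"], ranked[0]["total_minutes"])  # lookup screen 73000
```

The same grouping can be done directly over a captured Profiler trace: sum duration per normalized query text, then sort descending and tune from the top.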

The reality is that the most frequently complained-about queries are the ones we focus on, trying to tune or rework them (and the app) to improve the perception of performance.








Follow me on Twitter: @way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #109166
Posted Tuesday, April 06, 2004 5:25 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, July 28, 2004 2:51 AM
Points: 1, Visits: 1

If it's for reports, try SQL Reporting Services, where you can schedule reports to be delivered to users.

So you can run stuff overnight, over the weekend, or in off-peak periods and deliver it to users by mail.

Post #109973