Why Object Qualification is important.

  • Hi Gregory, that's worth a try. I will try that and see if I can find anything.

    Jack, thanks for answering that. I wasn't sure enough about it to say for certain that it won't have an issue.

    -Roy

  • Good article Roy.

    Jack, ad hoc queries are compiled. I believe this started in 2000; if not, then definitely in 2005.

    Also, that brings up another point: I guess this article was written for 2000, because the 2005 Profiler no longer has the ExecContextHit event (that I can see). It does have one under Stored Procedures called "Deprecated"; maybe that is it?

    Btw, when I ran my Profiler trace on 2005, I got way more cache misses in both cases than the article says. However, the bottom line is that I get twice as many lines in Profiler when I don't qualify as when I do. So it seems like a good idea to do so.

    Paul

  • Out of curiosity, does specifying the 3-part name for a stored procedure (e.g. srienstr.dbo.sp_diabetes) prevent the check of master for procedures with a prefix of 'sp_'?

    Not that I've been foolish enough to make 17 such stored procedures...


    Puto me cogitare, ergo puto me esse.
    I think that I think, therefore I think that I am.

  • Hi srienstr,

    No. Even if you provide the database and the object qualifier, it will still go through all the hassle of acquiring a lock and then releasing it.

    -Roy
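
    Roy's answer can be sketched in T-SQL. This is a minimal, hypothetical illustration (the database name srienstr and the procedure names come from srienstr's example, not from a real system): because the name starts with sp_, SQL Server checks master first even when the call is fully qualified.

    ```sql
    -- Hypothetical: a user procedure whose name starts with sp_.
    CREATE PROCEDURE dbo.sp_diabetes AS SELECT 1 AS ok;
    GO

    -- Even with a full three-part name, the sp_ prefix makes SQL Server
    -- look in master first (taking and releasing a compile lock) before
    -- falling back to the local database.
    EXEC srienstr.dbo.sp_diabetes;

    -- Dropping the sp_ prefix avoids the master lookup entirely:
    CREATE PROCEDURE dbo.usp_diabetes AS SELECT 1 AS ok;
    GO
    EXEC srienstr.dbo.usp_diabetes;  -- resolved directly in srienstr.dbo
    ```

    In practice, the usual workaround is simply to avoid the sp_ prefix for user procedures.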

  • Hi Paul,

    Yes, you are right. I had both 2005 and 2000 installed on my local box and ran the tests under the 2000 instance.

    -Roy

  • Bother. At least I have them running separately through the Job Agent rather than calling each other, so I'll only have to change them in one place. Thanks for letting me know, Roy.


    Puto me cogitare, ergo puto me esse.
    I think that I think, therefore I think that I am.

  • Fun article and great discussion. Thanks.

  • I did a quick test, roughly following Gregory's suggestion, except I did not use Profiler; I used SET STATISTICS TIME ON. The first thing I did was create a user who did not own the dbo schema, nor have dbo as its default schema, but did have SELECT permissions on the dbo schema. I wanted to make sure I was crossing schemas. I also ran DBCC DROPCLEANBUFFERS and DBCC FREEPROCCACHE between each execution. Then I ran it again, executing the DBCC statements only before the first run of each query. The query did use a covering non-clustered index as well. I was going to paste the results in here, but decided the formatting wasn't appropriate, so I attached an Excel file instead.

    A quick summary is that, if the plan is in cache, they took about the same time. When the plan was not in cache, the non-qualified query took about 33 ms to parse and compile and the qualified query took about 1 ms. This is a very big difference in performance.

    I also ran it with the user's default schema being dbo (results not in the Excel file), and the parse and compile times were the same after running the DBCC statements. This would lead me to believe that in SQL 2005, if you have only one schema, both perform the same way. I am not going to make any definitive statements based on this limited testing, though.

    I think the biggest argument for using fully qualified names in ad hoc SQL is consistency, so that you get only one plan in cache for each query instead of one for the non-qualified text and one for the qualified text. That leaves room for more distinct plans in the cache.

    Paul,

    I guess I was slightly confused by the terminology of compiled versus cached. I know that ad hoc query plans are cached and, after looking it up in BOL, realize that this is what compilation really entails. Thanks for the prompt.
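
    Jack's test harness can be sketched roughly like this (SomeTable and col1 are placeholders, not the actual objects he tested; the real queries are in the attached Excel file):

    ```sql
    -- Run as a user whose default schema is not dbo but who has
    -- SELECT permission on the dbo schema, so the unqualified
    -- reference has to cross schemas to resolve.
    SET STATISTICS TIME ON;   -- reports parse/compile and execution times

    DBCC DROPCLEANBUFFERS;    -- flush clean data pages from the buffer pool
    DBCC FREEPROCCACHE;       -- flush cached plans, forcing a fresh compile
    SELECT col1 FROM SomeTable WHERE col1 = 42;      -- non-qualified

    DBCC DROPCLEANBUFFERS;
    DBCC FREEPROCCACHE;
    SELECT col1 FROM dbo.SomeTable WHERE col1 = 42;  -- schema-qualified
    ```

    With a warm cache (skipping the DBCC statements), both versions should show similar times; the difference shows up in the "parse and compile time" figures when the plan has to be rebuilt.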

  • Hey Jack,

    Thanks for the Excel file. That's some pretty good testing. 🙂

    -Roy

  • To recap: is the consensus that using non-fully-qualified object names, including for tables, stored procedures, and other objects, carries a performance hit on both SQL 2000 and SQL 2005?

  • Sideout, yes, it is always better to call objects with their qualifier.

    -Roy

  • Good article, but perhaps a bit overreaching in its conclusions: "The best thing to do is to make sure you qualify all objects with the Schema (Also known as Owner)"

    Well, it's good advice if you're dealing with one server and a single schema, or need to optimize in specific instances. But if you're writing procedures that need to run across many schemas and servers and pick up the right tables automatically, avoiding undue hard coding makes sense.

    For example, I have several lookup tables with the same names but different data on different servers, schemas, and so on. But my stored procedures are the same across all machines. By not using dbo, I let the appropriate table get picked up at execution time. The price I pay is minimal compared to managing customized code for each server. To save compute cycles, fully qualified names could make a difference in certain circumstances, but to save programmer cycles, the substitution model could be your friend.
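
    The substitution model can be sketched as follows. This is a hypothetical illustration (schema and table names invented), relying on how SQL 2005 resolves unqualified names in ad hoc statements: the caller's default schema is checked first, then dbo. Inside a module, unqualified names resolve against the module's schema instead, so the exact setup matters.

    ```sql
    -- Hypothetical: the same table name in two schemas, different data.
    CREATE TABLE siteA.LookupCodes (Code int, Label varchar(50));
    CREATE TABLE siteB.LookupCodes (Code int, Label varchar(50));
    GO

    -- The same unqualified ad hoc statement run by two different users:
    --   a caller whose default schema is siteA reads siteA.LookupCodes;
    --   a caller whose default schema is siteB reads siteB.LookupCodes.
    SELECT Code, Label FROM LookupCodes;
    ```

    The trade-off is the one discussed earlier in the thread: each resolution path can compile its own plan, and the unqualified text costs a little more to resolve when the plan is not cached.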

  • sjsubscribe (3/10/2008)


    Well, it's good advice if you're dealing with one server and a single schema, or need to optimize in specific instances. But if you're writing procedures that need to run across many schemas and servers and pick up the right tables automatically, avoiding undue hard coding makes sense.

    For example, I have several lookup tables with the same names but different data on different servers, schemas, and so on. But my stored procedures are the same across all machines. By not using dbo, I let the appropriate table get picked up at execution time. The price I pay is minimal compared to managing customized code for each server. To save compute cycles, fully qualified names could make a difference in certain circumstances, but to save programmer cycles, the substitution model could be your friend.

    Do you have different lookup tables in the same database with different owners, or are they owned by dbo but in different databases/on different servers? If you need different results within the same database, then the substitution method makes sense, but using the two-part name wouldn't impact the functionality you're looking for if the relevant tables are all owned by dbo in different databases.

    Three-part naming is obviously right out in your situation, but my understanding is that three-part naming has no performance advantage over the more flexible two-part naming.


    Puto me cogitare, ergo puto me esse.
    I think that I think, therefore I think that I am.

  • Do you have different lookup tables in the same database with different owners, or are they owned by dbo but in different databases/on different servers?

    Yes.

    The best thing to do is to make sure you qualify all objects with the Schema...

    No. It's not the best thing.

  • sjsubscribe (3/10/2008)


    Do you have different lookup tables in the same database with different owners, or are they owned by dbo but in different databases/on different servers?

    Yes.

    Are you saying that both apply, or was I unclear in phrasing my question?

    The best thing to do is to make sure you qualify all objects with the Schema...

    No. It's not the best thing.

    While basically everything falls into "it depends", it is still often, if not quite universally, advisable. 32 ms of compile time may be insignificant, and a few ms on each run may be acceptable, but unless there is a specific need to choose a less efficient method, the efficient one is preferred.


    Puto me cogitare, ergo puto me esse.
    I think that I think, therefore I think that I am.
