
Performance Effects of NOCOUNT


Although I have long known that there is a performance benefit in using SET NOCOUNT ON and in qualifying objects with their owners, I had never actually established benchmarks for either of these two practices.

SET NOCOUNT ON gives a performance boost to action queries by suppressing the "(n row(s) affected)" message that results from running a query.
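As a quick illustration (using the dbo.TestTable that is created later in this article):

-- With the default SET NOCOUNT OFF every statement reports its row count
INSERT INTO dbo.TestTable(TestDesc)
VALUES ('demo')
-- the client receives: (1 row(s) affected)

SET NOCOUNT ON
INSERT INTO dbo.TestTable(TestDesc)
VALUES ('demo')
-- no row count message is sent back to the client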

Qualifying an object with its owner boosts performance because SQL Server does not have to check whether there is a user-specific version of the same object.

It also gives benefits in the caching of execution plans.
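The difference is simply in how the object is referenced when it is called. A minimal sketch, using a hypothetical procedure name:

-- Unqualified: SQL Server first checks for a version of the object
-- owned by the calling user before falling back to the dbo-owned one
EXEC GetOrderCount

-- Owner-qualified: the lookup resolves directly to dbo.GetOrderCount
EXEC dbo.GetOrderCount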

For more details on the effects of not qualifying an object with its owner, see Chris Hedgate's article from the "Worst Practices" series.

Andy Warren and Steve Jones also contributed articles, and I recommend reading all the articles in the series.

This brief article describes my experiments with each of these two settings.

Lies, damn lies and statistics

The ability to provide an experiment that can be repeated and produce similar results is the cornerstone of good scientific method.

As there are many criteria for measuring SQL Server performance, each with its own positive and negative points, I am including my methodology to enable you to test my results in your own environment.

One thing you will observe is that no set of benchmarks should be taken at face value; they must be tested in your own particular environment to determine their relevance to your setup.

[Figure: the classic normal curve]

When I first tried to create a set of benchmarks against SQL Server, not just for this article, I assumed that repeating a particular action would produce times that fell in the classic bell-shaped curve, with the average time forming the top of the bell.

This is known as the central limit theorem.

I quickly found that this wasn't the case and that although the performance generally fell within a range of values there appeared to be no particular pattern to the results.

Methodology

My experiment was carried out on a stand-alone server running SQL Server 2000. There was no network connectivity to the server, and the experiments were carried out using SQL Server Management Studio.

My first task was to set up two tables, with separate stored procedures to populate them, as follows:

CREATE TABLE dbo.TestTable(
    TestId INT IDENTITY(-2147483648,1) NOT NULL
        CONSTRAINT PK_TestTable PRIMARY KEY CLUSTERED,
    TestDesc VARCHAR(50) NOT NULL
)
GO
CREATE TABLE dbo.TestTable2(
    TestId INT IDENTITY(-2147483648,1) NOT NULL
        CONSTRAINT PK_TestTable2 PRIMARY KEY CLUSTERED,
    TestDesc VARCHAR(50) NOT NULL
)
GO
-- Pair 1: identical apart from owner-qualification of the table
CREATE PROC dbo.AddTestWithNodbo
    @TestDesc VARCHAR(50)
AS
SET NOCOUNT ON
INSERT INTO TestTable(TestDesc)      -- unqualified reference
VALUES (@TestDesc)
RETURN @@ROWCOUNT
GO
CREATE PROC dbo.AddTestWithdbo
    @TestDesc VARCHAR(50)
AS
SET NOCOUNT ON
INSERT INTO dbo.TestTable2(TestDesc) -- owner-qualified reference
VALUES (@TestDesc)
RETURN @@ROWCOUNT
GO
-- Pair 2: identical apart from the SET NOCOUNT setting
CREATE PROC dbo.AddTestWithNoCount
    @TestDesc VARCHAR(50)
AS
SET NOCOUNT ON
INSERT INTO dbo.TestTable(TestDesc)
VALUES (@TestDesc)
RETURN @@ROWCOUNT
GO
CREATE PROC dbo.AddTestWithoutNoCount
    @TestDesc VARCHAR(50)
AS
INSERT INTO dbo.TestTable2(TestDesc)
VALUES (@TestDesc)
RETURN @@ROWCOUNT
GO

As you can see, the procedures are virtually identical, each pair differing only in whether it uses SET NOCOUNT ON or whether it qualifies objects with their owners.

All generate identical execution plans.
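If you want to verify this yourself, one approach (a sketch using SHOWPLAN_TEXT, which compiles statements and returns their estimated plans without executing them) is:

SET SHOWPLAN_TEXT ON
GO
EXEC dbo.AddTestWithNoCount 'AAAA'
EXEC dbo.AddTestWithoutNoCount 'AAAA'
GO
SET SHOWPLAN_TEXT OFF
GO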

My next task was to set up a script that would execute a fixed number of iterations of each stored procedure and measure the time taken to carry out those iterations.

DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE
TRUNCATE TABLE dbo.TestTable
TRUNCATE TABLE dbo.TestTable2
GO
DECLARE @StartTime1 DATETIME ,
        @EndTime1 DATETIME ,
        @Loop INT ,
        @MaxLoop INT

SET @Loop = 0
SET @MaxLoop = 10000
SET @StartTime1 = GETDATE()

WHILE @Loop < @MaxLoop
BEGIN
    EXEC dbo.AddTestWithNoCount 'AAAA'
    SET @Loop = @Loop + 1
END

SET @EndTime1 = GETDATE()
SELECT DATEDIFF(ms, @StartTime1, @EndTime1) AS DurationMS

I ran the script above 10 times, recording the execution time in milliseconds each time. I then replaced exec dbo.AddTestWithNoCount 'AAAA' with exec dbo.AddTestWithoutNoCount 'AAAA' and reran the script a further 10 times.

The results are shown in the graph below.

[Figure: iteration results for SET NOCOUNT]

As you can see, SET NOCOUNT OFF is consistently slower than SET NOCOUNT ON. All in all, the results for this experiment reveal a 3% performance advantage in having SET NOCOUNT ON.

I repeated the experiment several times, sometimes starting with iterations of exec dbo.AddTestWithoutNoCount 'AAAA' and sometimes with iterations of exec dbo.AddTestWithNoCount 'AAAA', to ensure that I was not introducing bias into the experiment. Overall the performance advantage remained around the 3% margin.

Points to consider

The performance boost is due to the few bytes of information that make up the "(1 row(s) affected)" message not being transmitted to the client application.

With this in mind, the following points should be considered:

  • Communication between the database and the client application on a stand-alone machine is as fast as it is possible to get. If your front-end application had its own clock, and you recorded the time from submitting the query to the time when the client finally received the full results, I would expect transmitting results across a network to be slower.
  • In this experiment we are carrying out a single simple INSERT statement. If your procedure carries out multiple operations the performance boost will be more pronounced.
  • For queries that retrieve data the performance boost will be smaller, simply because the size of the "(1 row(s) affected)" message is small compared to the volume of data being returned.
  • In .NET applications an ExecuteNonQuery call returns the number of records affected by the operation. SET NOCOUNT ON suppresses that count, so the call returns -1 rather than the true number of rows; if you need the count, return it explicitly, as in the sketch after this list.
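Note that SET NOCOUNT ON only suppresses the message sent to the client; @@ROWCOUNT is still populated within the batch, which is why the procedures above can still RETURN it. A minimal sketch, using the dbo.TestTable created earlier:

SET NOCOUNT ON
DECLARE @rc INT
INSERT INTO dbo.TestTable(TestDesc)
VALUES ('demo')
SET @rc = @@ROWCOUNT  -- still 1, even though no message reached the client
SELECT @rc AS RowsAffected

A client that cannot rely on ExecuteNonQuery's return value can instead read a value returned like this, or use the procedure's RETURN code.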

Qualified objects vs non-qualified objects

For my experiment to compare qualified and non-qualified objects I used the same methodology as for my SET NOCOUNT experiment; however, I explicitly logged on to the server as a non-dbo user with rights to execute both stored procedures.

When calling the dbo.AddTestWithNodbo stored procedure I deliberately ran
exec AddTestWithNodbo
rather than
exec dbo.AddTestWithNodbo.

My initial results showed a performance improvement of 0.3% when qualifying an object with its owner. Given the non-committal results I reran the experiment several times. In some cases the difference in performance was virtually nothing, in others there was a slight bias towards qualified objects, and once the unqualified procedure ran slightly faster.

In short, my personal view is that qualifying objects with their owner is something that should be done for the sake of being explicit about what you are asking SQL Server to do, rather than for performance reasons.

Conclusions

When deciding whether to qualify objects with their owners, the main point to consider is how many different logins access objects in your database. In my experiment I had a single non-dbo user, but a more realistic environment may have multiple users.

Chris's article mentions benefits gained by accessing cached execution plans. With multiple users hitting the system I suspect that the benefits of this would be more pronounced.
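One way to observe the caching behaviour (a sketch against the SQL Server 2000 syscacheobjects system table; the uid column distinguishes per-user plans created by unqualified calls) is:

-- Have two different non-dbo users each run: exec AddTestWithNodbo 'AAAA'
-- then inspect the plan cache
SELECT cacheobjtype, objtype, uid, usecounts, sql
FROM master.dbo.syscacheobjects
WHERE sql LIKE '%AddTestWithNodbo%'
-- Unqualified calls may produce one cached plan per uid,
-- whereas owner-qualified calls can share a single plan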

In my SET NOCOUNT experiment the difference in performance between SET NOCOUNT ON and OFF over 10,000 iterations was measured in milliseconds rather than seconds. If your system is not busy then this is small beer; however, in highly stressed systems every millisecond consumed by one process is a resource denied to another. As I said earlier, my results were gained by measuring multiple iterations of a single operation. The performance gain on more complicated procedures may be more pronounced.
