September 9, 2008 at 3:22 am
Hello,
One of my clients sees a difference in performance between an application running on a terminal server and the same application on a fat client. I have been tracing and found the following:
I traced the SQL:BatchCompleted event, and the trace shows that the duration of most queries is longer when the fat client is used. I want to be able to explain the difference.
What exactly is measured by SQL:BatchCompleted? Does the duration include the network time to get an acknowledgement from the client, or is it only the time SQL Server needs to handle the query?
I hope someone has an answer. Thanks in advance.
September 9, 2008 at 6:48 am
BatchCompleted and RPC:Completed measure the time on the SQL Server. They don't necessarily include network time. The reason I put a caveat there is that very large result sets, or transactions that include client-side processing, can be affected by things outside SQL Server.
As to your issue, I'd check that the connection settings for both applications are the same. I'll bet one of them is not using the same options as the other. You can check the connection settings using the Sessions:ExistingConnection event in Profiler; it lists all the settings in the TextData column. Here's an example:
-- network protocol: LPC
set quoted_identifier off
set arithabort off
set numeric_roundabort off
set ansi_warnings on
set ansi_padding on
set ansi_nulls on
set concat_null_yields_null on
set cursor_close_on_commit off
set implicit_transactions off
set language us_english
set dateformat mdy
set datefirst 7
set transaction isolation level read committed
Just compare the two.
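If catching the connections in Profiler is awkward, you can also read the same options from the sys.dm_exec_sessions DMV (SQL Server 2005 and later) and compare the two sessions side by side. A sketch — the session_id values 52 and 53 are placeholders; substitute the SPIDs of the terminal-server and fat-client connections:

```sql
-- Compare the SET options of two live sessions side by side.
-- 52 and 53 are placeholder session IDs; replace them with the
-- SPIDs of the two connections you want to compare.
SELECT session_id,
       program_name,
       quoted_identifier,
       arithabort,
       ansi_nulls,
       ansi_padding,
       ansi_warnings,
       concat_null_yields_null,
       transaction_isolation_level,
       [language],
       date_format,
       date_first
FROM sys.dm_exec_sessions
WHERE session_id IN (52, 53);
```

Any column where the two rows differ is a setting the two applications are connecting with differently.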
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning