Best index structure for table that holds a large amount of log data
Posted Wednesday, October 31, 2012 1:36 AM
SSC Rookie

Group: General Forum Members
Last Login: Wednesday, January 9, 2013 3:27 PM
Points: 41, Visits: 51
Folks
I have an ASP.NET solution that needs to log every time a method is invoked (that part I have done).
I am logging this activity to a table.
I constructed the table with an identity field that is also the clustered primary key.

When we do load testing and the number of clients is greater than 20, I get errors (which I also log, to another table).
The error is "The underlying provider failed on Commit"
Inner Exception is: "Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding"

I also run SQL Profiler traces during these tests.

Under 10 users, inserts are under 1500 ms.
Over 20 users, inserts are about 6000-9000 ms.
At 50 users, inserts are over 30000 ms.

This table is never read by the application, and it has only one index: the clustered index.

So how can I index this table to reduce the insert time and avoid this exception?
Or should the table have no index at all?

regards

GregJF
Post #1379175
Posted Wednesday, October 31, 2012 2:05 AM
SSC Eights!

Group: General Forum Members
Last Login: Friday, July 18, 2014 9:09 AM
Points: 870, Visits: 2,385
If you need an index, look at creating a non-clustered index rather than a clustered index, with a high fill factor (90-95%) to prevent a large number of page splits occurring.
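A sketch of that suggestion, assuming a hypothetical table and column names since the actual schema isn't shown in this post:

```sql
-- Hypothetical log table: nonclustered PK instead of a clustered one,
-- built with a high fill factor to leave free space on index pages
-- and reduce page splits under concurrent inserts.
CREATE TABLE dbo.MethodLog
(
    MethodLogID int IDENTITY(1,1) NOT NULL,
    MethodName  nvarchar(120)     NOT NULL,
    LoggedAt    datetime2(7)      NOT NULL,
    CONSTRAINT PK_MethodLog PRIMARY KEY NONCLUSTERED (MethodLogID)
        WITH (FILLFACTOR = 90)   -- the 90-95% range suggested above
);
```

Note that fill factor only matters when pages are revisited (rebuilds, mid-index inserts); as discussed further down the thread, an ever-ascending IDENTITY key inserts only at the end of the index, so opinions on this differ.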

_________________________________________________________________________
SSC Guide to Posting and Best Practices
Post #1379183
Posted Wednesday, October 31, 2012 2:05 AM


SSC-Forever


Group: General Forum Members
Last Login: Today @ 3:14 AM
Points: 42,459, Visits: 35,520
1500ms is incredibly high for an insert, unless you're inserting a million or so rows. Can you post more details, including an execution plan?


Gail Shaw
Microsoft Certified Master: SQL Server 2008, MVP
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

We walk in the dark places no others will enter
We stand on the bridge and no one may pass

Post #1379184
Posted Wednesday, October 31, 2012 8:14 PM
SSC Rookie


Group: General Forum Members
Last Login: Wednesday, January 9, 2013 3:27 PM
Points: 41, Visits: 51
Gail
Yes, and the 1500ms is just me filtering out quick-running (i.e. < 1500ms) queries.
I have hundreds of inserts for this command (sp_executesql from .NET EF4).

I run the Load Test for 85 mins.
Twenty-five minutes into it, I start getting these timeout errors (the test emulates 10 users for a time, then 20, then 30, on up to 50).
Also at that time, the inserts into this table get longer and longer:
they start at 7000ms, then 8000ms, then 9000ms, then jump to 20000-22000ms, then to 30000-35000ms.

I have attached the plan and the prepared statement, but remember: this works fine except between 25 and 26 min.
The last 30 mins (between 55 mins and 85 mins) has 50 "robot" users doing lots of queries and inserts.

Load: 175,511 inserts in the 85 mins, of which 802 were between 25 and 26 min.
Long running: 200 took > 700ms to insert, of which 40 were between 10000ms and 25000ms and 15 were greater than 25000ms (all of these at 25 minutes into the Load Test).
Errors: 49 (all at 25 minutes into the Load Test, when the user load went from 10 users to 20 users).

It seems that it may be something else.
So what should I look for?

I have a full SQL Profiler trace of all actions from that application against that database, as well as the trace that isolates queries greater than 1500ms.
I also have PerfMon captures on the SQL Server and the two "client" computers.
regards

Greg Frazer


  Post Attachments 
QueryLCA.txt (8 views, 1.77 KB)
LogACA.sqlplan (4 views, 24.28 KB)
Post #1379587
Posted Thursday, November 1, 2012 12:32 AM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 6:43 PM
Points: 36,752, Visits: 31,208
The code is inserting into the table and then unnecessarily reading from the table. Try the following, instead...

exec sp_executesql N'insert [dbo].[LogApplicationControllerAction]([ApplicationName], [ControllerName], [ActionName], [RouteData], [RequestSessionID], [SerialisedType], [LogData], [ReceivedTime], [CreatedBy], [CreatedDate], [UpdatedBy], [UpdatedDate])
values (@0, @1, @2, @3, null, @4, @5, @6, @7, @8, @9, @10)
select [LogApplicationControllerActionID] = scope_identity()',N'@0 nvarchar(300),@1 nvarchar(120),@2 nvarchar(120),@3 nvarchar(max) ,@4 nchar(270),@5 nvarchar(max) ,@6 datetime2(7),@7 nvarchar(128),@8 datetime2(7),@9 nvarchar(128),@10 datetime2(7)',@0=N'DnBConnect',@1=N'search',@2=N'ShowCompanyImage',@3=N'',@4=N'DnB.Connect.MessageContentLibrary.ControllerActionLog

If you can't get the ORM to use the code above, write a CRUD stored proc in a manner similar to the above and call it instead.
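A minimal sketch of such a proc, assuming the table shown in the query above; the parameter list is abbreviated to three of the twelve columns, and the proc name is hypothetical:

```sql
-- Sketch only: wrap the insert in a proc so the application makes one
-- parameterized call and gets the new ID back via an OUTPUT parameter
-- instead of a separate SELECT round trip.
CREATE PROCEDURE dbo.LogApplicationControllerAction_Insert
    @ApplicationName nvarchar(300),
    @ControllerName  nvarchar(120),
    @ActionName      nvarchar(120),
    @NewID           int OUTPUT
AS
BEGIN
    SET NOCOUNT ON;

    INSERT dbo.LogApplicationControllerAction
        (ApplicationName, ControllerName, ActionName)
    VALUES
        (@ApplicationName, @ControllerName, @ActionName);

    SET @NewID = SCOPE_IDENTITY();  -- captured in-batch, no extra result set
END;
```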

And, no... I wouldn't change the clustered index you have, except that you should consider changing the FILLFACTOR to 100. There are no updates to this audit table, and the CI on the IDENTITY column will actually prevent page splits. There's no sense in saving blank space on a page for such a thing.
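The FILLFACTOR change would be a rebuild along these lines (the constraint name is hypothetical; check the actual PK name on the table first):

```sql
-- Rebuild the existing clustered PK at 100% fill. Inserts on an
-- ascending IDENTITY key always land on the last page of the index,
-- so reserving free space on earlier pages just wastes room.
ALTER INDEX PK_LogApplicationControllerAction
    ON dbo.LogApplicationControllerAction
    REBUILD WITH (FILLFACTOR = 100);
```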

Something else to make sure of, if you haven't looked in a while... make sure no one screwed up and put a trigger or FK on this audit table. You might want to double-check whether someone added another index or two, as well. I can't tell from here, but another thing to look for is whether someone put an indexed (materialized) view on the table. Like Gail said, a duration of 1500ms is terribly high for a single insert, and the things I listed could be the cause.
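Those checks can be run against the catalog views; a sketch, using the table name from the query above:

```sql
-- Triggers defined on the audit table
SELECT name
FROM sys.triggers
WHERE parent_id = OBJECT_ID('dbo.LogApplicationControllerAction');

-- Foreign keys defined on it or pointing at it
SELECT name
FROM sys.foreign_keys
WHERE parent_object_id     = OBJECT_ID('dbo.LogApplicationControllerAction')
   OR referenced_object_id = OBJECT_ID('dbo.LogApplicationControllerAction');

-- All indexes on the table (expect only the clustered PK)
SELECT name, type_desc
FROM sys.indexes
WHERE object_id = OBJECT_ID('dbo.LogApplicationControllerAction');

-- Views that reference the table and carry an index of their own
SELECT v.name
FROM sys.views AS v
JOIN sys.sql_expression_dependencies AS d
  ON d.referencing_id = v.object_id
WHERE d.referenced_id = OBJECT_ID('dbo.LogApplicationControllerAction')
  AND EXISTS (SELECT 1 FROM sys.indexes AS i WHERE i.object_id = v.object_id);
```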

One other thing. Have you checked the data types of the query parameters against the data types of the actual table columns? A mismatch could be yet another source of lost milliseconds, although it shouldn't be that significant for an unjoined insert.

(Note: all the questions above could be answered if I could see the query plan you attached. I can't, because I'm working from a 2k5 machine right now.)


--Jeff Moden
"RBAR" is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1379640