Severe but intermittent performance problem

  • I have an application that processes text files, extracts some data, and stores the data in a SQL-Server database. All the work with SQL-Server is done via dynamic SQL from a C# application. The app has been running for several weeks now, and generally works well, processing a file in 10 - 15 seconds (adequate for our needs).

    The last couple of weeks, the application has slowed to a crawl. The effect has started on a Monday and lasted from 1 to 3.5 days (and it is still going on at the moment). SQL-Server's performance drops dramatically. At first I thought the system had hung, but it is still processing the files; it just takes 45 minutes to an hour to process each one.

    The error logs are not reporting any problems. When I check the current processes, there are no conflicting locks or blocked processes. In fact, the only processes noted are from the application, EM, and the standard system processes. You can watch the individual insert processes from the application work their way through, but each one takes several seconds.

    The server machine is used for a number of purposes. But Task Manager does not show any process where the processor, memory, or disk load appears to be sufficient to be causing these problems.

    Any help on how I might determine the cause of the problem will be greatly appreciated.

  • At first glance, what I would check is the indexes and their fragmentation via DBCC SHOWCONTIG.
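
    A minimal sketch of that check (the table name is just an example); the figures to watch in the output are Scan Density and Logical Scan Fragmentation:

        -- Report fragmentation for one table and all of its indexes
        -- (dbo.ImportedData is a made-up name - substitute your own)
        DBCC SHOWCONTIG ('dbo.ImportedData') WITH ALL_INDEXES

        -- If Scan Density is low, rebuilding the indexes usually fixes it
        DBCC DBREINDEX ('dbo.ImportedData')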

    Also try to trace the file loading processes.

    I don't know what you are doing during the file-load process, but have you considered using BCP or DTS to load the data (possibly into a work table), and afterwards doing the data processing within the database itself?
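
    A minimal sketch of that staging-table approach, using BULK INSERT here as the in-database cousin of command-line bcp (table name, file path, and terminator are assumptions):

        -- Hypothetical work table holding the raw lines from the text file
        CREATE TABLE dbo.FileStaging
        (
            RawLine varchar(8000) NOT NULL
        )

        -- Load the whole file in one set-based operation instead of row-by-row inserts
        BULK INSERT dbo.FileStaging
        FROM 'C:\Import\datafile.txt'
        WITH (ROWTERMINATOR = '\n')

        -- Then parse and transform from dbo.FileStaging into the real tables with set-based SQL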

    And as a hint: it is never a good idea to use your SQL Server machine for other tasks as well.



    Bye
    Gabor

  • In addition to checking indexes and fragmentation:

    One of our clients once experienced what you describe. In their case, it was caused by not updating statistics. As a result, the Query Optimizer was happily using old plans, created for tables with few records. They created a maintenance plan to update statistics and reorganize data/indexes at night, and now the client is as happy as a clam.
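
    For reference, a minimal sketch of what such a nightly maintenance job can run (the table name is just an example):

        -- Refresh the statistics the Query Optimizer uses to build plans
        UPDATE STATISTICS dbo.ImportedData WITH FULLSCAN

        -- ...or refresh statistics for every table in the database in one go
        EXEC sp_updatestats

        -- Rebuild the indexes to reorganize the data and remove fragmentation
        DBCC DBREINDEX ('dbo.ImportedData')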

    Leif

  • Indexes and statistics are definitely worth checking. However, I would also consider moving the text-file processing code into a stored procedure. Even if you keep it as dynamic SQL, you get a bit more plan caching.
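
    A minimal sketch of the idea (procedure, table, and column names are made up); a parameterized procedure gets one cached plan instead of a new plan for every dynamic SQL string:

        -- One cached, reusable plan for the per-row insert
        CREATE PROCEDURE dbo.usp_InsertExtractedRow
            @FileName   varchar(255),
            @FieldValue varchar(1000)
        AS
            INSERT INTO dbo.ImportedData (FileName, FieldValue)
            VALUES (@FileName, @FieldValue)
        GO

    From the C# side you would then call it with CommandType.StoredProcedure rather than building an INSERT string for each row.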

    Michael

  • Thanks to you all. Rebuilding the indexes and updating the stats seem to have cured most, if not all, of the trouble. Further testing will tell for sure. And it looks like we may be able to move the system to a dedicated server. Thanks again for your help.

  • If your SP takes a huge amount of time to execute, then check these options; they may help you:

    1- Take care that your SQL Server doesn't have any other programs running alongside it, so the machine has only SQL Server installed.

    2- Recompile your stored procedure (see the sketch after this list).

    3- Update statistics on the tables or views that this stored procedure uses.

    4- Reindex your tables, so your queries get faster.

    5- Add any other index your tables need to get better performance.

    6- With each table name in your SELECT query, add the hint "WITH (NOLOCK)", so you read the data as it currently sits (which may include changes from transactions that are not yet committed), you don't wait for locks to be freed, and you don't take locks on those tables yourself.
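
    A minimal sketch of items 2 through 6 (procedure, table, column, and index names are made up):

        -- 2) Mark the procedure so it gets a fresh plan on its next execution
        EXEC sp_recompile 'dbo.usp_InsertExtractedRow'

        -- 3) Refresh statistics on the tables the procedure touches
        UPDATE STATISTICS dbo.ImportedData WITH FULLSCAN

        -- 4) Rebuild the table's indexes
        DBCC DBREINDEX ('dbo.ImportedData')

        -- 5) Add an index that matches a frequent search predicate
        CREATE INDEX IX_ImportedData_FileName ON dbo.ImportedData (FileName)

        -- 6) Read without taking or waiting on locks (accepts dirty reads)
        SELECT FileName, FieldValue
        FROM dbo.ImportedData WITH (NOLOCK)
        WHERE FileName = 'example.txt'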

    I hope this helps you.


    Alamir Mohamed
    Alamir_mohamed@yahoo.com
