Does log size have any impact on the performance of a database?

  • If you are loading often, you're creating a hot spot on the disks. Logs could be an issue here.

    I'd get the logs onto separate disks; a small RAID 1 array could help here.

    I'd also consider a separate filegroup and load the data into tables there, then use SQL to move it over. You could put that separate filegroup on its own disk array as well.

    Ultimately, I think Gail's partitioning strategy might be the best thing, swapping partitions in and out as you load (a rough sketch of the idea follows).
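    For illustration only, here is a minimal sketch of that swap-in/swap-out idea using partition switching. The table names, partition numbers, and file path below are hypothetical, not from this thread; the staging and archive tables are assumed to match the partitioned table's structure exactly.

        -- Hypothetical sliding-window sketch (names are made up for illustration).
        -- Readings_Stage must live on the same filegroup as the target partition
        -- and carry a CHECK constraint matching that partition's boundary.

        -- 1. Bulk load the new batch into an empty staging table.
        BULK INSERT dbo.Readings_Stage
        FROM 'C:\loads\current_batch.dat'
        WITH (TABLOCK);

        -- 2. Switch the staging table into the target partition; this is a
        --    metadata-only operation, so the main table stays available.
        ALTER TABLE dbo.Readings_Stage
        SWITCH TO dbo.Readings PARTITION 5;

        -- 3. To age out old data, switch the oldest partition out to an archive
        --    table and truncate it, instead of deleting row by row.
        ALTER TABLE dbo.Readings
        SWITCH PARTITION 1 TO dbo.Readings_Archive;
        TRUNCATE TABLE dbo.Readings_Archive;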

  • Chris Harshman (9/27/2008)


    OK, so we have new information; I didn't realize before that you were doing bulk loads every minute. How are your disks set up (RAID level, etc.)? How is your I/O performing? There could be an I/O problem preventing the system from having good performance. Do you ever delete records from this database? You mentioned before that you only keep the last 2 days of data. Deletes could become very slow on this kind of setup.

    Thanks, Chris, for the reply.

    Inserts happen every 30 seconds, 100 records per batch. It runs 24x7 with no stoppage time. The server has a 3.3 GHz dual-core CPU, 4 GB of RAM, and 160 GB of disk.

    Yes, I do delete the records every day at around midnight. The whole process takes around 8 minutes.

    I transfer the records to the appropriate table and then delete them. This is done using SSIS.

    I have seen the reply about using partitions, but that's not the remedy.

    Is there any other way?
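    Just as a hedged illustration (the table and column names dbo.CurrentData, dbo.HistoryData, and LoadDate are made up, not from this thread), the midnight move-and-delete could also be done in small batches so that no single transaction holds locks or log space for the whole 8 minutes:

        -- Hypothetical batched purge: move expired rows to a history table in
        -- chunks, each in its own short transaction. Assumes dbo.HistoryData has
        -- the same columns as dbo.CurrentData and no identity column.
        DECLARE @BatchSize int, @Cutoff datetime, @Rows int;
        SET @BatchSize = 5000;
        SET @Cutoff = DATEADD(day, -2, GETDATE());
        SET @Rows = 1;

        WHILE @Rows > 0
        BEGIN
            BEGIN TRANSACTION;

            DELETE TOP (@BatchSize)
            FROM dbo.CurrentData
            OUTPUT deleted.* INTO dbo.HistoryData
            WHERE LoadDate < @Cutoff;

            SET @Rows = @@ROWCOUNT;  -- capture before COMMIT resets it

            COMMIT TRANSACTION;
        END;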

  • Hemalatha (10/3/2008)


    I have seen the reply about using partitions, but that's not the remedy.

    Why are partitions not appropriate?

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Even I am doing the same task, inserting lakhs (hundreds of thousands) of rows into the database, but I am facing a deadlock problem while inserting that much data. Can you please help me understand how you are inserting 10 lakh rows without getting any deadlocks?
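    As a general, hedged sketch only (and a question better suited to its own thread, per the reply below): deadlocks during concurrent bulk inserts are often eased by keeping each batch in its own short transaction and touching tables in a consistent order. The names here (dbo.Readings, dbo.Readings_Stage) are made up for illustration:

        -- Hypothetical pattern: one small batch per transaction, rows selected
        -- in a consistent key order, so concurrent loaders hold fewer locks
        -- for less time and are less likely to deadlock each other.
        SET XACT_ABORT ON;

        BEGIN TRANSACTION;

        INSERT INTO dbo.Readings (DeviceId, ReadingTime, ReadingValue)
        SELECT DeviceId, ReadingTime, ReadingValue
        FROM dbo.Readings_Stage
        ORDER BY DeviceId, ReadingTime;  -- consistent ordering across loaders

        COMMIT TRANSACTION;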

  • Please post new questions in a new thread. Thanks.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
