simultaneous inserts

  • Hi,

    I have a log table with around 30 columns in database DB1.
    Stored procedures in 100 other databases, DB2 - DB101, insert into this log table as and when they run.
    In the worst case there could be 100 simultaneous inserts into this one log table.
    During its execution, each SP would insert 1, 10, 100, or even 500 records into the log.

    Will SQL Server handle this by queuing each SP's insert request and executing them one by one, or will there be any challenges?
    All the databases are on the same server.

    Thanks

  • SQL Server can handle this, but there isn't queueing per se. Instead, you potentially get blocking at the page level. In order to perform an insert, a process acquires a lock on the page(s) where the data is written. Anyone else trying to insert into this table is blocked and delayed until the lock is released. At that point, the next process can acquire its lock and perform its insert.

    This shows up in the output of process monitoring tools (sp_who2 is commonly used) as a blocking tree, with each process blocked by another.
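    If you prefer the DMVs over sp_who2, a sketch along these lines lists blocked sessions and their blockers (assumes SQL Server 2005 or later, where sys.dm_exec_requests is available):

    ```sql
    -- Sketch: show who is blocked and by whom, via the DMVs
    -- (an alternative to reading the BlkBy column in sp_who2)
    SELECT r.session_id,
           r.blocking_session_id,
           r.wait_type,
           r.wait_time,                         -- milliseconds spent waiting
           DB_NAME(r.database_id) AS database_name
    FROM sys.dm_exec_requests AS r
    WHERE r.blocking_session_id <> 0;           -- only sessions currently blocked
    ```

    Walking the blocking_session_id values up the chain gives you the blocking tree described above.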

    If you exceed the lock escalation threshold, a process could end up taking a table lock, which could potentially block even more sessions.
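    On SQL Server 2008 and later you can prevent that escalation on a per-table basis. A sketch (the table name is illustrative, not from your schema):

    ```sql
    -- Sketch: stop the engine from escalating row/page locks on the
    -- shared log table into a single table lock under heavy insert load
    ALTER TABLE dbo.AppLog SET (LOCK_ESCALATION = DISABLE);
    ```

    Note this trades the table lock for more row/page locks held in memory, so it is a judgment call rather than a free win.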

    The lock is needed for the insert to take place and for indexes to be updated. You can speed things up by keeping indexes on the log table to a minimum and using fast storage for the transaction log. Most inserts are hardened to disk in the transaction log and made to the table's pages in memory, which are flushed to disk later.
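    To illustrate the "minimal indexes" point, a sketch of an append-only log table (all names are illustrative): a single ever-increasing clustered key means each insert lands on the last page, and there are no extra nonclustered indexes to maintain on every write.

    ```sql
    -- Sketch: a lean, append-only log table (names are illustrative)
    CREATE TABLE dbo.AppLog
    (
        LogId    bigint IDENTITY(1,1) NOT NULL,   -- ever-increasing clustered key
        LoggedAt datetime2(3) NOT NULL
            CONSTRAINT DF_AppLog_LoggedAt DEFAULT (SYSUTCDATETIME()),
        SourceDb sysname        NOT NULL,         -- which of DB2 - DB101 wrote it
        Message  nvarchar(4000) NULL,
        CONSTRAINT PK_AppLog PRIMARY KEY CLUSTERED (LogId)
    );
    ```

    One caveat: with everyone appending to the end, the last page itself becomes the hotspot, which is exactly the page-level contention described above, so keep the rows narrow and the writes short.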

    The challenge is workload here and resources to manage this. Just like having hundreds of users trying to insert records, you need enough hardware allocated to handle your load.
