reduce performance impact on the OLTP SQL database

  • Hi,

    Our client occasionally needs to load very large transaction files into their OLTP SQL database during periods of heavy "live" API load. How would I structure these imports to reduce the performance impact on the OLTP SQL database?

    Thanks

  • The short answer is: it depends, on far too many factors for anyone to say definitively what would be best for your environment.

    Generally speaking, you'll want to keep blocking to a minimum in any high-volume transaction processing system. That usually means limiting the number of rows you apply to your transactional tables within a single transaction, even when the source is a large file. Some systems avoid set-based processing entirely: every row enters the system one at a time, whether it arrived in a file or from an interactive user hitting a front end, and all data may even flow through the same set of interface stored procedures, each of which processes a single row.
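
    For illustration, here's a minimal T-SQL sketch of the batched approach, assuming the file has already been loaded into a staging table. All object and column names here (dbo.TxnStaging, dbo.Transactions, TxnId, and so on) are made up for the example:

    DECLARE @BatchSize int = 1000;   -- tune for your workload

    WHILE 1 = 1
    BEGIN
        BEGIN TRANSACTION;

        -- Move one small batch from staging into the live table.
        -- Note: OUTPUT ... INTO requires the target table to have no
        -- enabled triggers and no foreign-key relationships.
        DELETE TOP (@BatchSize)
        FROM dbo.TxnStaging
        OUTPUT deleted.TxnId, deleted.AccountId, deleted.Amount, deleted.TxnDate
        INTO dbo.Transactions (TxnId, AccountId, Amount, TxnDate);

        IF @@ROWCOUNT = 0
        BEGIN
            COMMIT TRANSACTION;   -- staging is empty, so we're done
            BREAK;
        END;

        COMMIT TRANSACTION;

        -- Short pause between batches so interactive sessions can get in.
        WAITFOR DELAY '00:00:00.100';
    END;

    Keeping each transaction small bounds how long any lock on the live table is held, and the delay gives concurrent API calls a chance to run between batches.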

    Mostly, though, what I have found is that systems support a blend of the two: single-row transactional processing, usually coming from interactive applications (e.g. websites), where the bulk of activity occurs during normal business hours; and multi-row, set-based batches applied by backend processes during night-time hours, so as to conflict with the interactive users as little as possible.
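
    As a sketch of that backend load step (again, the table name and file path are hypothetical), the file might first be bulk-loaded into the staging table, so the live tables see no contention until the batched apply runs:

    BULK INSERT dbo.TxnStaging
    FROM 'D:\Imports\transactions.dat'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        BATCHSIZE       = 50000,   -- commit the staging load in chunks as well
        TABLOCK                    -- bulk-update lock on the quiet staging table
    );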

    There are no special teachers of virtue, because virtue is taught by the whole community.
    --Plato

  • Thanks
