design/deployment strategy for handling huge transaction tables?

  • Hi All,

    I have a requirement to handle a transaction table receiving 50,000+ records per hour (expected minimum), and I also need to generate summary reports based on these transactions.

    Which database design policies should I follow here to achieve this?

    Do I need to implement archival / table partitioning here?

    Can I have indexes on this table?

    I really appreciate your expert advice on this.

    Thanks

    Regards

  • You absolutely need indexes, but minimize the number of them.

    Partitioning can help if you have some basis for partitioning (times, IDs, something).

    You might consider using a second table that summarizes the data in the primary one and querying off that instead; see the rough sketch at the end of this post.

    It really depends on the requirements for this table: what is it going to store, and what will it be used for?
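
    As a rough sketch of the shape I mean (every name here is a placeholder, not anything from your system):

        -- Narrow, insert-optimised transaction table; clustered on an
        -- ever-increasing key so new rows always go to the end.
        CREATE TABLE dbo.Txn (
            TxnID    bigint IDENTITY(1,1) NOT NULL,
            TxnTime  datetime NOT NULL CONSTRAINT DF_Txn_TxnTime DEFAULT GETDATE(),
            SourceID int NOT NULL,
            Amount   decimal(18,4) NOT NULL,
            CONSTRAINT PK_Txn PRIMARY KEY CLUSTERED (TxnID)
        );

        -- One supporting index for time-ranged reporting; keep the count low,
        -- because every extra index slows the inserts down.
        CREATE NONCLUSTERED INDEX IX_Txn_TxnTime ON dbo.Txn (TxnTime);

        -- Pre-aggregated summary table, maintained by a scheduled job,
        -- so reports never have to scan the detail rows.
        CREATE TABLE dbo.TxnHourlySummary (
            SummaryHour datetime NOT NULL,
            SourceID    int      NOT NULL,
            TxnCount    int      NOT NULL,
            TotalAmount decimal(18,4) NOT NULL,
            CONSTRAINT PK_TxnHourlySummary PRIMARY KEY CLUSTERED (SummaryHour, SourceID)
        );

    The job that loads the summary table is then just an INSERT ... SELECT with a GROUP BY over the most recent hour of detail rows.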

  • What would be the archival policy? Do we need to go for partitioned views?

    Regards

  • What are the business requirements? It is hard to advise unless we know what the application needs to do.

    You have posted about the transaction table. Is this table the master table that gets updated by OLTP transactions, or does it hold only transactions that get applied to a separate master table?

    Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara

  • Hi,

    The transaction table I am referring to is an OLTP table where records get inserted. An ideal scenario would be a tool like Google Analytics, which captures hits and inserts that information into a transaction table; the summary reports are then generated from those transactions. Another such scenario is a mobile service provider capturing every call record, with billing done from that data.

    There could be millions of transactions every day, so the transaction table has to record all of them without becoming a performance bottleneck. I am looking for a physical/logical design for this scenario.
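
    To make it concrete, a rough sketch of the kind of capture table and summary report I mean (all names made up for illustration):

        -- Illustrative web-hit capture table, Google Analytics style.
        CREATE TABLE dbo.Hit (
            HitID     bigint IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
            HitTime   datetime NOT NULL DEFAULT GETDATE(),
            PageID    int      NOT NULL,
            VisitorID int      NOT NULL
        );

        -- The kind of summary report that runs on top of it: hits per page per day.
        SELECT PageID,
               CONVERT(char(10), HitTime, 120) AS HitDate,
               COUNT(*) AS Hits
        FROM   dbo.Hit
        GROUP  BY PageID, CONVERT(char(10), HitTime, 120);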

    Please let me know if you need any more info to help me out

    Thanks for your time

  • You need to take a hard look at the business requirements and design your system accordingly.

    There is a very big difference between designing a system to cope with a peak of 50,000 inserts an hour and designing one for a steady load of many millions of inserts per day: 50,000 an hour is under 14 inserts a second, while 5 million a day averages nearly 60 a second, around the clock.

    If you really need to cope with many millions of inserts per day, then you will be at the leading edge of what SQL Server and Windows can cope with, though within the comfortable limits of a (very) few other DBMS systems. On the other hand, a peak load of 50,000 in an hour with a daily average of 500k inserts is easily within SQL Server's capabilities.

    If you over-design this system you will spend many hundreds of thousands, or even millions, of dollars more than needed. If you under-design it you will not meet your business needs. So tell us what the business really needs and you could get some really good advice.
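
    If the volume really does head towards millions of rows a day, a date-partitioned table with a sliding window is the usual archival pattern in SQL Server 2005 and later. A rough sketch, with made-up names and boundary dates:

        -- Monthly partition function and scheme; a new boundary is added
        -- each month and the oldest one merged away after archival.
        CREATE PARTITION FUNCTION pfTxnMonth (datetime)
        AS RANGE RIGHT FOR VALUES ('2021-01-01', '2021-02-01', '2021-03-01');

        CREATE PARTITION SCHEME psTxnMonth
        AS PARTITION pfTxnMonth ALL TO ([PRIMARY]);

        CREATE TABLE dbo.TxnPartitioned (
            TxnID   bigint   NOT NULL,
            TxnTime datetime NOT NULL,
            Amount  decimal(18,4) NOT NULL,
            CONSTRAINT PK_TxnPartitioned PRIMARY KEY CLUSTERED (TxnTime, TxnID)
        ) ON psTxnMonth (TxnTime);

        -- Archival becomes a metadata-only switch instead of mass DELETEs
        -- (dbo.TxnArchive must have an identical structure on the same filegroup):
        -- ALTER TABLE dbo.TxnPartitioned SWITCH PARTITION 1 TO dbo.TxnArchive;
        -- ALTER PARTITION FUNCTION pfTxnMonth() MERGE RANGE ('2021-01-01');

    Whether you need any of that, though, comes back to the real numbers.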

    Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara
