Logging table performance

  • I have several processes that need to log what they are doing. Each process will log basic common information plus unique information specific to that process. Each log row will have a unique transaction id.

    I have several choices in how I store the data:

    1) One central table containing all common data, with smaller sub-tables, one per process, to store the unique information.

    2) One small central table containing only the transaction id, with larger sub-tables containing all information about each process.

    3) One table containing all information. In this case some columns would hold no data when they do not pertain to a given process.
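To make the comparison concrete, option 1 might look something like the sketch below. All table and column names here are hypothetical, just to illustrate the common-table-plus-sub-table shape:

```sql
-- Sketch of option 1 (hypothetical names): one central table holding
-- the columns common to every process, plus one sub-table per process
-- keyed on the same transaction id.
CREATE TABLE dbo.LogCommon (
    TransactionId  BIGINT       NOT NULL PRIMARY KEY,
    ProcessName    VARCHAR(50)  NOT NULL,
    LoggedAtUtc    DATETIME2(3) NOT NULL DEFAULT SYSUTCDATETIME()
);

CREATE TABLE dbo.LogProcessA (
    TransactionId  BIGINT NOT NULL PRIMARY KEY
        REFERENCES dbo.LogCommon (TransactionId),
    -- columns unique to this process would go here, e.g.:
    OrderRef       VARCHAR(32) NULL
);
```

Option 2 would be the same layout with most columns pushed down into the sub-tables, and option 3 would collapse everything into a single wide table with nullable process-specific columns.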

    My dilemma is the performance factor. SQL Server may have to contend with several hundred thousand log rows per day, possibly up to 800,000 in a 24-hour period, along with all the other updates, deletes, etc. All this is happening on an Azure SQL server. This is my first foray into the Azure world, so I'm not sure about performance or how much of the SQL Server configuration I can change to help. Can I create multiple tempdb files, for instance?

    Any help with the above dilemma would be greatly appreciated.
