Is there a way to quantify what extra overhead that TDE might be causing?

  • I am being asked to see if I can quantify the increase in CPU load that TDE is causing on my server.

    Is there a way to do this?

    e.g. with perfmon or something else.

    I am also being asked to capture CPU % usage over time.

    I'm sure there is a way to do this, but I'm not familiar enough with SQL to know how to set that up.

  • yakko_Warner (12/1/2015)


    I am being asked to see if I can quantify the increase in CPU load that TDE is causing on my server.

    Is there a way to do this?

    e.g. with perfmon or something else.

    I am also being asked to capture CPU % usage over time.

    I'm sure there is a way to do this, but I'm not familiar enough with SQL to know how to set that up.

    With perfmon, you can capture the information to a file. You'd probably want to capture a couple of different CPU utilization numbers, perhaps one per core and the "total" value. Do this for a week or so before turning on TDE, then again for the same amount of time after.

    Bear in mind, the results will still need to be interpreted, simply because the workloads over the two periods you record aren't likely to be the same.

    Also watch how frequently you have perfmon record the data: every second and you're going to have a monstrous file to look over; every 30 minutes would be fairly useless for anything...

  • Perfmon files are pretty small. I usually use the default of 15 seconds, run for a day or two, and end up with a file in the tens of megabytes.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Well, that would have been a good thought, but we have already turned it on and turned off the old server.

    :crying:

    Also, are there SSIS packages or some other sort of stats-collection jobs out there that can track CPU usage?

  • For CPU usage, use perfmon. It can track per core, for a specific process or overall.

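    If you can't leave perfmon running, SQL Server itself keeps roughly the last 256 minutes of per-minute CPU snapshots in the scheduler-monitor ring buffer. A sketch of the commonly used query against it (the XML paths follow the ring buffer's documented record format; adjust TOP for how much history you want):

    ```sql
    -- Per-minute CPU history from the scheduler-monitor ring buffer:
    -- SQL Server's own CPU % and system idle %; the remainder is other processes.
    DECLARE @ts_now bigint =
        (SELECT cpu_ticks / (cpu_ticks / ms_ticks) FROM sys.dm_os_sys_info);

    SELECT TOP (60)
        DATEADD(ms, -1 * (@ts_now - [timestamp]), GETDATE()) AS EventTime,
        record.value('(./Record/SchedulerMonitorEvent/SystemHealth/ProcessUtilization)[1]', 'int') AS SQLServerCPU,
        record.value('(./Record/SchedulerMonitorEvent/SystemHealth/SystemIdle)[1]', 'int') AS SystemIdle
    FROM (
        SELECT [timestamp], CONVERT(xml, record) AS record
        FROM sys.dm_os_ring_buffers
        WHERE ring_buffer_type = N'RING_BUFFER_SCHEDULER_MONITOR'
          AND record LIKE N'%<SystemHealth>%'
    ) AS rb
    ORDER BY EventTime DESC;
    ```

    Note this only shows total SQL Server CPU; like perfmon, it can't separate out the TDE share.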
  • Okay, but there isn't a way to track just the TDE cost; it's just straight CPU %?

    I presume just CPU %, without getting to peek at the TDE portion of the transactions.

  • yakko_Warner (12/1/2015)


    Okay, but there isn't a way to track just the TDE cost; it's just straight CPU %?

    I presume just CPU %, without getting to peek at the TDE portion of the transactions.

    Remember, TDE is "always on," there's no portion of a transaction that relates to TDE. Because it encrypts the data "at rest," it's the actual file on disk (the MDF / LDF / NDF) that's encrypted.

    Really, if you've already encrypted the DBs with TDE, there's no easy way to check how it's impacting your CPU utilization, short of decrypting the DBs, monitoring for a while, then re-encrypting.
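    If you do go the decrypt, monitor, re-encrypt route, the switch itself is only a couple of statements (a sketch; [MyDatabase] is a placeholder, and both directions trigger a background scan of the whole database, so plan it into a maintenance window):

    ```sql
    -- Turn TDE off for the measurement window.
    -- A background decryption scan starts; the data isn't fully
    -- unencrypted until that scan completes.
    ALTER DATABASE [MyDatabase] SET ENCRYPTION OFF;

    -- ... capture CPU over a representative period, then re-encrypt ...
    ALTER DATABASE [MyDatabase] SET ENCRYPTION ON;
    ```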

  • If you want to know the overhead of TDE, you can use (Distributed) Replay. That is, first capture a reasonable amount of workload with a trace. Then set up a test machine where you run the workload multiple times with the help of Replay. For some of the runs you have TDE enabled on the database, and for some not. Measure CPU as suggested by Gail and others in the thread.

    I have never used Replay myself, so I cannot give the exact details. But it goes without saying that running these tests will keep you busy for a day or two.

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

  • I think you will discover more noise than difference, but I could be wrong :-D. My systems run CPUs hot and use TDE, but I am certain my systems' CPU constraints are rooted in much lower-hanging fruit (such as execution plans and execution rates).

    It is extremely easy to decrypt a database and check its encryption status (depending upon size, a decryption can take hours). The RML Utilities and ostress can also simulate concurrent activity.
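    The encryption status, and the scan's progress while an encryption or decryption is running, is visible in a DMV; a quick sketch:

    ```sql
    -- encryption_state: 1 = unencrypted, 2 = encryption in progress,
    -- 3 = encrypted, 5 = decryption in progress
    SELECT DB_NAME(database_id) AS DatabaseName,
           encryption_state,
           percent_complete
    FROM sys.dm_database_encryption_keys;
    ```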

  • Bit late to the party here, but:

    • you can track read and write I/O per database file using the DMV "sys.dm_io_virtual_file_stats"
    • you can get CPU utilisation by database using "sys.dm_exec_query_stats" and "sys.dm_exec_plan_attributes"

    Check Glenn Berry's queries for more info
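    The second bullet is essentially the well-known CPU-by-database query from Glenn Berry's diagnostic set; a sketch (note it only sees plans still in cache, so it undercounts):

    ```sql
    -- Approximate CPU consumed per database, from cached query plans only.
    -- total_worker_time is in microseconds.
    WITH DB_CPU_Stats AS (
        SELECT pa.DatabaseID,
               SUM(qs.total_worker_time) AS CPU_Time_us
        FROM sys.dm_exec_query_stats AS qs
        CROSS APPLY (
            SELECT CONVERT(int, value) AS DatabaseID
            FROM sys.dm_exec_plan_attributes(qs.plan_handle)
            WHERE attribute = N'dbid'
        ) AS pa
        GROUP BY pa.DatabaseID
    )
    SELECT DB_NAME(DatabaseID) AS DatabaseName,
           CPU_Time_us / 1000 AS CPU_Time_ms,
           CAST(CPU_Time_us * 100.0 / SUM(CPU_Time_us) OVER () AS decimal(5, 2)) AS CPU_Percent
    FROM DB_CPU_Stats
    ORDER BY CPU_Time_ms DESC;
    ```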

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉

  • To quantify the overhead cost of TDE, one approach is to run a test workload, something like replaying a profiler trace, against both a TDE version of your database and a non-TDE version. To make this more manageable, perhaps create a database called [TestNonTDE] and copy in a manageable subset of tables (with indexes and statistics) that will cover your test workload; this database will remain unencrypted. Next, back up this test database, restore it under the name [TestTDE], and then set up TDE on that database. Since you're comparing baked apples to raw apples, you'll want to host the test databases on the same instance and storage (not necessarily production), and run the workload off-hours when there is no other activity on the test system. Also consider restarting the MSSQL service before each test workload to clear the buffers and reset DMV stats.
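    The restore-and-encrypt half of that setup is only a few statements. A sketch, where the database names, file paths, logical file names, and the server certificate [TDETestCert] are all placeholders, and the certificate (and the master key protecting it) must already exist in master:

    ```sql
    -- Clone the unencrypted test database under a new name...
    BACKUP DATABASE [TestNonTDE] TO DISK = N'C:\Backups\TestNonTDE.bak';
    RESTORE DATABASE [TestTDE]
        FROM DISK = N'C:\Backups\TestNonTDE.bak'
        WITH MOVE N'TestNonTDE'     TO N'C:\Data\TestTDE.mdf',
             MOVE N'TestNonTDE_log' TO N'C:\Data\TestTDE_log.ldf';

    -- ...then encrypt the copy.
    USE [TestTDE];
    CREATE DATABASE ENCRYPTION KEY
        WITH ALGORITHM = AES_256
        ENCRYPTION BY SERVER CERTIFICATE [TDETestCert];
    ALTER DATABASE [TestTDE] SET ENCRYPTION ON;
    ```

    Wait for the encryption scan to finish before starting the timed runs, so the background scan's own CPU cost doesn't pollute the comparison.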

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
