Poor execution plan within insert trigger for UPDATE to base table having primary key on identity column as the number of inserted rows varies

  • I have a table t1 with a primary key on its identity column. Within the insert trigger of t1, an UPDATE statement updates some columns of the newly inserted rows, joining on t1.primarykeycolumn = inserted.primarykeycolumn. As the number of inserted rows starts to creep up, I have noticed 'suboptimal' execution plans.

    I guess the optimizer is referring to the statistics on t1 to arrive at the execution plan. But for newly inserted rows, the statistics page is always going to be stale; after all, the IDENTITY column is always monotonically increasing when SQL Server is supplying the values. To prove that statistics are the 'issue', I issued an UPDATE STATISTICS command as the first statement in the trigger, and the optimizer was then able to come up with a very good plan for a wide variety of row counts. But I certainly cannot issue UPDATE STATISTICS in production code for a mostly OLTP system.

    Most of the time, the number of rows inserted will be in the few tens and only occasionally in the couple of thousands. When the number of rows is in the tens, the execution plan shows only nested loop joins, while it switches to a series of merge joins plus stream aggregates at some point as the row count creeps up. I want to avoid writing convoluted code within the trigger, one part handling large numbers of rows and the other handling smaller numbers. After all, this is what the server is best at doing.

    Is there a way to tell the optimizer 'even though you do not see the inserted values in the statistics, the distribution is going to be exactly like that of the rows inserted before; please come up with the plan based on this assumption'? Any pointers appreciated.

    Edit :

    One more interesting tidbit: if the table t1 has no statistics to start with, then each execution of the trigger produces the correct execution plan for any number of rows, with no need for an 'UPDATE STATISTICS' within the trigger. If the statistics on t1 are up to date, then the trigger execution plan is not optimal when the number of rows inserted is higher. I find this hard to understand. My environment is MS SQL 2008 R2 SP3.

    Edit2 :

    My observation above is correct. I did a few more tests. In the absence of statistics, the optimizer comes up with optimal plans for a very wide range of row counts. It is only when statistics are updated before issuing the inserts (i.e. when statistics are available) that the optimizer starts to refer to them and comes up with 'bad' plans on the joins inside the trigger as the number of rows goes up. Is there a way to tell the optimizer 'ignore whatever is on the statistics page; go do whatever you were doing in the absence of statistics'?
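    [Editor's note: a minimal sketch of the setup described above, so the plan problem is concrete. Table and column names (t1, id, payload, derived) are hypothetical, not from the thread.]

    ```sql
    -- Hypothetical repro: t1 has an IDENTITY primary key, and an
    -- AFTER INSERT trigger updates the freshly inserted rows by
    -- joining back to the base table on that key.
    CREATE TABLE dbo.t1
    (
        id      INT IDENTITY(1,1) PRIMARY KEY,
        payload VARCHAR(50) NOT NULL,
        derived VARCHAR(50) NULL
    );
    GO

    CREATE TRIGGER dbo.trg_t1_ins ON dbo.t1
    AFTER INSERT
    AS
    BEGIN
        SET NOCOUNT ON;

        -- The join on the ascending IDENTITY key is where the
        -- optimizer can misestimate: newly inserted key values lie
        -- beyond the last step of the statistics histogram.
        UPDATE t
        SET    t.derived = UPPER(i.payload)   -- stand-in for the real computation
        FROM   dbo.t1 AS t
        JOIN   inserted AS i
               ON i.id = t.id;
    END;
    GO
    ```

    For what it's worth, the ascending-key misestimation described here is the scenario that trace flags 2389 and 2390 were introduced to address on this era of SQL Server: they let the optimizer model values beyond the last histogram step for columns branded as ascending. They are server-level settings, so test carefully before considering them.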

  • Why can't you use an INSTEAD OF trigger?

    Then you could modify the rows from [inserted] the way you wish, and only after that put them into the table.

    No updates, no joins, no bad plans.

    _____________
    Code for TallyGenerator
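
    [Editor's note: a hedged sketch of the INSTEAD OF approach suggested above. Table and column names are illustrative, not from the thread; the real trigger would compute whatever the original AFTER trigger computed.]

    ```sql
    -- Compute the derived column from [inserted] first, then do a
    -- single INSERT. No update and no join back to the base table,
    -- so the stale-statistics join problem on the IDENTITY key
    -- never arises. The IDENTITY value is still generated by the
    -- actual INSERT below (it is not taken from [inserted]).
    CREATE TRIGGER dbo.trg_t1_instead ON dbo.t1
    INSTEAD OF INSERT
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO dbo.t1 (payload, derived)
        SELECT i.payload,
               UPPER(i.payload)   -- stand-in for the real computation
        FROM   inserted AS i;
    END;
    GO
    ```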

  • Thanks. I had already started doing some tests in this direction, and your recommending the use of INSTEAD OF triggers as a solution gives me some assurance that I am on the right path.

    A quick question: do you know where the 'inserted' rows that are available within the INSTEAD OF trigger reside, in memory or on disk?

  • I'm sorry, maybe I missed something.

    You are inserting a row, and then calling a trigger to update that same row?

    Why doesn't the initial insert simply insert the values you want?

    Michael L John
    If you assassinate a DBA, would you pull a trigger?
    To properly post on a forum:
    http://www.sqlservercentral.com/articles/61537/

  • It is too long a story. The 'need' is to avoid touching the application code and/or the stored procedures doing the inserts, so it is left to the trigger to generate the value for the column.
