• GilaMonster (6/26/2009)


    Grant Fritchey (6/26/2009)


    I wanted to be sure of the answer before I posted it, but it seems that SQL Server will most often pick the index with the most included columns. I tried disabling indexes, changing the order of columns, and varying the number of columns referenced in the query, but it generally went for whichever index had the most columns in its INCLUDE clause at the time the query was run.

    That's odd; I would have thought it would go for the narrower index, since that would incur fewer IOs.

    Time for some experimentation and maybe a blog post?

    Not a bad idea. I'd need to run more experiments before I'd claim to understand what I saw.

    Every time I ran the query, it picked the index with the widest set of included columns, regardless of the order in which the indexes were created. I didn't check reads or timings.
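
    For anyone who wants to poke at this themselves, here's a rough sketch of the kind of test involved (the table and index names here are invented for illustration, not the actual test schema): two covering indexes on the same key column, one with a narrow INCLUDE list and one with a wide one, plus a query that either index could satisfy. The actual execution plan and the STATISTICS IO output show which index the optimizer picked and what it cost in reads.

    -- Hypothetical repro: table and index names are made up for illustration.
    CREATE TABLE dbo.Orders
    (
        OrderID     INT IDENTITY PRIMARY KEY,
        CustomerID  INT NOT NULL,
        OrderDate   DATETIME NOT NULL,
        Status      TINYINT NOT NULL,
        Amount      MONEY NOT NULL,
        Notes       VARCHAR(200) NULL
    );

    -- Narrow covering index: key column plus one included column
    CREATE INDEX IX_Orders_Customer_Narrow
        ON dbo.Orders (CustomerID)
        INCLUDE (OrderDate);

    -- Wide covering index: same key, more included columns
    CREATE INDEX IX_Orders_Customer_Wide
        ON dbo.Orders (CustomerID)
        INCLUDE (OrderDate, Status, Amount);

    -- Both indexes cover this query; check the actual execution plan and
    -- the STATISTICS IO output to see which index the optimizer chooses.
    SET STATISTICS IO ON;

    SELECT CustomerID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = 42;

    SET STATISTICS IO OFF;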

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning