The Game is Afoot

  • I like what you've written, Andy. I find it good to know how other people tackle problems like this. For myself, whenever something I'm involved in goes wrong, I always start with "What have I done?" My first assumption is that I've done something wrong, and I start from there. Often I'm correct; it's something that I've written that's wrong. Once I get what I've done wrong fixed, things often improve. What trips me up is when the cause of the problem isn't something that I've done. I'll spend a couple of days looking for what I've done wrong before I finally realize it wasn't me.

    I hope other people respond to this as I'd love to learn how other people approach this.

    Kindest Regards, Rod
    Connect with me on LinkedIn.

  • Test data. Even if you're using live data, reduce the throughput of what is being processed so you can make reasonable assumptions about what should and should not be changing at any point, then review it. Once you have a solution for your test data, expand the throughput until it doesn't work, find the records that didn't work the way you expected, and figure out why they're different from what you have. Rinse and repeat. Sometimes that leads you to assumptions that were incorrect, sometimes not.
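
    Something like the following is the shape of that narrow-then-widen check. The names are made up (dbo.Orders as the live table, dbo.ProcessOrders with a hypothetical @BatchSize parameter as the step under test, 'Processed' as the expected end state); substitute whatever you're actually working with:

        -- Take a small, repeatable slice of the live data first.
        SELECT TOP (1000) o.OrderID, o.Status AS StatusBefore
        INTO   #Sample
        FROM   dbo.Orders AS o
        ORDER BY o.OrderID;

        EXEC dbo.ProcessOrders @BatchSize = 1000;     -- hypothetical parameter to limit throughput

        -- Rows that didn't end up where you expected are the ones worth studying.
        SELECT s.OrderID, s.StatusBefore, o.Status AS StatusAfter
        FROM   #Sample AS s
        JOIN   dbo.Orders AS o
          ON   o.OrderID = s.OrderID
        WHERE  o.Status <> 'Processed';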

    If you're really head-bangingly stuck, escalate it to management and let them know you need someone's assistance. May not even help, but at least you've tried.

    If all else fails, pick up the phone and call Andy 😉

    -------------------------------------------------------------------------------------------------------------------------------------
    Please follow Best Practices For Posting On Forums to receive quicker and higher quality responses

  • I'm glad that someone else still likes SQL Profiler. 😀

    As for that 10-15% CPU thing?  The only time I don't cringe at that is when a large batch job is running. If that's the norm, then Murphy's half brother, "Sum Ting Wong", has been hard at work and needs to be fixed.  Almost always (like 99.999% of the time), it's bad code along with a possible missing or improperly written index.
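
    One generic way to find that bad code (not specific to anything mentioned here) is to rank what's sitting in the plan cache by cumulative CPU, roughly like this:

        -- Top CPU consumers currently in the plan cache (total_worker_time is in microseconds).
        SELECT TOP (20)
               qs.total_worker_time / 1000                        AS total_cpu_ms,
               qs.execution_count,
               (qs.total_worker_time / qs.execution_count) / 1000 AS avg_cpu_ms,
               st.text                                            AS batch_text
        FROM   sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        ORDER BY qs.total_worker_time DESC;

    Whatever floats to the top of that list is usually where the missing index or the bad code is hiding.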

    As it is, we normally have only 8%, and 3% of that is because of an annoying piece of turd-party software, and we're beating on the vendor to make improvements.  The front end makes the same call to the back end 14 million times per hour.  That's not a made-up, over-exaggerated number.  That's how many times the code runs... and since it's not parameterized and has one literal in it that always changes, it recompiles every time on top of everything else.
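
    For anyone who hasn't fought that particular battle: a call that embeds a changing literal gets a fresh plan (and a compile) on every execution, while the parameterized form compiles once and gets reused. A minimal illustration, with a made-up dbo.Documents table standing in for the vendor's schema:

        -- Ad hoc pattern: the literal changes on every call, so every call compiles a new plan.
        EXEC ('SELECT DocStatus FROM dbo.Documents WHERE DocumentId = 42;');

        -- Parameterized pattern: one plan, reused no matter which DocumentId is passed.
        EXEC sys.sp_executesql
             N'SELECT DocStatus FROM dbo.Documents WHERE DocumentId = @DocumentId;',
             N'@DocumentId INT',
             @DocumentId = 42;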

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Thank you all for reading and the comments! Jeff, I think that 14 million beats anything I've seen; that is a lot.

  • Andy Warren - Sunday, September 2, 2018 9:10 AM

    Thank you all for reading and the comments! Jeff, I think that 14 million beats anything I've seen; that is a lot.

    It's the most I've ever seen, for sure.  Since I've been complaining about this 3rd party vendor problem for 3 years, watching nothing get done, and it's now a serious problem that's interfering with our processes, I wrote some proof-of-principle code and sent it to the people that now suddenly care.  It took me 3 iterations, but the code now runs just once per document, does everything it needs in one fairly short stored procedure, and runs literally 960 times faster (125 ms instead of more than 2 minutes per document).

    SQL Server squawked at me about my code (a previously tried-and-true, nasty-fast method with fewer levels, for something slightly different) being too complex to create an execution plan for, or even to parse.  I was gobsmacked because it consisted of nothing but really simple cascading CTEs.
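
    For readers who haven't seen the pattern, "cascading CTEs" means CTEs that each cross join the previous one so the row count multiplies at every level; it's the same general shape as the fnTally function linked in the signature, not the actual code in question here:

        -- Generic cascading-CTE shape: each level cross joins the one before it,
        -- so the row count multiplies (10 -> 100 -> 10,000 -> 100,000,000) with no I/O.
        WITH E1(N) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) AS v(N)),
             E2(N) AS (SELECT 1 FROM E1 a CROSS JOIN E1 b),   --         100 rows
             E4(N) AS (SELECT 1 FROM E2 a CROSS JOIN E2 b),   --      10,000 rows
             E8(N) AS (SELECT 1 FROM E4 a CROSS JOIN E4 b)    -- 100,000,000 rows
        SELECT TOP (1000000)
               ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS N
        FROM   E8;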

    The rewrite to simulate that same code in a manner that SQL Server could actually handle came at a cost.  It took 2-9 seconds to compile and SQL Server saw fit to recompile it every time it was called.  While that is still 13 to 60 times faster than the original 3rd party code, it wasn't good enough for me, especially because of the recompiles.

    I stuck with it and seriously changed tactics.  I ended up with some fairly short code that only takes 125ms per document and doesn't need to recompile even in the face of changing parameters for every document. 
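
    One general way to confirm that sort of behavior (just a standard DMV check, not anything specific to this system) is plan_generation_num in sys.dm_exec_query_stats, which increments each time a cached statement recompiles; here with dbo.ProcessDocument as a placeholder name:

        -- A steady plan_generation_num with a climbing execution_count means the plan
        -- is being reused rather than recompiled on every call.
        SELECT qs.plan_generation_num,
               qs.execution_count,
               qs.creation_time,
               st.text AS statement_batch
        FROM   sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        WHERE  st.objectid = OBJECT_ID(N'dbo.ProcessDocument');  -- placeholder procedure name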

    So, to answer some of your questions from the article....

    While you're thinking about it, think about this - how do you react when the problem is hard? The can't-find-a-place-to-start kind of hard, the hours-and-hours-into-it-and-still-no-progress kind of hard. Do you stress out? Blow up? Give up? Grind through? Or can you smile at a challenge worthy of you, enough to make you exclaim "Watson, the game is afoot!"

    I could have quit on the first iteration when SQL Server told me my code was impossible to parse.  I've been at war with the optimizer for 22 years and I wasn't about to let it win this one.   The optimizer actually works pretty well when you apply liberal doses of pork chop grease to it.  😀

    I could have settled on having a 2-9 second compile time during the second attempt and having the code be 13 to 60 times faster than the original code.  Even setting aside the constant recompiles, that still seemed pitiful to me, especially since I know that we need to double the scale of all of this in the next year or so.  I wouldn't blame someone else for settling, especially since I did it all on my own time yesterday (a Saturday on a holiday weekend).  It's just that this company has been good to me, the people I work with are good folks, and it was simply my turn to turn to.  It's also not my favorite system and I really wanted to do it right the first time so I wouldn't have to revisit it in the future.  (Enter... motivation :P)

    The third try was really tough because I had to scrap 90% of the previous code and start mostly fresh.  I abandoned all previous preconceived notions and went with the "Ok... pretend you've never seen this type of problem before" notion and it worked.  At 960 times faster and all in a relatively small single stored procedure that gets rid of a ton of other code, it was all worth it.

    There's also a sense of pride in craftsmanship that goes into all of this.  To be honest, a lot of people didn't know how to solve the problem (and neither did I until I tried and got some "how it currently works" information from my boss and he's another reason why I like working for this company ;)).  They also scoffed and used the word "impossible" when I told them that I thought it could actually be done all in T-SQL.  Heh... yeah, no... I couldn't let that go unpunished even though providing them with a solution isn't really a punishment. 😛  It's that pride in craftsmanship thing again.  It's kind of like the statue carver that says, "There's a horse in this hunk of jagged rock... I just have to find it". 😀

    To paraphrase to my own liking the motto of the Army Corps of Engineers (which I'm not affiliated with), "The difficult can be done immediately... the impossible takes slightly longer".

    I'll also agree that weeks-long interruptions sometimes make such tasks take a lot longer, and even with really good notes and comments in the code, when you finally do get back to something like this, you end up looking at it all and realizing that you've just become Chief "Werda", Chief of the totally lost "Fuhgowee" tribe. 😀

    --Jeff Moden



  • Jeff, that's a story worth sharing as an article or editorial; you should talk to Steve about it. I think too often people assume that all problems can be solved easily, and they get frustrated or think they aren't good enough. Seeing a pro sweat is, I think, reassuring!

  • Jeff Moden - Sunday, September 2, 2018 4:04 PM

    I could have quit on the first iteration when SQL Server told me my code was impossible to parse.  I've been at war with the optimizer for 22 years and I wasn't about to let it win this one.   The optimizer actually works pretty well when you apply liberal doses of pork chop grease to it.  😀

    You, Sir, are a badass.  I like this mentality.

    -Grant Roles
