Are the posted questions getting worse?

  • Brandie Tarvin (12/17/2014)


    Brandie Tarvin (12/16/2014)


    WHOO! #HappySnoopyDance

    I just spent 2 1/2 days reworking a package and its supporting tables / stored procedures. Previous execution time: between 8 and 12 hours.

    Current execution time ... < 5 minutes.

    "Celebration Time, come on!"

    Sigh. So I presented coworker with my changes and asked for a peer review. Partially to make sure I didn't miss anything, partially to give coworker a learning opportunity since coworker was the one who designed the package. There were 7 files / queries involved and lots of tables. The response I got when I told coworker was: "You redesigned all the queries? But only the one was giving us problems!"

    Me, I'm looking at the package and seeing opportunities for cleaning up extra variables, cutting the number of containers in half by moving File System Tasks from one container to another and deleting the excess, and stripping out code and variables that were built in the initial development and never used. Also, I'm seeing 7 procs where excruciating RBAR WHILE loops are slowing down processing (see the sketch just after this post). Granted, 6 of those procs were not having issues because they usually work on files that are pretty small. But that doesn't mean they won't cause a problem in the future. After all, the one problem we did have wasn't a problem when this process was first put into place.

    And to be fair, this package was the first major ETL process designed by my coworker. Coworker had a huge learning curve, not only for SSIS but for the 7+ files that were being sent from a different system. Using the code I got from the forums here, he redesigned one file into a separate package that worked effectively. The other package (with the other files) was left alone.

    But still... To my mind, since I'm already fixing one part of the package, why shouldn't I proactively fix the rest of it to make sure the other 6 procs and ETL paths do NOT become a problem down the road?

    Or am I just being too efficient?

    I do the same.

    Small bug in a T-SQL script?

    Whoops, I reformatted all of the code and did some performance tuning 😀

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP
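
    For anyone who hasn't run into the RBAR term before, here is a minimal before-and-after sketch of the kind of rewrite being described. The table and column names (dbo.StagingOrders, dbo.Orders, OrderTotal) are invented for illustration only; they are not from Brandie's actual package.

        -- RBAR version (hypothetical names): touch the staging rows one at a time in a WHILE loop.
        DECLARE @ID int;

        SELECT @ID = MIN(StagingID) FROM dbo.StagingOrders;

        WHILE @ID IS NOT NULL
        BEGIN
            UPDATE o
            SET    OrderTotal = s.OrderTotal
            FROM   dbo.Orders o
            JOIN   dbo.StagingOrders s ON s.OrderNumber = o.OrderNumber
            WHERE  s.StagingID = @ID;

            SELECT @ID = MIN(StagingID) FROM dbo.StagingOrders WHERE StagingID > @ID;
        END;

        -- Set-based version: one statement handles every matching row at once,
        -- which is the kind of change that turns hours of looping into minutes.
        UPDATE o
        SET    OrderTotal = s.OrderTotal
        FROM   dbo.Orders o
        JOIN   dbo.StagingOrders s ON s.OrderNumber = o.OrderNumber;

    The loop issues one UPDATE per staging row; the set-based statement lets the optimizer join and update everything in a single pass.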

  • That makes me feel somewhat better. Thanks, Koen.

    Brandie Tarvin, MCITP Database Administrator. LiveJournal Blog: http://brandietarvin.livejournal.com/. On LinkedIn!, Google+, and Twitter. Freelance Writer: Shadowrun. Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

  • BTW, can anyone with Visual Studio coding experience take a look at my question here?

    I've been fighting this on and off for several months now and just can't seem to find the information I'm looking for.

    Brandie Tarvin, MCITP Database Administrator. LiveJournal Blog: http://brandietarvin.livejournal.com/. On LinkedIn!, Google+, and Twitter. Freelance Writer: Shadowrun. Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

  • Koen Verbeeck (12/17/2014)


    Brandie Tarvin (12/17/2014)


    Brandie Tarvin (12/16/2014)


    WHOO! #HappySnoopyDance

    I just spent 2 1/2 days reworking a package and its supporting tables / stored procedures. Previous execution time: between 8 and 12 hours.

    Current execution time ... < 5 minutes.

    "Celebration Time, come on!"

    Sigh. So I presented coworker with my changes and asked for a peer review. Partially to make sure I didn't miss anything, partially to give coworker a learning opportunity since coworker was the one who designed the package. There were 7 files / queries involved and lots of tables. The response I got when I told coworker was: "You redesigned all the queries? But only the one was giving us problems!"

    Me, I'm looking at the package and seeing opportunities for cleaning up extra variables, cutting the number of containers in half by moving File System Tasks from one container to another and deleting the excess, and stripping out code and variables that were built in the initial development and never used. Also, I'm seeing 7 procs where excruciating RBAR WHILE loops are slowing down processing. Granted, 6 of those procs were not having issues because they usually work on files that are pretty small. But that doesn't mean they won't cause a problem in the future. After all, the one problem we did have wasn't a problem when this process was first put into place.

    And to be fair, this package was the first major ETL process designed by my coworker. Coworker had a huge learning curve, not only for SSIS but for the 7+ files that were being sent from a different system. Using the code I got from the forums here, he redesigned one file into a separate package that worked effectively. The other package (with the other files) was left alone.

    But still... To my mind, since I'm already fixing one part of the package, why shouldn't I proactively fix the rest of it to make sure the other 6 procs and ETL paths do NOT become a problem down the road?

    Or am I just being too efficient?

    I do the same.

    Small bug in a T-SQL script?

    Whoops, I reformatted all of the code and did some performance tuning 😀

    +1. Isn't that normal? When you have a problematic process, you should consider the whole process and tune it. Making it handle larger sets more efficiently is always a good thing. So it handles a small set now... that doesn't mean it won't get bigger. In general, data tends to grow over time, so why not design and code properly in the first place?

  • Brandie Tarvin (12/17/2014)


    Brandie Tarvin (12/16/2014)


    WHOO! #HappySnoopyDance

    I just spent 2 1/2 days reworking a package and its supporting tables / stored procedures. Previous execution time: between 8 and 12 hours.

    Current execution time ... < 5 minutes.

    "Celebration Time, come on!"

    Sigh. So I presented coworker with my changes and asked for a peer review. Partially to make sure I didn't miss anything, partially to give coworker a learning opportunity since coworker was the one who designed the package. There were 7 files / queries involved and lots of tables. The response I got when I told coworker was: "You redesigned all the queries? But only the one was giving us problems!"

    Me, I'm looking at the package and seeing opportunities for cleaning up extra variables, cutting the number of containers in half by moving File System Tasks from one container to another and deleting the excess, and stripping out code and variables that were built in the initial development and never used. Also, I'm seeing 7 procs where excruciating RBAR WHILE loops are slowing down processing. Granted, 6 of those procs were not having issues because they usually work on files that are pretty small. But that doesn't mean they won't cause a problem in the future. After all, the one problem we did have wasn't a problem when this process was first put into place.

    And to be fair, this package was the first major ETL process designed by my coworker. Coworker had a huge learning curve, not only for SSIS but for the 7+ files that were being sent from a different system. Using the code I got from the forums here, he redesigned one file into a separate package that worked effectively. The other package (with the other files) was left alone.

    But still... To my mind, since I'm already fixing one part of the package, why shouldn't I proactively fix the rest of it to make sure the other 6 procs and ETL paths do NOT become a problem down the road?

    Or am I just being too efficient?

    I think what you did was the professional and appropriate thing, without a shadow of a doubt. The files may be small *usually*, but what about the times they're not?

    I'm a DBA.
    I'm not paid to solve problems. I'm paid to prevent them.

  • Brandie Tarvin (12/17/2014)


    Brandie Tarvin (12/16/2014)


    WHOO! #HappySnoopyDance

    I just spent 2 1/2 days reworking a package and its supporting tables / stored procedures. Previous execution time: between 8 and 12 hours.

    Current execution time ... < 5 minutes.

    "Celebration Time, come on!"

    Sigh. So I presented coworker with my changes and asked for a peer review. Partially to make sure I didn't miss anything, partially to give coworker a learning opportunity since coworker was the one who designed the package. There were 7 files / queries involved and lots of tables. The response I got when I told coworker was: "You redesigned all the queries? But only the one was giving us problems!"

    Me, I'm looking at the package and seeing opportunities for cleaning up extra variables, cutting the number of containers in half by moving File System Tasks from one container to another and deleting the excess, and stripping out code and variables that were built in the initial development and never used. Also, I'm seeing 7 procs where excruciating RBAR WHILE loops are slowing down processing. Granted, 6 of those procs were not having issues because they usually work on files that are pretty small. But that doesn't mean they won't cause a problem in the future. After all, the one problem we did have wasn't a problem when this process was first put into place.

    And to be fair, this package was the first major ETL process designed by my coworker. Coworker had a huge learning curve, not only for SSIS but for the 7+ files that were being sent from a different system. Using the code I got from the forums here, he redesigned one file into a separate package that worked effectively. The other package (with the other files) was left alone.

    But still... To my mind, since I'm already fixing one part of the package, why shouldn't I proactively fix the rest of it to make sure the other 6 procs and ETL paths do NOT become a problem down the road?

    Or am I just being too efficient?

    Doing all of them at once while everything is fresh in your mind is the smart thing to do, especially since you said you modified the underlying tables.

    And "Isn't giving us problems" isn't the same as "fast". You probably saved yourself headaches in the future by getting this right now.

    --------------------------------------
    When you encounter a problem, if the solution isn't readily evident go back to the start and check your assumptions.
    --------------------------------------
    It’s unpleasantly like being drunk.
    What’s so unpleasant about being drunk?
    You ask a glass of water. -- Douglas Adams

  • ... Mark one off, 73 days on the calendar to go. 73 days on the calendar to go, 73 days to go, ...

  • Lynn Pettis (12/17/2014)


    ... Mark one off, 73 days on the calendar to go. 73 days on the calendar to go, 73 days to go, ...

    What is the big news? What is the countdown for? 🙂


    Sujeet Singh

  • Divine Flame (12/17/2014)


    Lynn Pettis (12/17/2014)


    ... Mark one off, 73 days on the calendar to go. 73 days on the calendar to go, 73 days to go, ...

    What is the big news? What is the countdown for? 🙂

    I believe this is how many days he has left on his Afghanistan contract.

    Brandie Tarvin, MCITP Database Administrator. LiveJournal Blog: http://brandietarvin.livejournal.com/. On LinkedIn!, Google+, and Twitter. Freelance Writer: Shadowrun. Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

  • Brandie Tarvin (12/17/2014)


    Divine Flame (12/17/2014)


    Lynn Pettis (12/17/2014)


    ... Mark one off, 73 days on the calendar to go. 73 days on the calendar to go, 73 days to go, ...

    What is the big news? What is the countdown for? 🙂

    I believe this is how many days he has left on his Afghanistan contract.

    Oh, getting back home 🙂 That sure is something to count down to 🙂


    Sujeet Singh

  • Divine Flame (12/17/2014)


    Brandie Tarvin (12/17/2014)


    Divine Flame (12/17/2014)


    Lynn Pettis (12/17/2014)


    ... Mark one off, 73 days on the calendar to go. 73 days on the calendar to go, 73 days to go, ...

    What is the big news? What is the countdown for? 🙂

    I believe this is how many days he has left on his Afghanistan contract.

    Oh, getting back home 🙂 That sure is something to count down to 🙂

    Yes, I am coming home from Afghanistan. Even better, I still have a job with my company when I get home.

  • Koen Verbeeck (12/17/2014)


    Brandie Tarvin (12/17/2014)


    Brandie Tarvin (12/16/2014)


    WHOO! #HappySnoopyDance

    I just spent 2 1/2 days reworking a package and its supporting tables / stored procedures. Previous execution time: between 8 and 12 hours.

    Current execution time ... < 5 minutes.

    "Celebration Time, come on!"

    Sigh. So I presented coworker with my changes and asked for a peer review. Partially to make sure I didn't miss anything, partially to give coworker a learning opportunity since coworker was the one who designed the package. There were 7 files / queries involved and lots of tables. The response I got when I told coworker was: "You redesigned all the queries? But only the one was giving us problems!"

    Me, I'm looking at the package and seeing opportunities for cleaning up extra variables, cutting the number of containers in half by moving File System Tasks from one container to another and deleting the excess, and stripping out code and variables that were built in the initial development and never used. Also, I'm seeing 7 procs where excruciating RBAR WHILE loops are slowing down processing. Granted, 6 of those procs were not having issues because they usually work on files that are pretty small. But that doesn't mean they won't cause a problem in the future. After all, the one problem we did have wasn't a problem when this process was first put into place.

    And to be fair, this package was the first major ETL process designed by my coworker. Coworker had a huge learning curve, not only for SSIS but for the 7+ files that were being sent from a different system. Using the code I got from the forums here, he redesigned one file into a separate package that worked effectively. The other package (with the other files) was left alone.

    But still... To my mind, since I'm already fixing one part of the package, why shouldn't I proactively fix the rest of it to make sure the other 6 procs and ETL paths do NOT become a problem down the road?

    Or am I just being too efficient?

    I do the same.

    Small bug in a T-SQL script?

    Whoops, I reformatted all of the code and did some performance tuning 😀

    I'm the same way. Just understand that it IS the co-worker's work and since everything was rewritten, that co-worker is feeling seriously wounded.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
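
    Since the fnTally link comes up here: a rough sketch of what a tally (numbers) table buys you, written inline with generic cascading CTEs rather than Jeff's actual function. It generates a run of sequential integers with no loop at all, which is usually the first step in replacing a WHILE loop with set-based code.

        -- A minimal inline tally sketch (not Jeff's fnTally itself):
        WITH
          E1(N)    AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) AS v(N)), -- 10 rows
          E2(N)    AS (SELECT 1 FROM E1 a CROSS JOIN E1 b),                                     -- 100 rows
          Tally(N) AS (SELECT TOP (1000)
                              ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
                       FROM   E2 a CROSS JOIN E2 b)                                             -- up to 10,000 rows, capped at 1,000
        SELECT DATEADD(DAY, N - 1, '20141217') AS CalendarDate
        FROM   Tally;  -- 1,000 consecutive dates, no WHILE loop required

    Joining a row against a tally like this (to drive date ranges, string splitting, gap detection, and so on) is how most of those WHILE loops end up disappearing.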

  • Jeff Moden (12/17/2014)


    Koen Verbeeck (12/17/2014)


    Brandie Tarvin (12/17/2014)


    Brandie Tarvin (12/16/2014)


    WHOO! #HappySnoopyDance

    I just spent 2 1/2 days reworking a package and its supporting tables / stored procedures. Previous execution time: between 8 and 12 hours.

    Current execution time ... < 5 minutes.

    "Celebration Time, come on!"

    Sigh. So I presented coworker with my changes and asked for a peer review. Partially to make sure I didn't miss anything, partially to give coworker a learning opportunity since coworker was the one who designed the package. There were 7 files / queries involved and lots of tables. The response I got when I told coworker was: "You redesigned all the queries? But only the one was giving us problems!"

    Me, I'm looking at the package and seeing opportunities for cleaning up extra variables, cutting the number of containers in half by moving File System Tasks from one container to another and deleting the excess, and stripping out code and variables that were built in the initial development and never used. Also, I'm seeing 7 procs where excruciating RBAR WHILE loops are slowing down processing. Granted, 6 of those procs were not having issues because they usually work on files that are pretty small. But that doesn't mean they won't cause a problem in the future. After all, the one problem we did have wasn't a problem when this process was first put into place.

    And to be fair, this package was the first major ETL process designed by my coworker. Coworker had a huge learning curve, not only for SSIS but for the 7+ files that were being sent from a different system. Using the code I got from the forums here, he redesigned one file into a separate package that worked effectively. The other package (with the other files) was left alone.

    But still... To my mind, since I'm already fixing one part of the package, why shouldn't I proactively fix the rest of it to make sure the other 6 procs and ETL paths do NOT become a problem down the road?

    Or am I just being too efficient?

    I do the same.

    Small bug in a T-SQL script?

    Whoops, I reformatted all of the code and did some performance tuning 😀

    I'm the same way. Just understand that it IS the co-worker's work and since everything was rewritten, that co-worker is feeling seriously wounded.

    And that PC stuff is a skill that I haven't yet mastered. "Yeah, your stuff sucked, so I used the opportunity and my expertise to give it a tune-up."

    Wayne
    Microsoft Certified Master: SQL Server 2008
    Author - SQL Server T-SQL Recipes


    If you can't explain to another person how the code that you're copying from the internet works, then DON'T USE IT on a production system! After all, you will be the one supporting it!
    Links:
    For better assistance in answering your questions
    Performance Problems
    Common date/time routines
    Understanding and Using APPLY Part 1 & Part 2

  • WayneS (12/17/2014)


    Jeff Moden (12/17/2014)


    Koen Verbeeck (12/17/2014)


    Brandie Tarvin (12/17/2014)


    Brandie Tarvin (12/16/2014)


    WHOO! #HappySnoopyDance

    I just spent 2 1/2 days reworking a package and its supporting tables / stored procedures. Previous execution time: between 8 and 12 hours.

    Current execution time ... < 5 minutes.

    "Celebration Time, come on!"

    Sigh. So I presented coworker with my changes and asked for a peer review. Partially to make sure I didn't miss anything, partially to give coworker a learning opportunity since coworker was the one who designed the package. There were 7 files / queries involved and lots of tables. The response I got when I told coworker was: "You redesigned all the queries? But only the one was giving us problems!"

    Me, I'm looking at the package and seeing opportunities for cleaning up extra variables, cutting the number of containers in half by moving File System Tasks from one container to another and deleting the excess, and stripping out code and variables that were built in the initial development and never used. Also, I'm seeing 7 procs where excruciating RBAR WHILE loops are slowing down processing. Granted, 6 of those procs were not having issues because they usually work on files that are pretty small. But that doesn't mean they won't cause a problem in the future. After all, the one problem we did have wasn't a problem when this process was first put into place.

    And to be fair, this package was the first major ETL process designed by my coworker. Coworker had a huge learning curve, not only for SSIS but for the 7+ files that were being sent from a different system. Using the code I got from the forums here, he redesigned one file into a separate package that worked effectively. The other package (with the other files) was left alone.

    But still... To my mind, since I'm already fixing one part of the package, why shouldn't I proactively fix the rest of it to make sure the other 6 procs and ETL paths do NOT become a problem down the road?

    Or am I just being too efficient?

    I do the same.

    Small bug in a T-SQL script?

    Whoops, I reformatted all of the code and did some performance tuning 😀

    I'm the same way. Just understand that it IS the co-worker's work and since everything was rewritten, that co-worker is feeling seriously wounded.

    And that PC stuff is a skill that I haven't yet mastered. "Yeah, your stuff sucked, so I used the opportunity and my expertise to give it a tune-up."

    Heck, I'm more diplomatic than that, and that's saying something.

    But I agree, Brandie, you didn't do anything wrong.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • WayneS (12/17/2014)


    Jeff Moden (12/17/2014)


    Koen Verbeeck (12/17/2014)


    Brandie Tarvin (12/17/2014)


    Brandie Tarvin (12/16/2014)


    WHOO! #HappySnoopyDance

    I just spent 2 1/2 days reworking a package and its supporting tables / stored procedures. Previous execution time: between 8 and 12 hours.

    Current execution time ... < 5 minutes.

    "Celebration Time, come on!"

    Sigh. So I presented coworker with my changes and asked for a peer review. Partially to make sure I didn't miss anything, partially to give coworker a learning opportunity since coworker was the one who designed the package. There were 7 files / queries involved and lots of tables. The response I got when I told coworker was: "You redesigned all the queries? But only the one was giving us problems!"

    Me, I'm looking at the package and seeing opportunities for cleaning up extra variables, cutting the number of containers in half by moving File System Tasks from one container to another and deleting the excess, and stripping out code and variables that were built in the initial development and never used. Also, I'm seeing 7 procs where excruciating RBAR WHILE loops are slowing down processing. Granted, 6 of those procs were not having issues because they usually work on files that are pretty small. But that doesn't mean they won't cause a problem in the future. After all, the one problem we did have wasn't a problem when this process was first put into place.

    And to be fair, this package was the first major ETL process designed by my coworker. Coworker had a huge learning curve, not only for SSIS but for the 7+ files that were being sent from a different system. Using the code I got from the forums here, he redesigned one file into a separate package that worked effectively. The other package (with the other files) was left alone.

    But still... To my mind, since I'm already fixing one part of the package, why shouldn't I proactively fix the rest of it to make sure the other 6 procs and ETL paths do NOT become a problem down the road?

    Or am I just being too efficient?

    I do the same.

    Small bug in a T-SQL script?

    Whoops, I reformatted all of the code and did some performance tuning 😀

    I'm the same way. Just understand that it IS the co-worker's work and since everything was rewritten, that co-worker is feeling seriously wounded.

    And that PC stuff is a skill that I haven't yet mastered. "Yeah, your stuff sucked, so I used the opportunity and my expertise to give it a tune-up."

    I wouldn't say it sucked. It was a damn good first try. The process had been in production for almost 2 years before it started having issues. Then our boss told us it needed to be optimized. We were both working on other projects at the time and this fell to the bottom of the list. I just happened to get to it before coworker did.

    But yes, hurt feelings I get. I've had my boss call me over to his desk because my code forgot something or did something it wasn't supposed to do.

    Brandie Tarvin, MCITP Database Administrator. LiveJournal Blog: http://brandietarvin.livejournal.com/. On LinkedIn!, Google+, and Twitter. Freelance Writer: Shadowrun. Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.
