Continuous Delivery In Real Life

  • jckfla (7/16/2015)


    If consumers would more readily reject what isn't exactly what they need or what doesn't work right, manufacturers would have no option but to make sure quality and maturity are there before releasing their wares.

    Hard to do that from the consumer's side, though, because of brand loyalty, or because the limited options available to you are the only ones similar to the game you want to play. It's the same in my area, with only one ISP really available. We have no choice but to suck it up and give them our money.

    But yes, as you mentioned before, not having to develop your own game engine these days, and reusing proven technology as the shell for your project as Microsoft surely does, will speed up the production pipeline. I forgot to mention code reuse in my last post, but that's largely why Microsoft and many others can be as fast as they are: they're mostly making sequels, not building anything from scratch.

    Even if they did start from scratch, they have access to good reference points that others may not have.

  • Steve Jones - SSC Editor (7/16/2015)


    CI/CD also doesn't mean no QA. Your CI builds that pass automated tests should then be available for QA people to run further tests, whether complex Selenium (or other) tests or manual checks. You need to be sure that you aren't shipping something that doesn't work.

    It's a good idea, but like anything, some will abuse it to try and make a few $$.

    Those examples I gave had QA too. It's not a problem of not having QA. It's the problem of QA keeping up when the wheels are turning. CI/CD is nothing but a big wheel constantly turning and churning out results in a seamless environment. It's very easy to get run over if you're moving too fast. 😛

  • Steve Jones - SSC Editor (7/16/2015)


    Eric M Russell (7/16/2015)


    Where I've been working for the past few years, we develop ETL and other supporting applications for a data warehouse. The source data originates from about 200 clients, the ingest files are not all standardized, and there are client-specific metrics, programming, and custom reporting. Therefore, an XP development lifecycle and Continuous Delivery are not only routine but required. We just can't bundle all our various micro-projects into one deliverable. Yet at any given moment they all have to be integrated, because a small change for one client can potentially break the entire ETL process for all clients if it isn't planned, coded, and tested properly.

    Testing method? tSQLt, Selenium, MS Unit Test framework, something else?

    Most of the client-specific changes I do are so small in scope they wouldn't qualify as a "project" by themselves. For the purposes of timesheets and capitalization, the change will fall under the umbrella of a budgeted project code, but these tasks are coded, tested, and deployed individually.

    I'm not familiar with unit testing frameworks; I typically do input/output unit testing in the development environment while I'm coding. The header for each stored procedure contains comments and sample calls for the purpose of unit testing.

    Typically we perform integration testing for a unit of deliverable work by deploying to a stable QA environment that emulates production, and then we run ingest files for that client end-to-end, from the initial drop folder all the way to the QA version of the data warehouse. A dashboard application monitors each step in the process, identifying failed steps or standard deviations in the dataset. The same process is repeated for a couple of other unrelated clients that, because of their complexity and custom coding, are known to be sensitive to changes.
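
    As a rough illustration of that practice, here is a minimal sketch of what such a stored procedure header might look like; the procedure, table, and parameter names are hypothetical, not Eric's actual code:

        -- =============================================================
        -- Procedure:  dbo.usp_LoadClientMetrics  (hypothetical name)
        -- Purpose:    Load staged metric rows for one client into the
        --             warehouse fact table.
        -- Unit tests: sample calls below can be pasted into a dev
        --             session for quick input/output testing.
        --
        --   -- Happy path: a known client with staged rows
        --   EXEC dbo.usp_LoadClientMetrics @ClientId = 1042;
        --   SELECT COUNT(*) FROM dbo.FactClientMetrics WHERE ClientId = 1042;
        --
        --   -- Edge case: client with no staged rows (expect 0 inserts)
        --   EXEC dbo.usp_LoadClientMetrics @ClientId = 9999;
        --
        --   -- Error case: NULL client id (expect an error)
        --   EXEC dbo.usp_LoadClientMetrics @ClientId = NULL;
        -- =============================================================
        CREATE PROCEDURE dbo.usp_LoadClientMetrics
            @ClientId INT
        AS
        BEGIN
            SET NOCOUNT ON;

            IF @ClientId IS NULL
            BEGIN
                RAISERROR('@ClientId is required.', 16, 1);
                RETURN;
            END

            INSERT INTO dbo.FactClientMetrics (ClientId, MetricCode, MetricValue)
            SELECT ClientId, MetricCode, MetricValue
            FROM staging.ClientMetrics
            WHERE ClientId = @ClientId;
        END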

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • xsevensinzx (7/16/2015)


    jckfla (7/16/2015)


    If consumers would more readily reject what isn't exactly what they need or what doesn't work right, manufacturers would have no option but to make sure quality and maturity are there before releasing their wares.

    Hard to do that from the consumer's side, though, because of brand loyalty, or because the limited options available to you are the only ones similar to the game you want to play. It's the same in my area, with only one ISP really available. We have no choice but to suck it up and give them our money.

    But yes, as you mentioned before, not having to develop your own game engine these days, and reusing proven technology as the shell for your project as Microsoft surely does, will speed up the production pipeline. I forgot to mention code reuse in my last post, but that's largely why Microsoft and many others can be as fast as they are: they're mostly making sequels, not building anything from scratch.

    Even if they did start from scratch, they have access to good reference points that others may not have.

    Yeah, and even when you have multiple options...it's still not guaranteed to be good. In several places I've lived, there was more than one high-speed ISP. However, the faster one charged a lot, and the slower one charged even more per Mb of speed.

    But yeah, rapid development seems to be thriving in today's software development world, even though in a lot of cases it has lowered product quality. Hopefully somewhere down the line that will change.

    I know I would prefer a Windows release every 3-5 years if it means that the majority of bugs are worked out and most of the security loopholes have been plugged. I don't need a new version of Windows every two years. Most of the world probably doesn't.

  • jckfla (7/16/2015)


    xsevensinzx (7/16/2015)


    jckfla (7/16/2015)


    If consumers would more readily reject what isn't exactly what they need or what doesn't work right, manufacturers would have no option but to make sure quality and maturity are there before releasing their wares.

    Hard to do that from the consumer's side, though, because of brand loyalty, or because the limited options available to you are the only ones similar to the game you want to play. It's the same in my area, with only one ISP really available. We have no choice but to suck it up and give them our money.

    But yes, as you mentioned before, not having to develop your own game engine these days, and reusing proven technology as the shell for your project as Microsoft surely does, will speed up the production pipeline. I forgot to mention code reuse in my last post, but that's largely why Microsoft and many others can be as fast as they are: they're mostly making sequels, not building anything from scratch.

    Even if they did start from scratch, they have access to good reference points that others may not have.

    Yeah, and even when you have multiple options...it's still not guaranteed to be good. In several places I've lived, there was more than one high-speed ISP. However, the faster one charged a lot, and the slower one charged even more per Mb of speed.

    But yeah, rapid development seems to be thriving in today's software development world, even though in a lot of cases it has lowered product quality. Hopefully somewhere down the line that will change.

    I know I would prefer a Windows release every 3-5 years if it means that the majority of bugs are worked out and most of the security loopholes have been plugged. I don't need a new version of Windows every two years. Most of the world probably doesn't.

    Windows XP will run SSMS, IE, Outlook, and Visual Studio the same as Windows 8; the experience is the same for me, because my daily routine involves tabbing between applications. Only a small percentage of my time is spent interacting with Windows itself, and even then it's usually because I'm trying to figure out how to do something in the latest version of Windows that I was already familiar with in the previous version. It's essentially the same set of core features, functionality-wise; they've just been relocated somewhere else within the Windows desktop or menu structure. It's a lot like going to my favorite restaurant and ordering a meal I've requested dozens of times in the past, but the new waiter seems unfamiliar with taking my request. It seems to me that new releases of Windows are geared toward adapting to the GUI expectations of the next generation of new users, which makes sense from a marketing perspective, but for someone who uses Windows as a tool for getting the job done, I couldn't care less when the next release comes out.

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • I remember reading about the development of the NT Operating System, almost 20 years ago.

    All developers were required to submit, on average, 250-300 lines of completed, tested source code per day.

    Builds were done nightly.

  • Eric M Russell (7/16/2015)


    jckfla (7/16/2015)


    xsevensinzx (7/16/2015)


    jckfla (7/16/2015)


    If consumers would more readily reject what isn't exactly what they need or what doesn't work right, manufacturers would have no option but to make sure quality and maturity are there before releasing their wares.

    Hard to do that from the consumer's side, though, because of brand loyalty, or because the limited options available to you are the only ones similar to the game you want to play. It's the same in my area, with only one ISP really available. We have no choice but to suck it up and give them our money.

    But yes, as you mentioned before, not having to develop your own game engine these days, and reusing proven technology as the shell for your project as Microsoft surely does, will speed up the production pipeline. I forgot to mention code reuse in my last post, but that's largely why Microsoft and many others can be as fast as they are: they're mostly making sequels, not building anything from scratch.

    Even if they did start from scratch, they have access to good reference points that others may not have.

    Yeah, and even when you have multiple options...it's still not guaranteed to be good. In several places I've lived, there was more than one high-speed ISP. However, the faster one charged a lot, and the slower one charged even more per Mb of speed.

    But yeah, rapid development seems to be thriving in today's software development world, even though in a lot of cases it has lowered product quality. Hopefully somewhere down the line that will change.

    I know I would prefer a Windows release every 3-5 years if it means that the majority of bugs are worked out and most of the security loopholes have been plugged. I don't need a new version of Windows every two years. Most of the world probably doesn't.

    Windows XP will run SSMS, IE, Outlook, and Visual Studio the same as Windows 8; the experience is the same for me, because my daily routine involves tabbing between applications. Only a small percentage of my time is spent interacting with Windows itself, and even then it's usually because I'm trying to figure out how to do something in the latest version of Windows that I was already familiar with in the previous version. It's essentially the same set of core features, functionality-wise; they've just been relocated somewhere else within the Windows desktop or menu structure. It's a lot like going to my favorite restaurant and ordering a meal I've requested dozens of times in the past, but the new waiter seems unfamiliar with taking my request. It seems to me that new releases of Windows are geared toward adapting to the GUI expectations of the next generation of new users, which makes sense from a marketing perspective, but for someone who uses Windows as a tool for getting the job done, I couldn't care less when the next release comes out.

    Yeah, but that compatibility is getting less and less. For example, I believe it was starting with VS2012 (maybe VS2010) that you can't run it on Windows XP. From what I remember, though, you can use an older version of Visual Studio and still develop against .NET Framework 4.

    Here, we have a small IT shop: four people plus the director. We all have mixed tasks, and one minute I can be programming and then be asked to check a service setting in Windows, check something in a SQL Server DB, check for network or server latency issues, remote to a machine running XP to load a project in VS2003, or migrate a project to VS2012 and update the System.xxx references and calls.

    But yeah, most of my day is spent in applications writing code or interacting with DB tables. Even still, the movement toward rapid major OS releases just seems silly. Hence why I think having an OS come out every two years, requiring you to constantly drop $100+, is kind of unnecessary.

    Just seems to me that rapid development/continuous delivery of software in the current era is more about widening revenue streams than about usefully advancing technology.

    And as a tech person, that saddens me tremendously. I long to see innovation and advancement thrive once again and the tech market burst at the seams with new, exciting, almost daily announcements of breakthroughs...rather than the shuffling of base functions around Windows configuration menus and the changing of keystroke shortcuts (anyone else have trouble with Ctrl-Y not being Redo anymore? lol).

    Ten years ago, when I was building out the team for a start-up, one of the first things we did was set up a continuous integration (CI) environment. After a section of code was completed by a developer, they checked it in and the CI process started: code was compiled, unit and end-to-end tests were run, and in the end we had a fully built application. Problem-prone and critical areas had more tests.

    But just because it was a fully built application did not mean we had to release it. What it did give us was the certainty that, as a complete application, it all came together. That does not mean bug-free, but it did mean that it worked to the extent that we had tested it, and as a complete system. It avoided situations where one section stops working due to an update to another section. It just pulled everything together nicely, and it gave us a greater level of certainty that the unified system would work. Personally, I cannot understand why any CIO or CTO would not want to use CI.

    The more you are prepared, the less you need it.

  • Eric M Russell (7/16/2015)


    I'm not familiar with unit testing frameworks; I typically do input/output unit testing in the development environment while I'm coding. The header for each stored procedure contains comments and sample calls for the purpose of unit testing.

    Typically we perform integration testing for a unit of deliverable work by deploying to a stable QA environment that emulates production, and then we run ingest files for that client end-to-end, from the initial drop folder all the way to the QA version of the data warehouse. A dashboard application monitors each step in the process, identifying failed steps or standard deviations in the dataset. The same process is repeated for a couple of other unrelated clients that, because of their complexity and custom coding, are known to be sensitive to changes.

    Would you want to write some cases up? Show how you might test a few procs?

  • jckfla (7/17/2015)


    Just seems to me that rapid development/continuous delivery of software in the current era is more about widening revenue streams than about usefully advancing technology.

    And as a tech person, that saddens me tremendously. I long to see innovation and advancement thrive once again and the tech market burst at the seams with new, exciting, almost daily announcements of breakthroughs...rather than the shuffling of base functions around Windows configuration menus and the changing of keystroke shortcuts (anyone else have trouble with Ctrl-Y not being Redo anymore? lol).

    I'm not sure I agree. When you're coding, how many times do you test what you've written? How many F5's a day?

    CI is running an F5 (compile) across all the code all four of your people might have written. Or it's an F5 of your changes, along with all the tests that exist for the codebase, not just the few you remember to run. It's an engineered way of compiling and building.

    The CD portion of this is that it allows you to release a change whenever you need to. Not that you release every day, but if you needed to, you could.
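
    For a database project, the automated part of that loop might look something like the following; a minimal sketch of what a CI server could execute on every check-in, assuming the project's tests are written in tSQLt (the database name is hypothetical):

        -- Run by the CI server on every check-in, against a freshly
        -- built copy of the database (restored or migrated by an
        -- earlier build step).
        USE BuildDb;  -- hypothetical build database
        GO

        -- Execute every tSQLt test in the database; the build is
        -- failed if any test errors or any assertion fails.
        EXEC tSQLt.RunAll;
        GO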

  • jckfla (7/17/2015)


    Just seems to me that rapid development/continuous delivery of software in the current era is more about widening revenue streams than about usefully advancing technology.

    And as a tech person, that saddens me tremendously. I long to see innovation and advancement thrive once again and the tech market burst at the seams with new, exciting, almost daily announcements of breakthroughs...rather than the shuffling of base functions around Windows configuration menus and the changing of keystroke shortcuts (anyone else have trouble with Ctrl-Y not being Redo anymore? lol).

    Working in retail, you bet rapid releases are about driving revenue. Everything an IT person does within their employment is supposed to drive their company forward; it's just that software vendors' products make life difficult for the person who is affected by the upgrade path.

    The question I would ask is whether we need an entire product upgrade and a major new release, or whether a component upgrade is more appropriate. Decouple SSxS upgrades from SQL Server, but still have a default set for a given release.

  • Steve Jones - SSC Editor (7/17/2015)


    Eric M Russell (7/16/2015)


    I'm not familiar with unit testing frameworks; I typically do input/output unit testing in the development environment while I'm coding. The header for each stored procedure contains comments and sample calls for the purpose of unit testing.

    Typically we perform integration testing for a unit of deliverable work by deploying to a stable QA environment that emulates production, and then we run ingest files for that client end-to-end, from the initial drop folder all the way to the QA version of the data warehouse. A dashboard application monitors each step in the process, identifying failed steps or standard deviations in the dataset. The same process is repeated for a couple of other unrelated clients that, because of their complexity and custom coding, are known to be sensitive to changes.

    Would you want to write some cases up? Show how you might test a few procs?

    Yes, I'll write up an article on this topic.

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • Steve Jones - SSC Editor (7/17/2015)


    jckfla (7/17/2015)


    Just seems to me that rapid development/continuous delivery of software in the current era is more about widening revenue streams than about usefully advancing technology.

    And as a tech person, that saddens me tremendously. I long to see innovation and advancement thrive once again and the tech market burst at the seams with new, exciting, almost daily announcements of breakthroughs...rather than the shuffling of base functions around Windows configuration menus and the changing of keystroke shortcuts (anyone else have trouble with Ctrl-Y not being Redo anymore? lol).

    I'm not sure I agree. When you're coding, how many times do you test what you've written? How many F5's a day?

    CI is running an F5 (compile) across all the code all four of your people might have written. Or it's an F5 of your changes, along with all the tests that exist for the codebase, not just the few you remember to run. It's an engineered way of compiling and building.

    The CD portion of this is that it allows you to release a change whenever you need to. Not that you release every day, but if you needed to, you could.

    Maybe I'm not seeing something right about CI...but it seems to me very loosely defined.

    I would think Continuous Integration would be related to what (at least during my time/experience/education) would be called a "formal build". That is, it's something that gets tested beyond the standard, informal testing within the group who writes it.

    Even though I work in a small group, we still do a code/functional review. That way, my changes aren't truly integrated until it's ensured they meet more than just my own idea of the specification I am to meet.

    But, maybe I'm not seeing things right. I am a bit old school. I might just need to be put out to pasture....and become a manager. 😎

  • David.Poole (7/18/2015)


    jckfla (7/17/2015)


    Just seems to me that rapid development/continuous delivery of software in the current era is more about widening revenue streams than about usefully advancing technology.

    And as a tech person, that saddens me tremendously. I long to see innovation and advancement thrive once again and the tech market burst at the seams with new, exciting, almost daily announcements of breakthroughs...rather than the shuffling of base functions around Windows configuration menus and the changing of keystroke shortcuts (anyone else have trouble with Ctrl-Y not being Redo anymore? lol).

    Working in retail, you bet rapid releases are about driving revenue. Everything an IT person does within their employment is supposed to drive their company forward; it's just that software vendors' products make life difficult for the person who is affected by the upgrade path.

    The question I would ask is whether we need an entire product upgrade and a major new release, or whether a component upgrade is more appropriate. Decouple SSxS upgrades from SQL Server, but still have a default set for a given release.

    Seems to me, a lot of the changes being pushed out so fast in tech now aren't doing anything significant. There are exceptions, but the vast majority are either shining the tech bauble or reorganizing it. Substance, in a lot of the commercial software releases I see now, seems to be lacking.

    For instance, a lot of the "move to the cloud" that was supposed to be cost-effective and time-saving has ended up being sluggish and undependable for a lot of folks. We've moved some of our things to the cloud, and now a calendar update for our group takes upwards of 20 seconds...where just a month ago it was only 2-3 seconds on local servers. So I am not sure how inefficiency = cost savings.

    And yeah, upgrade paths are now becoming stupidly difficult, especially in the OS arena. I wish I were in my 20s now. The next 5 years are the prime time to look at innovating a new OS and tools for it. After that, I'm not sure what is going to happen...other than consumers and professionals having to "take what they're given".

    Like I said...I'd love to see more innovation. But, I don't see a lot of "forward" going on out there. That's the sad part.

  • jckfla (7/20/2015)


    Maybe I'm not seeing something right about CI...but it seems to me very loosely defined.

    I would think Continuous Integration would be related to what (at least during my time/experience/education) would be called a "formal build". That is, it's something that gets tested beyond the standard, informal testing within the group who writes it.

    Even though I work in a small group, we still do a code/functional review. That way, my changes aren't truly integrated until it's ensured they meet more than just my own idea of the specification I am to meet.

    But, maybe I'm not seeing things right. I am a bit old school. I might just need to be put out to pasture....and become a manager. 😎

    Yes and no, depending on the size of your group. Certainly it's automated tests that are run, and whether they are the set of tests a group is supposed to run or a sub/superset, CI ensures the tests are run. Developers, even in groups, do forget to run tests, especially on every build.

    However, this isn't some magic new thing that shouldn't already be happening. The difference here is that it's automated and run at the single check-in level, at every check-in.
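
    To make "automated tests run at every check-in" concrete, here is a minimal sketch of one tSQLt test of the kind a CI server might pick up; the test class, procedure, and table names are hypothetical:

        -- Create a test class (a schema that tSQLt treats as a group of tests).
        EXEC tSQLt.NewTestClass 'LoadClientMetricsTests';
        GO

        CREATE PROCEDURE LoadClientMetricsTests.[test load inserts staged rows for the client]
        AS
        BEGIN
            -- Isolate the tables under test from real data.
            EXEC tSQLt.FakeTable 'staging.ClientMetrics';
            EXEC tSQLt.FakeTable 'dbo.FactClientMetrics';

            -- Arrange: one staged row for client 1042.
            INSERT INTO staging.ClientMetrics (ClientId, MetricCode, MetricValue)
            VALUES (1042, 'REV', 99.50);

            -- Act: run the procedure under test (hypothetical name).
            EXEC dbo.usp_LoadClientMetrics @ClientId = 1042;

            -- Assert: exactly one row landed in the fact table.
            DECLARE @RowCount INT = (SELECT COUNT(*) FROM dbo.FactClientMetrics WHERE ClientId = 1042);
            EXEC tSQLt.AssertEquals @Expected = 1, @Actual = @RowCount;
        END
        GO

        -- A CI job would then run the whole class (or simply tSQLt.RunAll):
        EXEC tSQLt.Run 'LoadClientMetricsTests';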
