Ten Million Lines of Code

  • Comments posted to this topic are about the item Ten Million Lines of Code

  • I don't think that QuickBooks should be held up as an example of quality software or quality software processes. I am really not surprised that it took them ten million lines of code to do something that poorly.

  • Not only that, but one wonders whether this CI meme will stall the dynamic language programming meme. Compiling doesn't do much good when there's nothing to compile.

  • QuickBooks does things poorly in the same sense that capitalism is a bad economic system. It is, until you have to try anything else.

  • The advertisement which follows suggests that you can

    "....reduce mistakes with Contiuous Integration..." but apparently not spelling mistakes! (Continuous.)

  • I won't say it's great by any means, but I have used QuickBooks just fine in the past. The key thing here isn't necessarily quality, though; the point is that they use a validity check to determine whether the source will compile, and by doing it every few minutes they can catch errors sooner. This is great if your source is 10 million lines of code, but they still have to trace the changes that caused the failure back to the coders who made them. That said, don't the coders compile their own code to test the same thing before they submit? I launch my code after almost every change I make to ensure it compiles and there are no errors/warnings (I'm a bit obsessive about this), so I am doing the exact same thing they are calling great. Sure, I am typically the only developer, but on occasion I am not, and I do like to catch mistakes early, especially while I or the other developers still have the changes fresh in our heads. I don't think Intuit has cornered the market on this method, nor does it make a product quality, so I am not sure how this technique is anything out of the norm or makes their practices better than others with large source sets.

    Hell, I just inherited a new app with lots of deprecated functionality that was never removed, and since I have no knowledge of it I have been refactoring the code, in many cases commenting large sections out just to see what errors I get. So far I have been able to remove about 12 classes, and the number of lines of code on most pages is typically 1/3rd to 1/8th the original size. The code does exactly the same thing, and in some cases runs faster; it gets rid of a lot of unnecessary variables and calls that do nothing other than take up memory, and streamlines several functions that relied on try/catch blocks to make decisions into proper error-safe checks.

    So what I really want to hear is how to reduce bloat in 10 million lines of code using some amazing auto-refactoring tool that can find common functionality and refactor it into reusable objects, reducing the lines of code and ensuring related processes stay synced in behavior (something like the sketch below).
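
    For what it's worth, the sort of consolidation I have in mind looks something like this in database terms; a rough sketch with invented object names, not anything from the actual app:

        -- Hypothetical example: the same line-total arithmetic was pasted into many queries.
        -- Defining it once as an inline table-valued function keeps every caller in sync.
        CREATE FUNCTION dbo.OrderLineTotals (@OrderID int)
        RETURNS TABLE
        AS
        RETURN
        (
            SELECT OrderLineID,
                   Quantity * UnitPrice * (1 - Discount) AS LineTotal
            FROM dbo.OrderLines
            WHERE OrderID = @OrderID
        );
        GO

        -- Any query that needs the calculation now reuses the one definition,
        -- so related processes stay synced in behavior by construction.
        SELECT SUM(LineTotal) AS OrderTotal
        FROM dbo.OrderLineTotals(123);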

  • ddriver (8/27/2012)


    I don't think that QuickBooks should be held up as an example of quality software or quality software processes. I am really not surprised that it took them ten million lines of code to do something that poorly.

    Dynamic language programming works well in some cases, but I'm not sure it does in all.

    CI doesn't interfere with this, especially with databases. It can still run a set of changes through tests, even if they are uncompiled. The idea is that you have a process that combines changes and does pre-testing for QA, reporting back bugs quickly.
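
    For database code, "running it through tests" can be as simple as the CI server executing a test script against a freshly built copy of the database and failing the build if any check throws an error. A rough sketch (the view name and the check itself are made up for illustration):

        -- Hypothetical CI test: a changed view should never report a negative balance.
        IF EXISTS (SELECT 1 FROM dbo.CustomerBalances WHERE Balance < 0)
            THROW 50001, 'dbo.CustomerBalances returned negative balances; failing the build.', 1;

        PRINT 'CustomerBalances sanity check passed.';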

  • Wow... 10M lines of code. I develop a .NET application that has 877+K lines of code spread out over 40 projects in a single solution. I thought that was pretty big...

  • Antares686 (8/27/2012)


    This is great if your source is 10 million lines of code, but they still have to trace the changes that caused the failure back to the coders who made them. That said, don't the coders compile their own code to test the same thing before they submit? I launch my code after almost every change I make to ensure it compiles and there are no errors/warnings (I'm a bit obsessive about this), so I am doing the exact same thing they are calling great.


    You're missing the point. If 25 developers make changes, how do you know they haven't caused each other problems? That's what this is designed to help find. There are plenty of developers, especially across staff changes, who might not recognize warnings as problematic. I'd hope they catch compile errors, though.

    I do like to catch mistakes early, especially while I or the other developers still have the changes fresh in our heads. I don't think Intuit has cornered the market on this method

    The idea here is to catch mistakes early and feed them back to the developers within 30min or so when things are fresh in their minds.

    This doesn't necessarily make better code. That all depends on the developers, but it does give you a way to reduce costs and potentially make better code by finding issues quickly. You could just get code out quicker, and it could still be crap code.
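
    A database-flavoured illustration of the kind of clash this catches (the table, column, and procedure here are invented): two changes that are each fine on their own box but break when combined.

        -- Developer A's change: rename a column (assume dbo.Orders exists with a Qty column).
        EXEC sp_rename 'dbo.Orders.Qty', 'Quantity', 'COLUMN';
        GO

        -- Developer B's change, written against the old column name. It creates fine on B's box,
        -- but once A's rename is in the shared repository, building the combined changes fails
        -- here with "Invalid column name 'Qty'".
        CREATE PROCEDURE dbo.GetOrderQty
            @OrderID int
        AS
        SELECT Qty
        FROM dbo.Orders
        WHERE OrderID = @OrderID;
        GO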

  • Steve Jones - SSC Editor (8/25/2012)


    On the Windows platform, that system consists of 10mm lines of code ...

    No wonder they've got so many LoC if they restrict the length of each line to 1cm! 😛

  • CI is a godsend in a large development shop where you have a team of developers working on something.

    The idea is that a check-in to the source control system will prompt the CI environment to check out the latest code and do a build.

    A solution may build fine on your local box, but when it is checked into the main repository the combination of your work and your colleagues' may not behave exactly as expected. Maybe a colleague found a hole in the test coverage and plugged it, quite correctly resulting in a failed build.

    One thing is for sure, if it doesn't build on your box it won't build in CI.

    If you want to get the most from CI you need meaningful tests designed to flush out bugs, rather than a test-coverage percentage that acts as a tick in the box.

    Of course the next stage up from CI is continuous deployment to production. That is definitely one to turn a DBA's hair white.

    Moving towards continuous deployment means that you have far smaller code releases. In the event of a bug getting through testing, it is much easier to pin it down and fix it than to wade through a monthly or quarterly code release. In effect you are reducing (though obviously not eliminating) the risk of any deployment.
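
    To make the "meaningful tests" point concrete, part of what the pipeline can run after each small release is a plain post-deployment check script that fails the release if anything expected is missing. A rough sketch (object and column names are invented for the example):

        -- Hypothetical post-deployment smoke test run by the pipeline after each small release.
        IF OBJECT_ID(N'dbo.GetCustomerBalance', N'P') IS NULL
            THROW 50010, 'Expected procedure dbo.GetCustomerBalance is missing; failing the release.', 1;

        IF NOT EXISTS (SELECT 1 FROM sys.columns
                       WHERE object_id = OBJECT_ID(N'dbo.Orders')
                         AND name = N'OrderStatus')
            THROW 50011, 'Column dbo.Orders.OrderStatus was not deployed; failing the release.', 1;

        PRINT 'Post-deployment checks passed.';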

  • Regardless of the integration or compilation technique, 10 million lines of code may be an advertisement for how to do things badly, not well. Over a year ago, I walked into a new job (walked into a trap) where the previous incumbents (one of whom is my boss):

    :: didn't get along,

    :: had no formal training,

    :: thought they knew it all (so why bother learning better ways to do things)

    :: documented/commented nothing (sometimes, I think, deliberately, to make life hard for the other colleague)

    Their job (or so they thought) was to write SSRS reports. Their answer to every question was "we'll do you a report for that." Each report is just slabs of bad SQL copy/pasted from other similar reports. Instead of writing reports with parameters, the parameter would be the report: so you'd have "Customers per day at store A" and "Customers per day at store B" as separate reports. Now there are THOUSANDS of reports in production, each with hundreds or thousands of lines of awful, unreadable SQL. And the truly galling thing is that they wear it as a badge of honour. "Look what we done!" Meanwhile, another poor soul and I have to pick up the pieces and endure the daily abuse from unhappy customers and management. Ten million lines of code is, of itself, nothing to be proud of.
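
    For anyone wondering, the fix is utterly mundane: one dataset query with a store parameter instead of a copy of the report per store. A rough sketch (table and column names invented, not our real schema):

        -- One parameterised report dataset instead of "Customers per day at store A", "... store B", etc.
        -- @StoreID becomes a report parameter the user picks at run time.
        SELECT CAST(VisitDate AS date) AS VisitDay,
               COUNT(*)                AS Customers
        FROM dbo.CustomerVisits
        WHERE StoreID = @StoreID
        GROUP BY CAST(VisitDate AS date)
        ORDER BY VisitDay;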

    ...One of the symptoms of an approaching nervous breakdown is the belief that one's work is terribly important... Bertrand Russell
