Open is Not Necessarily More Secure

  • Comments posted to this topic are about the item Open is Not Necessarily More Secure

  • I totally agree with the editorial. FOSS was touted by many as the next silver bullet for secure code. They were about as right as all previous and subsequent purveyors of silver bullets.

    Software takes effort.

    All those people claiming that something will fix a huge number of complex issues are naive. Or worse.

    Gaz

    -- Stop your grinnin' and drop your linen...they're everywhere!!!

  • The glibc vulnerability was discovered by engineers at Google. I think they have an interest in repairing the bug.
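
    For context, the bug being discussed is presumably the glibc DNS resolver flaw disclosed in February 2016 (CVE-2015-7547), which is triggered through getaddrinfo(). A minimal sketch of the ordinary call path that exposed applications to it, using only standard POSIX calls and an illustrative host name, looks like this:

        /* hypothetical_lookup.c: a routine name lookup via getaddrinfo().
           Any program making a call like this while linked against a
           vulnerable glibc was exposed, which is why one library bug
           reaches so much software. */
        #include <stdio.h>
        #include <string.h>
        #include <netdb.h>
        #include <sys/types.h>
        #include <sys/socket.h>

        int main(void)
        {
            struct addrinfo hints, *res = NULL;

            memset(&hints, 0, sizeof(hints));
            hints.ai_family = AF_UNSPEC;      /* IPv4 or IPv6 */
            hints.ai_socktype = SOCK_STREAM;

            int rc = getaddrinfo("example.com", "443", &hints, &res);
            if (rc != 0) {
                fprintf(stderr, "getaddrinfo: %s\n", gai_strerror(rc));
                return 1;
            }
            /* ... connect using the returned address list ... */
            freeaddrinfo(res);
            return 0;
        }

    The vulnerable path is nothing exotic; it is the standard name-resolution call that almost every networked program makes, which is why a single library flaw reaches so far.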

  • Good observations! FOSS is no more or less secure than any other software. The maxim "given enough eyeballs, all bugs are shallow" is true IFF there are lots of eyeballs on the code. For older libraries (like GLIBC) there's an unfortunate assumption that "it must be good since it's been running so long without reported errors". The fact is no white hat was looking for issues.

    The other side of it is of course that proprietary code is no more or less secure than FOSS, for the same reasons.

    However, at least with FOSS the bug-fix is completely open -- from reporting through to fixing -- not hidden in an SP where the issue is finally fixed, sometimes *years* after it was reported.

    Gerald Britton, Pluralsight courses

  • Also, let's not forget the Heartbleed vulnerability in OpenSSL, discovered a couple of years back.

    http://www.pcworld.com/article/2141740/is-open-source-to-blame-for-the-heartbleed-bug.html

    Open source means that the public can review the code to verify that the developers are not intentionally peddling malware and are following good security practices, but it also provides a blueprint for anyone looking to exploit an unintentional vulnerability (the bug class is sketched below).
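
    To make that concrete, here is a minimal sketch of the bug class behind Heartbleed: trusting a peer-supplied length field when building a reply. This is not OpenSSL's actual code; the function names and layout are invented for illustration only.

        /* Hypothetical heartbeat-style echo handler (not OpenSSL code). */
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        /* BUG CLASS: the reply copies 'claimed_len' bytes even though the
           peer may have sent fewer, so memory beyond 'payload' can leak. */
        static unsigned char *broken_echo(const unsigned char *payload,
                                          size_t actual_len, size_t claimed_len)
        {
            (void)actual_len;                 /* never checked: the bug */
            unsigned char *reply = malloc(claimed_len);
            if (reply != NULL)
                memcpy(reply, payload, claimed_len);  /* over-reads when claimed_len > actual_len */
            return reply;
        }

        /* The shape of the fix: bounds-check the claimed length first and
           silently drop malformed requests. */
        static unsigned char *checked_echo(const unsigned char *payload,
                                           size_t actual_len, size_t claimed_len)
        {
            if (claimed_len > actual_len)
                return NULL;
            unsigned char *reply = malloc(claimed_len);
            if (reply != NULL)
                memcpy(reply, payload, claimed_len);
            return reply;
        }

        int main(void)
        {
            unsigned char payload[4] = "hi";      /* two real bytes plus padding */
            free(broken_echo(payload, 2, 2));     /* lengths agree, so no leak here */
            unsigned char *r = checked_echo(payload, 2, 64 * 1024);
            printf("oversized request %s\n", r ? "answered" : "rejected");
            free(r);
            return 0;
        }

    My recollection is that the actual OpenSSL fix was essentially this shape: compare the claimed payload length with the record actually received and discard the request if it doesn't fit.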

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • Routers are very problematic because not all of the code is in updatable firmware; some is in ROM, which means not all of it can be patched. This also exemplifies the big problem with the Internet of Things. So many IoT devices cannot be patched that they're just security problems waiting to happen (a co-worker's husband just ran into his first hacked wireless router; he previously didn't know there was malware for routers). In the case of Nest, the smoke alarms have microphones and are connected to the internet, and I have no idea if they're patchable. And you're supposed to put them in or near bedrooms.

    In the case of routers, for me it hasn't been a problem: it's rare that one lasts more than a year, so they get replaced regularly no matter how much they cost. That's even with the router plugged into a UPS and our internet coming in through fiber, so theoretically they are electrically protected. A couple of months ago I bought an Apple AirPort; we'll see over time how it does.

    The other problem with patching IoT devices and routers is that there's little money in it to incentivize the makers. They release new models every year, and the router you bought one or two years ago doesn't represent any continuing revenue for them, so why spend the time and money developing patches and regression-testing them? The glibc vulnerability is going to be tough, as there are so many implementations in ROM.

    At least D-Link will send you an email when new firmware is available, if you bother registering.

    I'd like to see IoT changed to the Interesting Devices of Internetworked Optional Technology, so the acronym would remind us of how secure they likely are. The internet was designed in the '60s by people who were optimistic about human nature and sharing information, with little thought given to security, and now we suffer the consequences.

    But I am just a Luddite (with a household containing three MacBooks, two iPhones, two iPads, a 27" iMac, an Apple TV and an AirPort) who has no problem with turning on light switches manually and with his refrigerator not telling him that he's low on milk.

    😀

    -----
    [font="Arial"]Knowledge is of two kinds. We know a subject ourselves or we know where we can find information upon it. --Samuel Johnson[/font]

  • Yes, FOSS is no better than hidden-source software, and also no worse, from a security point of view. And the reluctance of manufacturers to apply fixes, or to enable users to apply fixes, is a long-standing problem both for open source and for secret-source software.

    A long time ago there was a communications product (not a router, but a comms hardware and software product) which incorporated SQL Server Desktop Engine (MSDE 2000). The creator (and sole supplier) of this product had engineered it so that users couldn't apply fixes to the MSDE code, and the licence forbade users from doing so in case they worked out a way. When the Blaster worm hit the world, that supplier refused to fix it; the only option was to buy an upgraded version. Unfortunately most of their customers accepted this disgraceful behaviour; the outfit I worked for didn't. That product was removed from the list that could be offered to our customers, even its new release that fixed the bug, and all that supplier's routers, switches and firewalls were also banned from future use for our customers; our existing customers had the problem fixed somehow without paying that supplier a penny.

    If software licenses were sensibly regulated to require reasonable quality, including some sort of warranty on functionality and on repair, everything would soon get a lot more secure. More customers adopting the attitude we did at Neos might have that effect too.

    Tom

  • Free and open source software allows you to take action on a flaw without being dependent on the vendor.

    412-977-3526 call/text

  • robert.sterbal 56890 (3/3/2016)


    Free and open source software allows you to take action on a flaw without being dependent on the vendor.

    Absolutely true as a hypothesis.

    Not very true in reality. Most people aren't qualified, nor do they understand the complexity of something like glibc, OpenSSH, the Linux kernel, etc.

    People do bring patches out, but you now have to ask yourself: do I trust the patch? Is this person leaving a backdoor in there? What if the US government (or Brazil, China, take your pick) releases a patch because something is critical? Do you take it?

    It's sad that we are moving to a place where it's hard to know where/how/when to patch software.

  • Notice that I didn't say it is more secure. But perhaps that shouldn't be the only thing we think about when evaluating security?

    Do we have a good breach plan? What is at stake if we aren't secure? What is the cost of implementing the security? What is the baseline measure of the security?

    412-977-3526 call/text

  • robert.sterbal 56890 (3/3/2016)


    Notice that I didn't say it is more secure. But perhaps that shouldn't be the only thing we think about when evaluating security?

    Do we have a good breach plan? What is at stake if we aren't secure? What is the cost of implementing the security? What is the baseline measure of the security?

    I would like to work with the maximum security that can be acquired for the same cost as the minimum acceptable security, because cost is important too. But what the minimum acceptable security is, is the first question we have to answer for any system or app or product or ...., and that minimum will depend on how long the working life of the product is and what may be needed in that timespan, as well as what's needed right now. The breach plan and what is at stake are factors in determining the minimum acceptable level, but they don't define the level any more than the cost does. The baseline may be the same sort of thing as the stake and the breach plan or it may not, depending on what "baseline" means.

    I don't think FOSS versus closed source makes any difference to this, because it usually doesn't change the cost enough to matter - the cost is dominated by the cost of fixing security problems and upgrading security-related code while maintaining compatibility with requirements other than security, so it is dominated by the cost of ongoing ownership and not by the cost of acquisition.

    Incidentally, I'm not sure whether I'm agreeing with you or disagreeing with you when I say all that - perhaps because the security game is not one which has been properly worked out yet so we are all a bit in the dark.

    Tom

  • robert.sterbal 56890 (3/3/2016)


    Free and open source software allows you to take action on a flaw without being dependent on the vendor.

    And hacking closed source software allows you to take action on a flaw without being dependent on the vendor. 😀

    The FOSS case assumes you have the capability; if you have people that skilled, the closed-source case is probably easy too - the only issue is getting caught, and very few vendors would risk taking you to court for fixing a serious security issue in your copy of their software if they were failing to fix it (depending on the local legal system, of course - don't try it in East Texas :angry:).

    Tom

  • Way back in the 1990s I worked with a guy who used to say that the more expensive the software, the more the bugs. In general I have found that maxim to be correct.

  • David.Poole (3/4/2016)


    Way back in the 1990s I worked with a guy who used to say that the more expensive the software, the more the bugs. In general I have found that maxim to be correct.

    I hope that doesn't mean Microsoft's new pricing policy will result in SQL Server having more bugs!

    Tom
