Security Mysteries

  • Clue

    They're not SQL Server related, but they are security issues that exist on the Internet today. Plus there's one from Oracle, and it's hard to pass up the chance to give Oracle a hard time 🙂

    The five unsolved mysteries of security from Dark Reading is an interesting look at some problems on the Internet today. What's kind of funny is that most of these aren't really technical problems, meaning ones we could just write some code to correct. They're mostly administrative or policy issues.

    ISPs targeting botnets? There's a big legal area to solve before they can even start writing code. Zero-day exploits and paying for their discovery? Again, there are legal issues here. XSS vulnerabilities? There's lots of code here, but fundamental changes in development practices are needed before code really helps.
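To make the XSS point concrete, here's a minimal sketch (my own illustration, not from the Dark Reading piece) of the development-practice change involved: the classic mistake is echoing user input into a page unescaped, and the fix is escaping output by default. The function names are hypothetical.

```python
import html

def render_greeting_unsafe(name: str) -> str:
    # Vulnerable: user input is interpolated directly into HTML,
    # so a name like "<script>...</script>" runs in the victim's browser.
    return f"<p>Hello, {name}!</p>"

def render_greeting_safe(name: str) -> str:
    # Escaping special characters (<, >, &, quotes) neutralizes injected markup.
    return f"<p>Hello, {html.escape(name)}!</p>"

payload = "<script>alert('xss')</script>"
print(render_greeting_unsafe(payload))  # the script tag survives intact
print(render_greeting_safe(payload))    # markup arrives inert, as escaped text
```

The practice change is exactly the part code alone can't enforce: escaping has to become the default path in every template, not a patch applied after a vulnerability report.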

    And Oracle. I'm saving them for last, and for once I'm kind of on their side. I do think it's irresponsible to disclose the actual bugs to the world without giving Oracle (or any other vendor) the chance to fix them. It takes time and resources to work on the issues, and the fixes need extensive testing. Not the kind of test where they let a million home users download it to their Linux box and, if it blows up, they can't play World of Warcraft or rip their neighbor's CDs. They need lab testing and extensive testing from customers in test environments.

    There does need to be some timeline attached to bugs, however. No vendor should be aware of issues that severely compromise the security of a system and let them sit there for months.

    I'm not sure what the compromise is, but I'd like to see something along the lines of bugs being reported to vendors and vendors being required to report some general data about each bug: that it affects the SQL Native Client, that it allows you to compromise the system without a username and password, etc. I'm thinking of some legal responsibility similar to what the SEC has in place for financial data. Then at least we'd know how long it takes to patch things.

    Security is a tough business, and while I'm not sure we're losing, we're certainly not winning. I think it's a "give an inch, take an inch" battle that we'll be facing for the rest of our lives.

  • Among the largest issues in security is, and will be for the foreseeable future, the question of responsibility for a bug in software that allows a security breach. Unfortunately, the most common wording in license files is "we think this works, but if it doesn't, we are not responsible," and that goes for security considerations as well. As long as the courts allow this wording to stand, there will never be accountability on the part of software vendors. We can all go around spouting "good faith/bad faith" arguments about business practices, but until the legal powers get rid of some of the wording in license agreements, things will not change in that regard.

    Hence, we are left with the largest issue in security... Security is the only area where doing a good job makes you more likely to lose your job. After all, if there haven't been any security breaches in a while, then everything must be secure, so what do we need security personnel for, right? Of course, people rarely lose their jobs in such an instance, but they do tend to lose their budget. That may mean tools go out of date, updated hardware can't be bought, or you just can't afford to keep that really good tech anymore. Regardless, this behavior leads to the most prominent vicious circle in the IT industry.

  • I agree with the need for public visibility as a motivation for vendors to make more secure products.

    Something like an independently certified, SOX-like requirement for public posting of the number, general description, and severity of security bugs by vendor, and how long they have been known.

    If I were a CIO, I would certainly like to see such a list by vendor so I'd know better how much risk I'm taking on by going with Oracle vs. Microsoft vs. IBM, etc.

  • Good idea; too bad it will take a major "meltdown" of the Internet before the legal powers actually follow through on such a concept. Still, we can dream, right?

    I am a DBA responsible for administering both MS SQL (2000, with 2005 coming soon) and Oracle (8i and 10g), and I have to say I am completely disgusted by Oracle's lack of reporting on issues. Of course, some of that is due to MS being such a large target for so many. After all, half the people in the world want MS taken down because they are "the evil empire," and that forces MS to admit to faults. Another segment actively tries to hack MS products and publishes everything they find as soon as they find it, so again there is more pressure on MS to make things known. Then I get to Oracle: there are numerous hacks for defeating the security measures in the database and application server products, and yet absolutely no notice of such things in the known bug lists on OTN or technet or anywhere else.

    All things considered, I am very impressed by how few bugs/security breaches have been reported in MS SQL, while I am extremely disturbed by the number of bugs in Oracle, and that is mostly because Oracle themselves will not "officially acknowledge" them, and therefore they do not count. Go to any number of hack sites, unofficial help boards, or online code libraries/repositories, however, and you will find numerous ways around security regardless of how the server has been secured.

    Alas, it looks as if I must continue to dream...

  • Boy, I'd hate to go down a list compiled by a governmental authority. They seem to mess everything up, but we certainly can't depend on private industry either.

    Maybe we could tax everyone and let an independent authority like Carnegie Mellon handle reporting. Or a group of universities. The current situation doesn't work well for the people who count: the customers!

  • Well, maybe not a government agency, but something independently funded might work, like the Carnegie Mellon concept. Although many universities have not impressed me with their timeliness either... Personally, I would rather see a fund going to one of the white-hat or gray-hat organizations that get paid by large corporations to do security testing. There are a few companies that consult in this manner and are literally paid to hack the existing installed base of applications, but they are currently barred from publishing any of the information they find. So maybe there could be an ongoing consulting contract in place for them to draw from, so that they would not be under the thumb of an NDA or some such thing? Of course, there would need to be two lists maintained: one for holes that have gone 90 days without a fix, and one for those still in "escrow," with the public only having access to those outside the escrow period.

    Bottom line: I would feel better handing money to a group that does it for a living rather than a group that doesn't.
