More Regulation Coming?

  • I think that data breaches should be reported publicly as soon as they happen. That way customers can take early steps to protect themselves.

    From the news reports, Citibank sat on this data breach for a month before reporting it. That's too long!

  • Regulation is quite a subject, but to make it short: as long as the wallet is not involved, mankind does not move. Why should they act (weighing the disclosure cost of a security issue against the cost of implementing what's necessary to prevent one)? They still make money. It was, it is, and it will be that way.

    And hitting the "wallet" is not the whole solution either. Who's going to pay in the end?

  • Regulations regarding technology are always behind the curve. It takes a long time to get regulations passed through the system and technology changes dramatically.

    A simple monetary penalty paid to each customer whose data was breached may help. Say, $100,000 per account? $1,000,000? Money talks, and it may take large penalties to make companies pay attention to security.

  • Good editorial, but you must realize and ultimately accept the absolute truth, proven time and time again throughout history - Anything that can be built, can be un-built.

    If you waste your time believing in the panacea that somehow, some Wizard is going to come up with something that is so secure and yet accessible to those who need it, you are kidding yourself.

    Think years back to Oracle 9i. Larry Ellison released 9i and touted it as "unbreakable". In less than 24 hours it had been broken. What did Ellison do? Issued a fix and charged people for it (good lesson in how to get wildly rich, but...)

    Think about a different approach - how many times have you gone to your office during off-hours, broken in through the front door, jimmied the elevators, used an axe to break down your company's office door, and then stolen a box of paperclips? (I hope the answer is "none".)

    Why don't you do that? Because you would likely wind up behind bars. And THAT is the answer. Make it SO painful for hackers that it isn't worth the risk.

    Think about it - there used to be a company called Arthur Andersen. During the Enron debacle they lied and shredded documents; a number of their staff were caught, sent to jail, and slapped with huge fines, and it all brought down the firm (its consulting arm had already split off as Accenture). But nobody does the "Enron" shuffle there anymore. They learned a lesson.

    Catch a few hackers, put them away for a very long time, and make it as public as possible. Do that and you would see a huge drop in hacking.

    Whereas sitting around waiting for some unbreakable piece of code only inspires hackers to show you just how breakable ANY code is.

    There's no such thing as dumb questions, only poorly thought-out answers...
  • We talk as if we know what security is, and this is a grave mistake. Sure, for any given problem we know how to secure against it.

    The problem is, security *isn't* one problem; it's a googolplex of problems, each feeding on another to spawn millions of new ones.

    There are certain broad practices (like hashing stored passwords - Sony, I'm looking at you!), but by and large a program is an unprovable mathematical construct with an astronomical number of possible code paths.
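To make that "broad practice" concrete: the point of salted, slow password hashing is that a stolen database dump alone doesn't reveal anyone's password. A minimal sketch using only Python's standard library (the function names are illustrative, not from any post in this thread):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest). Store both; never store the plaintext."""
    salt = os.urandom(16)  # unique per user, so identical passwords hash differently
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```

The 100,000 PBKDF2 iterations deliberately make each guess slow, which is what separates password hashing from ordinary checksumming.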

    The problem is we're trying to secure against the "unknowable unknowns". We can handle the known problems, and even the known unknown problems, it's the unknown unknown problems that new hacks are made of.

    And you will *NEVER* secure against those.

    Having said that, most hacks are incredibly lame, and yes, we should have better solutions against those. Of course it would help if SQL Server was less mind-numbingly complex...

  • I doubt more regulations would do any good.

    Any new regulations are likely to be written in such a way as to increase the profits of various vendors selling security consulting services and software, with very little impact on the actual problem (SOX, anyone?). Kind of like welfare for Accenture.

  • I think that more regulation is needed. There are specific software and database design patterns that for decades have been known to be security vulnerabilities, and yet they continue to be repeated. How is it possible that the website for one of the largest banks in the US could be hacked simply by tampering with the browser URL?

    Damn, this is 2011, not 1995 - are we still developing data access frameworks for websites from scratch without following a standard design pattern? It's time we stopped treating Information Technology as if it were some magical realm that can't be regulated like other industries. For example, building codes specify how plumbing should be installed and what types of pipe material are allowed. Thank you. The FDA bans certain medical procedures that have proven ineffective and high risk. Thank you again.

    Citibank hacked. By changing account numbers. In the URL -

    Once inside, they leapfrogged between the accounts of different Citi customers by inserting various account numbers into a string of text located in the browser's address bar...

    http://channel9.msdn.com/Forums/Coffeehouse/Citibank-hacked-By-changing-account-numbers-In-the-URL
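    The flaw that quote describes is a textbook insecure direct object reference: the server trusted the account number supplied in the URL instead of checking it against the logged-in session. A hedged sketch of the difference (all names here are hypothetical; this is not Citi's actual code):

```python
# Toy in-memory "database" for illustration only.
ACCOUNTS = {
    "1001": {"owner": "alice", "balance": 250},
    "1002": {"owner": "bob", "balance": 980},
}

def get_account_vulnerable(session_user: str, account_id: str) -> dict:
    # BUG: any logged-in user can read any account by editing the URL.
    return ACCOUNTS[account_id]

def get_account_checked(session_user: str, account_id: str) -> dict:
    # FIX: authorize the lookup against the session, not the URL parameter.
    account = ACCOUNTS[account_id]
    if account["owner"] != session_user:
        raise PermissionError("account does not belong to this session")
    return account
```

    The fix is a single ownership check; that such a check was apparently missing is exactly the point about long-known design patterns being ignored.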

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • Citibank outsourced their IT department to India. Poor coding and poor testing result in hacking.

    They deserve what they get.

    My advice is not to do any business with Citibank.

  • I agree that regulation dictating the "how" would not be effective. What is needed is to make data security a personal priority for CEOs and boards of directors. When they have a personal interest in good data security, the necessary resources will be provided to those who can actually do something about it. Maybe 10 minutes in jail for each account/record lost would be incentive enough. As someone else said, money is what moves people. So maybe a $1,000-per-record fine would work. Until it is cheaper to do business right than to do it wrong, it will be done wrong.

    Ray R

  • OCTom (6/22/2011)


    Regulations regarding technology are always behind the curve. It takes a long time to get regulations passed through the system and technology changes dramatically.

    A simple monetary penalty paid to each customer whose data was breached may help. Say, $100,000 per account? $1,000,000? Money talks, and it may take large penalties to make companies pay attention to security.

    First, the government gets the money, not the victims. How horrible to think that a company should be responsible for its actions, or lack thereof. What are you, a communist?

    Seriously, it is completely inappropriate that a company is not required to pay fines to those affected. It pays them to the government, because it is easy to buy off politicians. If they had to be responsible to those they harmed, they would end up paying actual money instead of buying votes. No benefit to the company, and none for politicians, means it won't happen until we rid ourselves of our current elected government crooks. All of them.

    Second, and related to the above, companies don't care about security. They don't care about their customers. They only care about money, and how to make more. A number of people I work with received the usual top rating on our reviews - and got a pittance of a raise. Executives still continue to get huge raises and bonuses.

    As the IT job market improves, people are starting to look elsewhere for work. They are finding a few companies that have learned they must pay for quality, but that usually doesn't last. Some may pay more temporarily until they feel they have what they need, but not one day longer. Given this situation, and that the best people generally are the first to go, how can companies expect to secure their data? They put no effort into it, don't really have to pay fines in most cases, and are not held responsible to those they harm.

    This won't change until we fix quite a few things. It is possible to fix it, but it won't happen as long as the individual has no say in our government, and companies, lobbies, special interest groups, and unions control those who we allegedly elect.

    Dave

  • Loner (6/22/2011)


    Citibank outsourced their IT department to India. Poor coding and poor testing result in hacking.

    They deserve what they get.

    My advice is not to do any business with Citibank.

    One other possible fix: consumers can make a difference. The catch is that it requires everyone to close their accounts with that poor excuse for a bank.

    Dave

  • blandry (6/22/2011)


    Why don't you do that? Because you would likely wind up behind bars. And THAT is the answer. Make it SO painful for hackers that it isn't worth the risk.

    What more would you recommend? Instant death before trial? There are some high-profile hacker cases where the defendant has been held without bail in a jail cell for three years before seeing a courtroom. Thanks to the domestic terrorism laws, this is actually legal.

    It would be more effective to make it so painful for businesses to store data on improperly secured systems that they stop doing it.

    If mistakes are not painful, they are repeated.

    The last 30 years of American history are stained with incidents caused by poor regulation. Business just does not care about risk if something works and makes money.

    Sub-prime lending was stupid and everyone knew it. Unfortunately, it made so much money for so little effort that it had to be OK. Nobody was regulating it, so it had to be right. Our economy has been permanently damaged because we loaned out 200% of our money and sold all the interest to out-of-country banks.

    If there is no penalty for being stupid, and being stupid costs less (or makes more money) than being smart, businesses will cut costs and be stupid every time.

    History is riddled with era-destroying events that were not caused by the thief; they were caused by the guardians and by greed.

  • As a result of concerns about the risks inherent in the ever-more-intertwined computer networks on which our society runs, I've recently proposed to colleagues that the IT function create a mechanism by which companies would (directly or indirectly) engage third-party auditors to produce a security audit, which could then be made available to companies interested in connecting to the audited firm's network. The idea is similar to the audit function in the accounting world (which, I agree, has problems, mostly related to government regulation). The purpose would be to provide multiple benefits at an admittedly unknown cost:

    1) Provide a way for inter-connecting companies to more objectively evaluate the technical risks of those relationships

    2) Provide the audited company with an objective evaluation of their network-related security

    3) Standardize security audit, testing, and evaluation, with possibility of industry-specific features

    4) Hopefully delay government intervention via over-regulation of computer network security (potentially industry-specific)

    Obviously, non-disclosure agreements would need to cover any security-related disclosures between companies. The industry would also need to manage auditors' tendencies to prefer certain technologies over others. This could be discouraged via industry-wide feedback (e.g., an IEEE forum) and a high-level framework under which audits would occur.

    Note that I'm neither a security specialist nor an IT consultant so I don't have a specific technological interest in this issue. My interest is in the safety of corporate and public assets for the good of industry and our country. While I am very concerned about recent tendencies in our society to give up freedom to "improve" safety/security, I think this particular issue is one on which we need to take action.

    I can only hope that some who have a more technical background can drive such a discussion forward. Thanks for considering my ideas.

  • OCTom (6/22/2011)


    A simple monetary penalty paid to each customer whose data was breached may help. Say, $100,000 per account? $1,000,000? Money talks, and it may take large penalties to make companies pay attention to security.

    That was my initial thought too. However, it would also increase the incentive to bury the fact that a breach occurred. With 200,000 accounts breached, going public would mean a major expense. That could be mitigated by adding penalties when a breach is disclosed by someone outside the organization, or when it's kept secret beyond a certain length of time; but since most hacks are made public by the organization that was hacked, I'm not sure how much good that would do.

    It'll also be hard to come up with a fair amount. Depending on the organization hacked and what data was obtained, the repercussions for those whose data was stolen can vary. Any fee should take that into account. But again, that could result in less than full disclosure.

Viewing 15 posts - 1 through 15 (of 46 total)
