Computer security problems are outside the box -- in law and public policy, Cornell expert says

ST. LOUIS -- When a virus infects a computer or a hacker steals credit card numbers from an online retailer's Web site, programmers aren't the only ones at fault. Existing laws and public policy are also significant impediments to computer security, according to a Cornell University security expert.

"Changing the way ordinary people think about things is necessary for solving the cybersecurity problem and certainly would make today's systems far more trustworthy," said Fred Schneider, Cornell professor of computer science.

Although Schneider is a recognized leader in computer science research aimed at building more secure software, he focused on legal and economic aspects of computer security in a talk today (Feb. 18) at the annual meeting of the American Association for the Advancement of Science. His talk, "Non-Technical Impediments to Securing Cyberspace," was part of the symposium "Fostering International Collaborations in Information Security Research." Schneider delivered the lecture by telephone after his flight was cancelled due to weather.

While conceding that "this is not a talk with a happy ending, as befitting where we are and where we are heading," Schneider suggested that keeping better track of actions performed in computing systems could substantially improve cybersecurity.

Technology is available today to make systems more secure, Schneider said, but software vendors and software purchasers lack incentives to invest, because it adds expense and sometimes requires removing desirable features. "There oughta be a law!" you might say. But, Schneider pointed out, having such a law would depend on being able to make measurements against security standards, and there is no good way today to measure system security. A new ladder, for example, comes with a label stating exactly how much weight it can support, but there is no practical way to get such a measure for the security of a computer system or even to prove that a system is secure, he said. It can be shown that a number of attempts to hack a system have failed or that a system was built following some process, but that is no guarantee that some soon-to-be-invented attacks won't succeed, Schneider noted.

Insurance, which is well suited to managing risk, fails in cybersecurity for similar reasons, he added. If something can't be locked up tight, it can be insured instead, and if it is stolen, the insurance company pays for it. But insurance companies must reduce their own exposure by basing the prices they charge for coverage on historical data about payouts. When it comes to software, last year's experience is useless, Schneider asserted, because the attacks are always changing and the software is constantly changing -- even small changes can radically alter the security profile of a system.

Social and cultural norms also contribute to the problem, he said. For instance, many businesses regard Social Security numbers as a form of password, even though they are not secret and are really just identifiers. "Since my Social Security number is not secret, I shouldn't be asked my Social Security number in order to prove I'm me," he explained. "Change that misuse of Social Security identifiers and stealing Social Security numbers would no longer be an attractive proposition for cyber-criminals. That's not a legal change or a technical change but a cultural change that would help improve the trustworthiness of our systems."

Schneider also challenged the now popular belief that, when it comes to computing, "the world is flat." Although electronic communication makes it possible to build and use software anywhere in the world, he said, laws or public policy can't be counted on to deal with security problems across borders. "Which country's laws apply?" he asked. "Which country is responsible for enforcement?"

But in those questions lies a possible answer. "Security needs to rest on accountability," he said. If you rob a bank, there's a good chance you will be caught, convicted and punished, not because the bank has a secure vault, but because it has video cameras that will identify you. If computer systems did a better job of auditing, he suggested, there would be a better basis on which to redress violations. "If every action could be traced back to the actor, it would be hard to inject a virus," he said. "We'd get back to you."

He concluded, "Once accountability exists in computing systems we could create the laws and bring about conviction and punishment."

Schneider is director of the Information Assurance Institute at Cornell and chief scientist for TRUST (Team for Research in Ubiquitous Secure Technology), a National Science Foundation Science and Technology Center devoted to computer security and reliability. He chaired the Information Systems Trustworthiness study by the National Research Council and the National Academy of Sciences, which resulted in the book "Trust in Cyberspace."

##

Media Contact

Blaine Friedlander