Reinventing the Internet


Part of the Internet was broken — for the 76th time that week.

Haugsness was on duty for the Internet Storm Center, the closest thing to a 911 emergency-response system for the global network.

He wrote up an alert and a suggested solution, and posted it on the Web.

Built by academics when everyone online was assumed to be a "good citizen," the Internet today is buckling under the weight of the nearly 1 billion users estimated to be surfing, racing and tripping all over the network.

Hackers, viruses, worms, spam, spyware and phishing sites have proliferated to the point where it's nearly impossible for most computer users to go online without falling victim to them.

Last year, the Carnegie Mellon University CERT Coordination Center logged 3,780 new computer-security vulnerabilities, compared with 1,090 in 2000 and 171 in 1995.

"I'm very pessimistic about it all," said Haugsness, who has worked for the storm center for two years. "There are huge problems and outages all the time, and I see things getting worse."

Now a movement to upgrade the network — to create an Internet 2.0 — is gathering steam. How, or even if, that could be done is a subject of debate. But experts are increasingly convinced that the Internet's potential will never be met unless it is reinvented.

"The Internet is stuck in the flower-power days of the '60s during which people thought the world would be beautiful if you are just nice," said Karl Auerbach, a former Cisco Systems computer scientist who volunteers with several engineering groups trying to improve the Internet.

Many of the Internet's bugs live in its top layers of software: the programs that come loaded on new computers or are sold in retail stores. But some of the most critical flaws were built into the network's core design, written decades ago and invisible to the average user.

"The problem with the Internet is that anything you do with it now is worth a lot of money. It's not just about science anymore. It's about who gets to reap the rewards to bringing safe technologies to people," said Daniel Lynch, 63, who as an engineer at the Stanford Research Institute and the University of Southern California in the 1970s helped develop the Internet's framework.

As the number of users exploded from 45 million in 1995 to more than 429 million in 2000, Lynch remembered watching in horror as hackers defaced popular Web sites and shady marketers bombarded people's e-mail inboxes with so much spam that real messages couldn't get through.

When the Internet's founders were designing the network in the 1960s and 1970s, they thought a lot about how the network would survive attacks from the outside — threats such as tornadoes, hurricanes, even nuclear war.

What they didn't spend much time thinking about was internal sabotage. Only several hundred people had access to the first version of the Internet, and most knew each other well.

"We were all pals," Lynch said. "So we just built it without security. And the darn thing got out of the barn."

Years passed before the Internet's founders realized what they had created: "All this was an experiment. We were trying to figure out whether this technology would work. We weren't anticipating this would become the telecommunications network of the 21st century."

Internet2, a consortium of mostly academic institutions that has built a network separate from the public Internet, is testing a technology that lets users identify themselves as members of a known group.

"You've heard the saying that on the Internet nobody knows you're a dog, and that's of course the problem," said Douglas Van Houweling, president of Internet2 and a professor at the University of Michigan. "Authentication will allow communities to form where people are known and therefore can be trusted."

But there's a trade-off for such security: The network becomes balkanized, with more parts of it closed to most people.
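For readers curious how such group-based authentication works under the hood, here is a minimal sketch in Python of the general idea: an identity provider signs an assertion that a user belongs to a group, and a service verifies that signature before trusting the claim. The names, the shared secret and the helper functions are hypothetical illustrations of the concept, not the actual Internet2 technology.

```python
import hashlib
import hmac

# Hypothetical shared secret held by the community's identity provider.
IDP_SECRET = b"example-shared-secret"

def issue_assertion(user: str, group: str) -> str:
    """The identity provider signs a claim that `user` belongs to `group`."""
    claim = f"{user}|{group}"
    sig = hmac.new(IDP_SECRET, claim.encode(), hashlib.sha256).hexdigest()
    return f"{claim}|{sig}"

def verify_assertion(assertion: str) -> bool:
    """A service checks the signature before trusting the group claim."""
    claim, _, sig = assertion.rpartition("|")
    expected = hmac.new(IDP_SECRET, claim.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

token = issue_assertion("alice", "internet2-researchers")
print(verify_assertion(token))                              # True: claim is trusted
print(verify_assertion(token.replace("alice", "mallory")))  # False: tampered claim
```

A production federation would rely on public-key signatures so the verifying service never holds the signing secret; the shared HMAC key here simply keeps the sketch self-contained.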

Lynch believes the Internet will never truly be secure, though, because of the diversity of software and devices that run on it. If one has a flaw, others are vulnerable.

And so it is up to people like Haugsness at the Internet Storm Center to keep the network up and running.

Nothing "super bad" so far, Haugsness concluded of this one particular day. All in all, only about a half-dozen documented problems. That might have been considered a disaster a decade ago. But it was a pretty good day for the Internet in 2005.
