
Reinventing the Internet


Part of the Internet was broken — for the 76th time that week.

Haugsness was on duty for the Internet Storm Center, the closest thing to a 911 emergency-response system for the global network.

Haugsness wrote up an alert and a suggested solution, and posted it on the Web.

Built by academics when everyone online was assumed to be a "good citizen," the Internet today is buckling under the weight of what is estimated to be nearly a billion diverse users surfing, racing and tripping all over the network.

Hackers, viruses, worms, spam, spyware and phishing sites have proliferated to the point where it's nearly impossible for most computer users to go online without falling victim to them.

Last year, the Carnegie Mellon University CERT Coordination Center logged 3,780 new computer-security vulnerabilities, compared with 1,090 in 2000 and 171 in 1995.

"I'm very pessimistic about it all," said Haugsness, who has worked for the storm center for two years. "There are huge problems and outages all the time, and I see things getting worse."

Now a movement to upgrade the network — to create an Internet 2.0 — is gathering steam. How, or even if, that could be done is a subject of debate. But experts are increasingly convinced that the Internet's potential will never be met unless it is reinvented.

"The Internet is stuck in the flower-power days of the '60s during which people thought the world would be beautiful if you are just nice," said Karl Auerbach, a former Cisco Systems computer scientist who volunteers with several engineering groups trying to improve the Internet.

Many of the bugs in the Internet are part of its top layers of software: the programs that come loaded on new computers or that are sold in retail stores. But some of the most-critical issues were built into the network's core design, written decades ago and invisible to the average user.

"The problem with the Internet is that anything you do with it now is worth a lot of money. It's not just about science anymore. It's about who gets to reap the rewards of bringing safe technologies to people," said Daniel Lynch, 63, who as an engineer at the Stanford Research Institute and the University of Southern California in the 1970s helped develop the Internet's framework.

As the number of users exploded to more than 429 million in 2000 from 45 million in 1995, Lynch remembered watching in horror as hackers defaced popular Web sites and shady marketers began to bombard people's e-mail inboxes with so much spam that real messages couldn't get through.

When the Internet's founders were designing the network in the 1960s and 1970s, they thought a lot about how the network would survive attacks from the outside — threats such as tornadoes, hurricanes, even nuclear war.

What they didn't spend much time thinking about was internal sabotage. Only several hundred people had access to the first version of the Internet, and most knew each other well.

"We were all pals," Lynch said. "So we just built it without security. And the darn thing got out of the barn."

Years passed before the Internet's founders realized what they had created. "All this was an experiment. We were trying to figure out whether this technology would work. We weren't anticipating this would become the telecommunications network of the 21st century."

Internet2, a consortium of mostly academic institutions that has built a network separate from the public Internet, is testing a technology that allows users to authenticate themselves as members of a known group.

"You've heard the saying that on the Internet nobody knows you're a dog, and that's of course the problem," said Douglas Van Houweling, president of Internet2 and a professor at the University of Michigan. "Authentication will allow communities to form where people are known and therefore can be trusted."

But there's a trade-off for such security: The network becomes balkanized, with more parts of it closed to most people.
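The group-membership idea Van Houweling describes can be illustrated with a minimal sketch. Everything here is hypothetical (the names, the shared secret, the token format are all made up for illustration); real federated-identity systems of the kind Internet2 works on rely on public-key certificates and standard protocols rather than a single shared secret:

```python
import hashlib
import hmac

# Hypothetical setup: a community's identity provider and its services
# share a secret; only holders of a valid signed assertion are admitted.
SECRET = b"community-shared-secret"  # made-up value for illustration

def issue_assertion(user: str, group: str) -> str:
    """Identity provider signs a user:group membership claim."""
    msg = f"{user}:{group}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{user}:{group}:{sig}"

def verify_assertion(assertion: str) -> bool:
    """A service checks the signature before trusting the claim."""
    user, group, sig = assertion.rsplit(":", 2)
    expected = hmac.new(SECRET, f"{user}:{group}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

token = issue_assertion("alice", "internet2-researchers")
print(verify_assertion(token))                             # True
print(verify_assertion(token.replace("alice", "mallory"))) # False: tampered
```

The trade-off in the article shows up directly here: anyone without a valid assertion is locked out, which is exactly the balkanization the security buys.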

Lynch believes the Internet will never truly be secure, though, because of the diversity of software and devices that run on it. If one has a flaw, others are vulnerable.

And so it is up to people like Haugsness at the Internet Storm Center to keep the network up and running.

Nothing "super bad" so far, Haugsness concluded of this one particular day. All in all, only about a half-dozen documented problems. That might have been considered a disaster a decade ago. But it was a pretty good day for the Internet in 2005.

Full Article.

More in Tux Machines

BeagleBone Announces the Open Source PocketBeagle USB-Key-Fob SBC

  • BeagleBone Announces the Open Source PocketBeagle USB-Key-Fob SBC
    You've probably heard of BeagleBone and the BeagleBoard Foundation by now (check out that link if you're not familiar with them). They make open-source SBCs and have an online community much like the Raspberry Pi Foundation. While BeagleBone boards don't have as large a community or market share as the Raspberry Pi, they are still quite popular because they tend to be more application-focused than Raspberry Pis. For example, there's the general-purpose BeagleBone Black, the sensor-oriented BeagleBone Green, and the BeagleBone Blue for robotics applications.
  • What is PocketBeagle?

today's howtos

Graphics: NVIDIA, Nouveau, X.Org Server

  • NVIDIA Making Progress On Server-Side GLVND: Different Drivers For Different X Screens
    While NVIDIA isn't doing much to help out Nouveau, at least the company is contributing to the open-source Linux graphics ecosystem in other ways. In addition to presenting at XDC2017 this week on the Unix device memory allocator API and DeepColor / HDR support, they also presented on server-side GLVND. Server-side GLVND is separate from the client-side GLVND (OpenGL Vendor Neutral Dispatch Library) that evolved over the past few years and, on modern Linux systems, is supported both by Mesa and the NVIDIA binary driver. Server-side GLVND can help PRIME laptops and other use cases like XWayland where multiple GPU drivers may be touching the X server at once.
  • Nouveau Developers Remain Blocked By NVIDIA From Advancing Open-Source Driver
    Longtime Nouveau contributors Martin Peres and Karol Herbst presented at this week's XDC2017 X.Org conference at the Googleplex in Mountain View. It was a quick talk as they didn't have a whole lot to report on due to their open-source NVIDIA "Nouveau" driver efforts largely being restricted by NVIDIA Corp.
  • X.Org Server 1.20 Expected Around January With New Features
    X.Org Server 1.19 is already almost one year old and while X.Org is currently well off its six-month release cadence, version 1.20 is being figured out for an early 2018 release. Adam Jackson of Red Hat, who has been serving as the xorg-server release manager, held a quick session on Friday at XDC2017 to figure out what's needed for X.Org Server 1.20. His goal is to see X.Org Server 1.20 released in time for inclusion in Fedora 28. For that to happen nicely, he's hoping to see xorg-server 1.20 released in January. The Fedora 28 beta freeze is in the middle of March, so there is still some room for the 1.20 release to slip and still make the F28 update.

ASUS Launches Its Thinnest and Lightest Flippable Chromebook, the Flip C101

ASUS announced a new Chromebook on its website, the Flip C101, a smaller and lighter version of the C302 model. Featuring a 10.1-inch touchscreen display, the all-new Chromebook is priced at only $299 in the US. Read more