Calls to end US domination of the internet

WHENEVER you surf the web, send emails or download music, an unseen force is at work in the background, making sure you connect to the sites, inboxes and databases you want. The name of this brooding presence? The US government.

Some 35 years after the US military invented the internet, the US Department of Commerce retains overall control of the master computers that direct traffic to and from every web and email address on the planet.

But a group convened by the UN last week to thrash out the future of the net is calling for an end to US domination, proposing instead that a multinational forum of governments, companies and civilian organisations be created to run it.

The UN's Working Group on Internet Governance (WGIG) says US control hinders many developments that might improve it. These range from efforts to give the developing world more affordable net access to coming up with globally agreed and enforceable measures to boost net privacy and fight cybercrime.

US control also means that any changes to the way the net works, including the addition of new domain names such as .mobi for cellphone-accessed sites, have to be agreed by the US, whatever experts in the rest of the world think. The flipside is that the US could make changes without the agreement of the rest of the world.

In a report issued in Geneva, Switzerland, on 14 July, the WGIG seeks to overcome US hegemony. "The internet should be run multilaterally, transparently and democratically. And it must involve all stakeholders," says Markus Kummer, a Swiss diplomat who is executive coordinator of the WGIG.

So why is the internet's overarching technology run by the US? The reason is that the net was developed there in the late 1960s by the Pentagon's Advanced Research Projects Agency (ARPA) in a bid to create a communications medium that would still work if a Soviet nuclear strike took out whole chunks of the network. This medium would send data from node to node in self-addressed "packets" that could take any route they liked around the network, avoiding any damaged parts.

Today the internet has 13 vast computers dotted around the world that anchor the translation of text-based email and web addresses into the numerical internet protocol (IP) addresses that computers understand. In effect the top tier of a massive look-up table, the 13 computers are known as the root servers of the Domain Name System (DNS). But the DNS master computer, called the master root server, is based in the US and is ultimately controlled by the Department of Commerce. Because the data it contains is propagated to all the other DNS servers around the world, access to the master root server file is a political hot potato.

Currently, only the US can make changes to that master file. And that has some WGIG members very worried indeed.
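The name-to-address translation the DNS performs is visible from any networked machine. A minimal Python sketch using the standard library resolver (querying "localhost" here so it works even without network access; real queries for public names walk the DNS hierarchy that is ultimately anchored at the root servers described above):

```python
import socket

def resolve(name):
    """Return the IPv4 address string for a host name, via the system resolver."""
    return socket.gethostbyname(name)

# "localhost" is resolved locally, so no root-server query is needed;
# a name like "example.com" would instead trigger a lookup through the
# DNS hierarchy.
print(resolve("localhost"))
```

On virtually all systems this prints 127.0.0.1, the loopback address.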

two words

Fuck that.

If they think the US will hand over control after inventing the 'Net in the first place, they're even dumber than most of the politicians in America.

Let them build their own damn Internet.


More in Tux Machines

Lumina Desktop 1.1 Released

The BSD-focused, Qt-powered Lumina Desktop Environment is out with its version 1.1 update. The developers consider it a "significant update" with both new and reworked utilities, infrastructure improvements, and other enhancements. Lumina 1.1 adds a pure Qt5 calculator, improves the text editor, completely overhauls the file manager, and greatly improves system application list management, along with a range of other changes. Read more

Radeon vs. Nouveau Open-Source Drivers On Mesa Git + Linux 4.9

For your viewing pleasure this Friday are some open-source AMD vs. NVIDIA numbers when using the latest open-source code on each side. Linux 4.9-rc1 was used, while Ubuntu 16.10 paired with the Padoka PPA provided Mesa Git as of earlier this week plus LLVM 4.0 SVN. As covered recently, there are no Nouveau driver changes in Linux 4.9, although we had hoped the boost patches would land. Thus re-clocking is still quite poor for this open-source NVIDIA driver stack. For the Nouveau tests I manually re-clocked each graphics card to the highest performance state (0f) after first re-clocking the cards to the 0a performance state, which helps some of the GPUs that otherwise fail with memory re-clocking at 0f; Nouveau developers have said this is the preferred approach for testing. Read more
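The two-step re-clocking described above is done through Nouveau's pstate file in debugfs. A sketch of the procedure (requires root and a Nouveau-driven GPU; the card index "0" is an assumption, check the directories under /sys/kernel/debug/dri/ for your system):

```shell
# List the performance states the driver knows about for card 0,
# then step through 0a before requesting the highest state 0f.
cat /sys/kernel/debug/dri/0/pstate           # show available pstates
echo 0a > /sys/kernel/debug/dri/0/pstate     # intermediate state first
echo 0f > /sys/kernel/debug/dri/0/pstate     # then highest performance state
```

The intermediate 0a step matches the workaround the article mentions for GPUs whose memory re-clocking fails when jumping straight to 0f.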

Ubuntu MATE, Not Just a Whim

I've stated for years how much I dislike Ubuntu's Unity interface. Yes, it's become more polished through the years, but it's just not an interface that thinks the same way I do. That's likely because I'm old and inflexible, but nevertheless, I've done everything I could to avoid using Unity, which usually means switching to Xubuntu. I actually really like Xubuntu, and the Xfce interface is close enough to the GNOME 2 look that I hardly miss the way my laptop used to look before Unity. I wasn't alone in my disdain for Ubuntu's flagship desktop manager switch, and many folks either switched to Xubuntu or moved to another Debian/Ubuntu-based distro like Linux Mint. The MATE desktop started as a hack, in fact, because GNOME 3 and Unity were such drastic changes. I never really got into MATE, however, because I thought it would remain nothing more than a hack and eventually become unusable as the old GNOME 2 libraries were phased out. Read more

EU-Fossa project submits results of code audits

The European Commission's 'EU Free and Open Source Software Auditing' project (EU-Fossa) has sent its code review results to the developers of its target projects, the Apache HTTP Server and KeePass. The audit results have not yet been made public; however, no critical vulnerabilities were found. Read more