Taking a trip down memory-chip lane

Filed under: Sci/Tech

REMEMBER your first time, when you sat in front of a keyboard and monochrome screen and joined a brave new world? You may have been playing Pong or Manic Miner, or carefully crafting your first lines of code. But you won't have forgotten the joy of discovering personal computers.

It's time to revisit your youth, because BBC Bs, ZX81s, Spectrums and Commodores are cool again, part of a wave of computing nostalgia. Today's stylish PCs may perform billions of calculations a second and store tens of billions of bytes of data, but for many, they have got nothing on the 32-, 48- or 64-kilobyte machines that were the giants of the early 1980s.

This renewed interest in old-school computing is more than just a trip down memory-chip lane. Early computers are a part of our technological heritage, and also offer a unique perspective on how today's machines work. And within growing collections of original computers and home-made replicas, and the anecdote-filled web pages and blogs devoted to them, lies the equipment and expertise that will one day help unlock our past by reading countless computer files stored in outmoded formats.

Enthusiasts say they are inspired by old machines not just because the computer era was ushered in by monumental developments in electronics, mathematics and information science, but also because the digital computer changed the course of the 20th century. During the second world war one of the earliest electronic computers, Colossus, enabled Allied code breakers in the UK to decipher Nazi messages. In 1946 another of the earliest programmable machines, ENIAC, was used by the US army to calculate the trajectories of ballistic weapons with unprecedented accuracy. The rest, as they say, is history.

"They hark back to another time," says Hamish Carmichael, secretary of the UK's Computer Conservation Society, which works with the Science Museum in London to restore and rebuild classic machines. "And there's an element of detective work, in finding out how things were done originally." The society has helped the museum reconstruct the oldest working computer anywhere, an original Pegasus, made by British firm Ferranti in 1956. And it is working on an even older machine, an Elliot 403 dating from 1955.

What computers did for the military, they also did for the workplace, although the earliest models were a far cry from today's sleek laptops. The first commercial machine, UNIVAC I, was delivered to the US Census Bureau in 1951. Much larger than an SUV, it contained some 5200 vacuum tubes and consumed 125 kilowatts of power, yet could perform just 1905 operations per second and store only 1000 words of data.

Most enthusiasts, however, are more familiar with the computers that appeared in their homes during the 1970s and 1980s. The Altair 8800 is often credited with kick-starting the personal computer revolution. Sold in kit form in 1975, the 8800 consisted of several circuit boards slotted together inside a blue box the size of an old record player.

Programming the 8800 involved setting several switches to correspond to a primitive command and then flicking another to store it in the computer's memory. The designers at Micro Instrumentation Telemetry Systems (MITS) expected to sell only a few hundred kits to keen electronics hobbyists. But the idea of owning a programmable "electronic brain" proved so irresistible that they received thousands of orders for kits within weeks of launch.
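
To make that concrete, here is a minimal sketch in C of what toggling a program into the Altair amounted to: writing raw Intel 8080 machine code into memory one byte at a time and then reading it back from the panel. The deposit routine and the tiny three-byte program are illustrative assumptions, not details taken from the article.

    #include <stdio.h>
    #include <stdint.h>

    /* Illustrative sketch of the Altair 8800 front-panel workflow: each
     * "deposit" stores one byte of raw Intel 8080 machine code at the
     * current address, much as the DEPOSIT/DEPOSIT NEXT switches did. */

    #define MEMORY_SIZE 256

    static uint8_t  memory[MEMORY_SIZE];  /* the machine's RAM            */
    static uint16_t address = 0;          /* current front-panel address  */

    /* Set the data switches to 'byte' and press DEPOSIT NEXT. */
    static void deposit(uint8_t byte)
    {
        if (address < MEMORY_SIZE)
            memory[address++] = byte;
    }

    int main(void)
    {
        deposit(0x3E);  /* MVI A,5 - load 5 into the accumulator... */
        deposit(0x05);  /*           ...immediate operand           */
        deposit(0x76);  /* HLT     - halt the processor             */

        /* Read the program back byte by byte, as the panel lights would
         * show it (octal was how hobbyists usually wrote it down). */
        for (uint16_t i = 0; i < address; i++)
            printf("address %03o: %03o\n", i, memory[i]);

        return 0;
    }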

Full Story.

More in Tux Machines

NHS open-source Spine 2 platform to go live next week

Last year, the NHS said open source would be a key feature of its new approach to healthcare IT. It hopes embracing open source will both cut the upfront costs of implementing new IT systems and draw on the best brains from different areas of healthcare to develop collaborative solutions. Meyer said the Spine switchover team has “picked up the gauntlet around open-source software”. The HSCIC and BJSS have collaborated to build the core services of Spine 2, such as electronic prescriptions and care records, “in a series of iterative developments”. Read more

What the Linux Foundation Does for Linux

Jim Zemlin, the executive director of the Linux Foundation, talks about Linux a lot. During his keynote at the LinuxCon USA event, Zemlin noted that it's often difficult for him to come up with new material for talking about the state of Linux at this point. Every year at LinuxCon, Zemlin delivers his State of Linux address, but this time he took a different approach: he detailed what he actually does and how the Linux Foundation works to advance the state of Linux. Fundamentally it's all about enabling the open source collaboration model for software development. "We are seeing a shift now where the majority of code in any product or service is going to be open source," Zemlin said. He added that open source is the new Pareto Principle for software development, with 80 percent of software code now being open source. The nature of collaborative development itself has changed in recent years. For years, software collaboration was achieved mostly through standards organizations. Read more

Arch-based Linux distro KaOS 2014.08 is here with KDE 4.14.0

The Linux desktop community has reached a sad state. Ubuntu 14.04 was a disappointing release and Fedora is taking way too long between releases. Hell, openSUSE is an overall disaster. It is hard to recommend any Linux-based operating system beyond Mint. Even the popular KDE Plasma environment and its associated programs are in a transition phase, moving from 4.x to 5.x. As exciting as KDE 5 may be, it is still not ready for prime time, so it is best to stay with 4 for now. Read more

diff -u: What's New in Kernel Development

One problem with Linux has been its implementation of system calls. As Andy Lutomirski pointed out recently, it's very messy. Even identifying which system calls were implemented for which architectures, he said, was very difficult, as was identifying the mapping between a call's name and its number, and mapping between call argument registers and system call arguments. Some user programs like strace and glibc needed to know this sort of information, but their way of gathering it together—although well accomplished—was very messy too. Read more
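
Lutomirski's point about name-to-number mapping is easy to see from user space. The short C sketch below (standard Linux interfaces; the program itself is just an illustration, not code from the article) invokes a system call by its architecture-specific number, which is exactly the kind of per-architecture table that strace and glibc have to keep in sync with the kernel.

    /* Calling a system call by number instead of through its glibc wrapper.
     * SYS_getpid expands to an architecture-specific constant (39 on x86_64,
     * 20 on 32-bit x86), which is why tools such as strace need per-arch
     * tables mapping call names to numbers. */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/syscall.h>

    int main(void)
    {
        long pid = syscall(SYS_getpid);  /* raw call, identified only by its number */

        printf("SYS_getpid is %d on this architecture; pid = %ld\n",
               (int)SYS_getpid, pid);
        printf("getpid() via the glibc wrapper returns %ld\n", (long)getpid());
        return 0;
    }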