Encryption: the key to secure data?

Filed under: Security

For as long as modern computers have been around, they have been associated with encryption in one way or another. It is no coincidence that the first semi-programmable computer, Colossus, was developed to decrypt messages during the Second World War.

Encryption relies on encoding information in a way that makes it difficult to decode without either the key or an awful lot of mathematical muscle. The longer the key (in bits), the more difficult the encryption is to break. Although many encryption techniques are unbreakable in practice, very few are unbreakable in theory, given enough time or processing power.
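
To see why key length matters, here is a quick back-of-the-envelope sketch in Python (the 56-bit and 128-bit figures are the standard DES and AES key sizes discussed below, not numbers from the article). Each extra bit doubles the number of keys a brute-force attacker must try:

```python
# Each additional key bit doubles the brute-force search space.
des_keyspace = 2 ** 56    # keys in a 56-bit keyspace (DES-sized)
aes_keyspace = 2 ** 128   # keys in a 128-bit keyspace (AES-sized)

print(f"56-bit keyspace:  {des_keyspace:.2e} keys")
print(f"128-bit keyspace: {aes_keyspace:.2e} keys")
print(f"Ratio: {aes_keyspace // des_keyspace:.2e}x more keys to search")
```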

Encryption techniques fall into two main types, explains Bernard Parsons, chief technology officer at security software company BeCrypt. Symmetric encryption, in which the same key encrypts and decrypts, dates back to the Roman Empire and beyond; asymmetric encryption, which uses a matched pair of keys, is far more recent.
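
The article stays at the conceptual level, but a minimal sketch of the asymmetric model, assuming Python's third-party cryptography package (any RSA implementation would do), shows the defining property: anyone holding the public key can encrypt, while only the private-key holder can decrypt.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Generate an RSA key pair: the public half can be shared freely,
# the private half stays secret.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Encrypt with the public key; only the private key can reverse it.
ciphertext = public_key.encrypt(b"meet at dawn", oaep)
plaintext = private_key.decrypt(ciphertext, oaep)
assert plaintext == b"meet at dawn"
```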

Commonly used for file encryption (for example, to protect the data on a laptop in the event of theft), symmetric encryption uses a single key to both encrypt and decrypt data. "As the field of cryptography became better understood in the public domain, a number of algorithms were proposed, all of them based on difficult mathematical problems," says Parsons.
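
By contrast, a minimal symmetric sketch (same assumed cryptography package; Fernet is simply a convenient authenticated-encryption recipe built on AES) shows one key doing both jobs:

```python
from cryptography.fernet import Fernet

# One key does both jobs: whoever holds it can encrypt and decrypt.
key = Fernet.generate_key()
f = Fernet(key)

token = f.encrypt(b"quarterly-results.xlsx contents")
assert f.decrypt(token) == b"quarterly-results.xlsx contents"
```

In practice the hard part is not the call but distributing that single key safely, which is exactly the problem asymmetric encryption was invented to solve.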

The trick is to ensure that the mathematical problem is sufficiently complex that current computing technology cannot solve it. Devising such problems requires significant mathematical skill; using them to exchange files also requires all parties to agree on the same algorithm to encrypt and decrypt the data.

Consequently, standards became important in the early days of modern computerised encryption in the mid-1970s. One of the first was the Data Encryption Standard (DES), an encryption algorithm with a 56-bit key. DES was at one time considered strong enough to be used for banks' automatic teller machines, but as processing power increased it was superseded by triple DES, which runs each block of data through the DES algorithm three times (an encrypt-decrypt-encrypt sequence) for extra strength.
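
For illustration, here is how triple DES might be invoked today, assuming the same Python cryptography package (its TripleDES class is retained only for legacy data, and recent releases deprecate it, so treat this as a sketch rather than a recommendation):

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(24)  # three 8-byte DES keys concatenated: DES is applied three times
iv = os.urandom(8)    # DES works on 64-bit (8-byte) blocks, so the CBC IV is 8 bytes

encryptor = Cipher(algorithms.TripleDES(key), modes.CBC(iv)).encryptor()
ciphertext = encryptor.update(b"8-byte!!" * 4) + encryptor.finalize()

decryptor = Cipher(algorithms.TripleDES(key), modes.CBC(iv)).decryptor()
assert decryptor.update(ciphertext) + decryptor.finalize() == b"8-byte!!" * 4
```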

"Towards the end of the 1980s questions were asked about the appropriateness of triple DES for a number of reasons, one being performance," says Parsons. A new encryption standard called AES (Advanced Encryption Standard) was established in 2001, and it is still considered to be state-of-the-art.


More in Tux Machines

KDE Applications 14.12 - New Features, Frameworks Ports

Today KDE released KDE Applications 14.12, delivering new features and bug fixes to more than a hundred applications. Most of these applications are based on KDE Development Platform 4, but the first applications have been ported to KDE Frameworks 5. Frameworks is a set of modularized libraries providing additional functionality for Qt5, the latest version of the popular Qt cross-platform application framework. This release marks the beginning of a new style of releases, replacing the threesome of KDE Workspaces, Platform and Applications in the 4 series, which ended with the latest KDE Applications update last month.

What To Expect In 2015: Robots Join The Open-Source Revolution

The number of ROS downloads doubled in 2014, to 3.5 million, and Brian Gerkey, head of the Open Source Robotics Foundation, expects adoption to spike again with the release of ROS 2.0 this summer. The upgrade will coordinate swarms, improve walking, and support smart sensors; basically, assimilate the world's robots.

New Input Drivers Coming For Linux 3.19 Kernel

One of the latest pull requests for the Linux 3.19 kernel is the input driver subsystem pull, which includes numerous updates along with a few new drivers. The new drivers will benefit some Google Chromebooks in running the latest upstream kernel.

Docker and the Linux container ecosystem

Linux container technology is experiencing tremendous momentum in 2014. The ability to create multiple lightweight, self-contained execution environments on the same Linux host simplifies application deployment and management. By improving collaboration between developers and system administrators, container technology encourages a DevOps culture of continuous deployment and hyperscale, which is essential to meet current user demands for mobility, application availability, and performance.