Encryption: the key to secure data?

Filed under: Security

For as long as modern computers have been around, they have been associated with encryption in one way or another. It is no coincidence that the first semi-programmable computer, Colossus, was developed to decrypt messages during the Second World War.

Encryption relies on encoding information in a way that makes it difficult to decode without the key, or without an awful lot of mathematical muscle. The longer the key (in bits), the harder the encryption is to break. Although many encryption techniques are unbreakable in practice, very few are unbreakable in theory, given enough time or processing power.
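To put key length in perspective, the following back-of-the-envelope sketch (my own illustration, not from the article, and the guess rate is an arbitrary assumption) shows how the number of possible keys, and hence the worst-case brute-force effort, grows with bit length:

```python
# Rough illustration of why key length matters for brute-force attacks.
# The guess rate is an arbitrary assumption for the sake of the example.

GUESSES_PER_SECOND = 10**12  # assume a trillion key trials per second

for bits in (56, 112, 128, 256):
    keyspace = 2 ** bits                      # number of possible keys
    seconds = keyspace / GUESSES_PER_SECOND   # exhaustive search, worst case
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{bits:3d}-bit key: {keyspace:.2e} keys, ~{years:.2e} years to exhaust")
```

Even at that optimistic guess rate, a 56-bit keyspace falls within hours-to-days territory, while 128 bits and above are far beyond any realistic exhaustive search.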

Encryption techniques fall into two main types, explains Bernard Parsons, chief technology officer at security software company BeCrypt. Symmetric encryption dates back to the Roman empire and beyond, but asymmetric encryption is much more recent.

Commonly used for file encryption (for example, to protect the data on a laptop in the event of theft), symmetric encryption uses a single key to both encrypt and decrypt data. "As the field of cryptography became better understood in the public domain, a number of algorithms were proposed, all of them based on difficult mathematical problems," says Parsons.
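As a concrete sketch of that single-key idea, the snippet below uses Fernet from the third-party Python cryptography package (my choice of library for illustration; the article does not name one) to encrypt and decrypt data with the same key:

```python
# Minimal symmetric-encryption sketch: one key both encrypts and decrypts.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # the single shared secret
cipher = Fernet(key)

plaintext = b"Quarterly figures - do not leave on the train"
token = cipher.encrypt(plaintext)  # ciphertext, safe to store on the laptop
recovered = cipher.decrypt(token)  # anyone holding the key can reverse it

assert recovered == plaintext
```

Losing the laptop only exposes the ciphertext; losing the key exposes everything, which is why key management dominates symmetric-encryption deployments.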

The trick is to ensure that the mathematical problem is complex enough that it cannot be solved with current computing technology. Putting such schemes to use requires not only significant mathematical skill, but also agreement between the parties exchanging files to use the same algorithm to encrypt and decrypt the data.

Consequently, standards became important in the early days of modern computerised encryption in the mid-1970s. One of the first was the Data Encryption Standard (DES), an encryption algorithm with a 56-bit key. DES was at one time considered strong enough to be used for banks' automatic teller machines, but as processing power increased it was superseded by triple DES, which runs each piece of data through the DES algorithm three times for extra strength.
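The "three times" construction is usually realised as an encrypt-decrypt-encrypt (EDE) composition with more than one key. The sketch below shows only that composition; toy_block is a trivial XOR stand-in for the real DES block cipher, used purely to keep the example self-contained and runnable (it has no security value whatsoever):

```python
# Structural sketch of the triple-DES (EDE) composition: encrypt with key 1,
# decrypt with key 2, encrypt with key 3. The "cipher" here is a toy XOR
# stand-in, NOT real DES - it only demonstrates how the three passes chain.

def toy_block(block: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(block, key))

toy_block_inv = toy_block  # XOR is its own inverse; real DES has a distinct decrypt

def ede_encrypt(block, k1, k2, k3):
    return toy_block(toy_block_inv(toy_block(block, k1), k2), k3)

def ede_decrypt(block, k1, k2, k3):
    return toy_block_inv(toy_block(toy_block_inv(block, k3), k2), k1)

k1, k2, k3 = b"AAAAAAAA", b"BBBBBBBB", b"CCCCCCCC"   # 8-byte keys, as in DES
block = b"8 bytes!"
assert ede_decrypt(ede_encrypt(block, k1, k2, k3), k1, k2, k3) == block
```

With all three keys equal, the EDE chain collapses back to single DES, which is how triple DES kept backward compatibility with existing DES hardware.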

"Towards the end of the 1980s questions were asked about the appropriateness of triple DES for a number of reasons, one being performance," says Parsons. A new encryption standard called AES (Advanced Encryption Standard) was established in 2001, and it is still considered to be state-of-the-art.

Full Article.

More in Tux Machines

Phoronix offers some criticism of KDE software, and this is how KDE deals with it

About a month ago, Eric Griffith posted an article on Phoronix where he compared Fedora’s KDE spin to the main Fedora Workstation, which uses GNOME. In that article, Eric described a number of issues he became fully aware of when comparing his favorite desktop environment, Plasma (and the KDE applications he regularly uses), with GNOME’s counterparts. I read that article, shared it with other KDE designers and developers, and we came to the conclusion that yes, at least some of the issues he describes there are perfectly valid and clearly documented. And since KDE does listen to user feedback if it makes sense, we decided we should do something about it. Read more

Trinity Desktop Environment Now Supports Ubuntu 15.04, ARM64, and PPC64le

The developers behind the TDE (Trinity Desktop Environment) project, an open-source desktop environment that keeps the spirit of KDE 3.5 alive, have announced the immediate availability for download of Trinity Desktop Environment R14.0.1. Read more

Ubuntu MATE Donates Money to VLC, OpenBSD, and a Debian Developer

Martin Wimpress, the lead developer and maintainer of the Ubuntu MATE operating system, had the great pleasure of informing us about the contributions made to various open source projects during the month of August 2015. Read more
