Encryption: the key to secure data?

For as long as modern computers have been around, they have been associated with encryption in one way or another. It is no coincidence that the first semi-programmable computer, Colossus, was developed to decrypt messages during the Second World War.

Encryption relies on encoding information in a way that makes it difficult to decode without either the key or an awful lot of mathematical muscle. The longer the key (in bits), the more difficult the encryption will be to break. Although there are many encryption techniques that are unbreakable in practice, there are very few that are unbreakable in theory, given enough time or processing power.
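
To get a feel for why key length matters: every extra bit doubles the number of keys a brute-force attacker has to try. A minimal sketch in Python (the key sizes are simply the ones discussed below, and the arithmetic is illustrative rather than a cost model):

```python
# Illustrative arithmetic only: the brute-force search space doubles with each key bit.
def keyspace(bits: int) -> int:
    """Number of possible keys for a key of the given length in bits."""
    return 2 ** bits

for bits in (56, 112, 128, 256):
    print(f"{bits}-bit key: {keyspace(bits):.3e} possible keys")
```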

Encryption techniques fall into two main types, explains Bernard Parsons, chief technology officer at security software company BeCrypt. Symmetric encryption dates back to the Roman empire and beyond, but asymmetric encryption is more recent.
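
The Roman-era example is the Caesar cipher, in which the shared secret is simply how far each letter is shifted; whoever knows the shift can both encrypt and decrypt, which is what makes it symmetric. A toy sketch (purely illustrative, and trivially breakable):

```python
# Toy Caesar cipher: the same secret (the shift) encrypts and decrypts.
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions, leaving other characters untouched."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

ciphertext = caesar("attack at dawn", 3)   # "dwwdfn dw gdzq"
plaintext = caesar(ciphertext, -3)         # shifting back recovers the original
```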

Commonly used for file encryption (for example, to protect the data on a laptop in the event of theft), symmetric encryption uses a single key to both encrypt and decrypt data. "As the field of cryptography became better understood in the public domain, a number of algorithms were proposed, all of them based on difficult mathematical problems," says Parsons.
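
A minimal sketch of that single-key idea, using the Python cryptography library's Fernet recipe (the library choice and file names are my own illustration, not anything the article prescribes):

```python
# Illustrative symmetric file encryption with the `cryptography` package's Fernet recipe.
# One secret key does both jobs: without it, the encrypted file is unreadable.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # this key must be kept secret and stored safely
fernet = Fernet(key)

with open("laptop-report.txt", "rb") as f:        # hypothetical file name
    ciphertext = fernet.encrypt(f.read())

with open("laptop-report.txt.enc", "wb") as f:
    f.write(ciphertext)

# Later, anyone holding the same key can recover the plaintext.
plaintext = fernet.decrypt(ciphertext)
```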

The trick is to ensure that the mathematical problem is sufficiently complex to stop it being solved by current computing technology. Developing such problems requires significant mathematical skill; using them in practice also requires multiple parties to agree on the same algorithm to encrypt and decrypt data, so that they can exchange files.

Consequently, standards became important in the early days of modern computerised encryption in the mid-1970s. One of the first was the Data Encryption Standard (DES), an encryption algorithm using a 56-bit key. DES was at one time considered strong enough to be used for banks' automatic teller machines, but as processing power increased, it was replaced by triple DES, which ran the same piece of data through the DES algorithm three times for extra strength.
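
Strictly speaking, the three passes follow an encrypt-decrypt-encrypt (EDE) pattern rather than three plain encryptions. A sketch of that construction built from PyCryptodome's single-DES primitive (the library, ECB mode and per-block framing are illustrative choices, not a recommendation):

```python
# Sketch of the encrypt-decrypt-encrypt (EDE) construction behind triple DES,
# assembled from PyCryptodome's single-DES primitive for illustration only.
from Crypto.Cipher import DES

def triple_des_ede_block(block: bytes, k1: bytes, k2: bytes, k3: bytes) -> bytes:
    """Run one 8-byte block through DES three times: encrypt, decrypt, encrypt."""
    step1 = DES.new(k1, DES.MODE_ECB).encrypt(block)
    step2 = DES.new(k2, DES.MODE_ECB).decrypt(step1)
    return DES.new(k3, DES.MODE_ECB).encrypt(step2)

# Setting k1 == k2 == k3 collapses the construction to single DES, which is
# what kept triple DES backward compatible with existing DES hardware.
```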

"Towards the end of the 1980s questions were asked about the appropriateness of triple DES for a number of reasons, one being performance," says Parsons. A new encryption standard called AES (Advanced Encryption Standard) was established in 2001, and it is still considered to be state-of-the-art.

Full Article.

More in Tux Machines

Why Samsung's Open-Source Group Likes The LLVM Clang Compiler

Samsung is just one of many companies that have grown increasingly fond of the LLVM compiler infrastructure and the Clang C/C++ front-end. Clang is in fact the default compiler for native applications on its Tizen platform, and the company has a whole list of reasons why it likes this compiler. Read more

Framing Free and Open Source Software

Having just passed its thirtieth birthday, the Free Software Foundation has plenty to celebrate. Having begun as a fringe movement, free and open source software has become the backbone of the Internet, transforming business as a side-effect. Yet for all its accomplishments, the one thing it has not done is capture the popular imagination. As a result, I find myself wondering how free and open source software might present itself in the next thirty years to overcome this problem. Read more

What is a good IDE for R on Linux

If you have ever done some statistics, it is possible that you have encountered the language R. If you have not, I really recommend this open source programming language, which is tailored for statistics and data mining. Coming from a coding background, you might be thrown off a bit by the syntax, but hopefully you will get seduced by the speed of its vector operations. In short, try it. And to do so, what better way to start than with an IDE? R being a cross-platform language, there are a bunch of good IDEs which make data analysis in R far more pleasurable. If you are very attached to a particular editor, there are also some very good plugins to turn that editor into a fully-fledged R IDE. Read more

Create your own desktop environment

What’s the best thing about Linux? Security, stability, performance or freedom? It does a cracking job in all of those areas, but another feature we’d highlight is its modularity. As an operating system deeply influenced by Unix, GNU/Linux is designed to be easy to pull apart – and, all being well, easy to put back together again. Major parts of the system are built up from smaller components that can be omitted or replaced, which is one of the reasons why we have so many different Linux distributions. Sure, this modularity adds complexity at times. But it also adds reliability, as components are designed to work independently, and if one crashes or suffers from some kind of bug, the other parts will (ideally) keep chugging along. So you can replace Bash with another shell, or switch to an alternative SSL library, or even replace your entire init system – as we’ve seen with the migration of major distros to Systemd. Read more