People still repeat things they have heard about computers without really understanding them. One such myth concerns 32-bit architectures and their bundled application base: the claim is that a home desktop performs better on a 32-bit OS running native 32-bit applications than on their 64-bit counterparts, and that the user should therefore stick with 32-bit. That holds only as long as this mythical user never goes beyond web browsers, word processors and media players. The moment he or she jumps into database work, media encoding or some other CPU-intensive number-crunching task, the power of 64-bit shows; it practically runs circles around the 32-bit setup.
So, what's holding 64-bit back from replacing everything 32-bit?
rest here