
Unhealthy Partnerships Plague the Graphics Industry

Filed under
Hardware

If I were to buy a graphics card today, I would have little choice but to go with either ATI or NVIDIA. To break down my buying decision, I would probably consider the card's features, the novel (and in some cases pointless) concept of future-proofing, price, and performance in the games I want to play. Everything sounds clear-cut, doesn't it? I hate to be the one to break the news to you, but picking a graphics card is not an easy task these days. I could probably list a few reasons why, but chances are you are already aware of them. Therefore, I am going to focus this column on the special relationships between game developers and graphics card makers.

The thing with today's graphics cards from both manufacturers is that they perform quite well in the latest games, so going with either brand isn't a tough choice anymore. For many of you, either NVIDIA or ATI will do just fine, but for those of us who want to play games like Doom III or Half-Life 2, the companies have just made our lives more complicated than they already are.

For argument's sake, say you are building a computer. You know you will play Doom III extensively, and you may even consider Half-Life 2 in the future. To make up your mind, you will probably visit a few online publications to get a feel for the performance differences between ATI and NVIDIA cards in those titles. The problem is that when you look at the various benchmarks, the performance crown can go either way. Quite logically, a faster ATI card will outperform a slower NVIDIA card and vice versa, but things take a turn for the worse when you compare performance between Doom III and Half-Life 2. Shockingly, the numbers for the two cards may be completely upside down. In certain scenarios, a mid-range NVIDIA card will beat ATI's high-end card in Doom III. The same thing happens in Half-Life 2; the only difference is that the winner is ATI, and NVIDIA takes the back seat.

As long as the competition is healthy and consumers benefit from it, all of this is perfectly fine. But when you and I are hassled by these wacky performance results, for which the card manufacturers and game developers share the blame, it gets quite irritating. After all, performance testing is supposed to help us make up our minds, not confuse us with upside-down results in certain titles.

Now, let's say your goal is to play both Half-Life 2 and Doom III. Which card should you go for?

