ATI's MultiVPU solution: don't get caught in the Crossfire?

Filed under: Hardware

We'll have to give it to ATI for keeping our gaze fixed on their new product for so long while feeding us all sorts of incomplete information about prospective performance and features. Now that the curtain has dropped on ATI's multi graphics processor solution, we can only wonder what they have been doing for the past six months. Initially ATI commented that their solution would be a flexible and elegant one and would, for example, work on any motherboard with two PCIe slots, regardless of configuration. We would also be able to combine any two ATI PCIe graphics cards and get a boost in performance.

ATI was also quick to comment on NVIDIA's solution being a cumbersome one, requiring a special SLI motherboard, two identical graphics cards and, last but not least, an internal SLI connector to establish communication between the two cards. Looking at ATI's Crossfire solution, they have managed to eliminate none of these "drawbacks", as their solution has about the same requirements as NVIDIA's. You will also need a new motherboard sporting an ATI chipset with Crossfire support, a "master" graphics card that will work with any second ATI PCIe graphics card and, last but not least, an external dongle to let the two cards talk to each other.

So we are left scratching our heads: exactly how is this solution more elegant and flexible than NVIDIA's? At least NVIDIA's solution works with any 6800 or 6600 series graphics card, whereas the Crossfire solution requires the purchase of a $500+ master card; so much for flexibility. And what's with that external dongle? An internal connector that establishes communication, keeps the bracket free of cable clutter and leaves room for a second DVI or S-Video output is a far more elegant solution. By the looks of it, the affordable SLI alternative that Crossfire was pitched as a few months ago has turned into an expensive and not at all flexible solution that offers nothing substantial over NVIDIA's. For the time being we'd suggest you stick with NVIDIA's solution and don't get caught in the crossfire.

Sander Sassen.