Matrox to Use AMD GPUs in Their Next Generation Multi-Display Graphics Cards
by Jarred Walton on September 3, 2014 8:20 PM EST

If you go back far enough in the computer industry, there have been many successful video card companies. Back before the whole 3D craze kicked off, some of the fastest 2D video cards came courtesy of Matrox, and while they made some attempts at producing compelling 3D graphics cards, they were never able to grab the performance crown from NVIDIA or ATI. Their last real attempt at the 3D graphics market came in 2002 with the Parhelia-512, and as was the case with previous efforts, it ended up falling short. Interestingly, the Parhelia-512 supported "surround gaming" long before AMD's Eyefinity, and that may have opened the gates for what would become Matrox's core focus over the next decade: multi-display video cards.
Since 2002 there haven't been many reviews of Matrox cards, as their focus shifted to industries that need not just two or three but potentially a dozen or more displays all running from a single system. Their last graphics card update came in 2009, and since then their top product has been the M9188, a single card capable of driving eight DisplayPort or DVI connections, with the option of using two cards to drive 16 displays. Who needs that many displays? The financial and security markets are two easy examples, as both have use cases where six or more displays is "reasonable", and digital signage is another category where Matrox can provide useful technology. These are all professional markets, and the M9188 is priced accordingly ($1500+). But if you're looking to build a system with good 3D performance, Matrox hasn't really been relevant, as their cards focus almost exclusively on 2D these days.
That may be changing given today's announcement: Matrox will be switching to AMD-designed GPUs for their next generation of multi-display products. These will continue to support Matrox's PowerDesk desktop management software, but what's not clear is whether Matrox will be doing much in the way of customized hardware. The announcement states that "key features of the selected AMD GPU include 28nm technology with 1.5 billion transistors; DirectX 11.2, OpenGL 4.4 and OpenCL 1.2 compatibility; shader model 5.0; PCI Express 3.0 and 128-bit memory interface."
From that we can surmise that Matrox will be using a variant of the Cape Verde GCN core, one of AMD's lower performance GCN parts. In fact, Matrox may actually be using AMD's FirePro W600 cards, only with custom Matrox-developed software. This would also mean a maximum of six display outputs per graphics card (compared to eight on the M9188), but AMD can already run up to six GPUs in a system with the appropriate motherboard, meaning up to 36 displays off a single system is theoretically possible.
The hardware is of course only part of the equation, and Matrox's PowerDesk software is something that benefits many businesses and professionals. Matrox notes that "critical productivity-enhancing features available with Matrox PowerDesk software will continue to be supported on the next line of Matrox graphics cards designed with AMD GPUs." These features include the ability to configure and manage multi-display setups, which can get tricky once you move past two or three displays. PowerDesk has tools to configure stretching, cloning, pivoting, bezel management, and other settings that are important in a professional multi-display configuration.
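For readers curious what software like PowerDesk is working with under the hood, here's a minimal sketch using the standard Win32 EnumDisplayMonitors API (an illustration only; Matrox hasn't published details of PowerDesk's internals) that enumerates attached displays and prints their desktop geometry, the raw data on which features like stretching and bezel management are built:

```c
/* Minimal sketch: enumerate attached displays via the standard Win32 API.
 * This is not Matrox's PowerDesk code; it only illustrates the per-monitor
 * geometry data that any multi-display management tool starts from.
 * Build (MSVC): cl enum_displays.c user32.lib
 */
#include <windows.h>
#include <stdio.h>

static BOOL CALLBACK MonitorEnumProc(HMONITOR hMonitor, HDC hdc,
                                     LPRECT rc, LPARAM lParam)
{
    MONITORINFOEXA mi;
    int *count = (int *)lParam;

    mi.cbSize = sizeof(mi);
    if (GetMonitorInfoA(hMonitor, (LPMONITORINFO)&mi)) {
        /* rcMonitor gives each display's position and size on the
         * virtual desktop, e.g. a second display starting at (1920,0). */
        printf("Display %d (%s): %ldx%ld at (%ld,%ld)%s\n",
               ++*count, mi.szDevice,
               mi.rcMonitor.right - mi.rcMonitor.left,
               mi.rcMonitor.bottom - mi.rcMonitor.top,
               mi.rcMonitor.left, mi.rcMonitor.top,
               (mi.dwFlags & MONITORINFOF_PRIMARY) ? " [primary]" : "");
    }
    return TRUE; /* continue enumeration */
}

int main(void)
{
    int count = 0;
    EnumDisplayMonitors(NULL, NULL, MonitorEnumProc, (LPARAM)&count);
    printf("%d display(s) attached.\n", count);
    return 0;
}
```

On an M9188-class setup this would report eight entries; a management tool then rearranges those rectangles to implement stretching, cloning, and bezel compensation.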
There are plenty of upsides to this announcement. For one, it allows Matrox to reallocate resources that are currently going into hardware development and instead focus on their core competency, which at this point is multi-display solutions. PowerDesk is well regarded in their target market, and this will allow Matrox to continue to improve the platform without trying to design their own hardware. AMD benefits as they're able to partner with Matrox and potentially sell their GPUs at higher "professional" prices, and they may also increase their share of digital signage and other multi-display markets.
And of course the customers that purchase the cards benefit as they get to move to a modern platform with support for all the latest DirectX, OpenGL, and OpenCL libraries. Long-term, this also opens the door for Matrox to offer substantially higher performance 3D solutions from AMD for customers that need such features. Overall, this announcement isn't likely to affect most computer users, but it's good to see Matrox still hanging around after several decades in the computer graphics industry, something many of their competitors from the 90s didn't manage.
Source: Matrox PR
Comments
SeannyB - Wednesday, September 3, 2014
Now there's a name that evokes some nostalgia. A friend of mine had one of those Matrox Parhelia triple-head setups, which was pretty novel in the days before ubiquitous 27" 1440p displays. And also the 64-bit color rendering was a sight to see, at least in Matrox's own tech demos. Of course the performance wasn't all that, but image quality was Matrox's emphasis. I myself owned a G400 Max (1999) and I remember that being the case as well: mediocre performance, fantastic image quality.

SeannyB - Wednesday, September 3, 2014
Or no, it was 10-bit-per-channel color, now that I'm skimming Anand's old review.

B3an - Thursday, September 4, 2014
I always thought that one of the main features for Matrox was better image and colour output compared to AMD or Nvidia (like higher bit depth and more accurate colour). This article doesn't mention anything about that. So how does this affect colour now that they will use AMD GPUs?

Senti - Thursday, September 4, 2014
Likely all AMD cards are capable of 10-bit color, so there should be no concerns for this area. "Image quality" in digital days is just bit-exactness (especially in 2D), so should be no problems here either.

I hope that Matrox could give AMD some push to improve their quite pathetic workstation drivers. It's mostly bugs that annoy people today, not features or speed of GPUs.
silverblue - Thursday, September 4, 2014
They weren't bad (as you said, excellent image quality, and two RAMDACs), it's just the price hurt. A friend had the G400 and it outperformed my Savage4 Pro quite nicely, which certainly helped in Unreal Tournament.

The Parhelia... eek. Well, the potential was there, but underperforming most of the Ti-4xxx line wasn't exactly impressive. Drivers did improve things over time, but the card was too expensive to compete.
ddriver - Thursday, September 4, 2014
Is there any nostalgia for the abysmal 3D performance of an insanely expensive product?

LemmingOverlord - Wednesday, September 3, 2014
Matrox was tremendous when it came to 2D and image quality, never having actually broken into the 3D market proper. Their biggest asset was their software, rather than their hardware. Most corporate PCs came with inexpensive Matrox 3D cards, which were good for basic computing but rubbish at any type of 3D.

I really don't see how a company like Matrox remains competitive, tho'. By becoming an AMD vendor they could just be signalling they've reached the end.
ToniCipriani - Wednesday, September 3, 2014
It's kinda sad, actually. I still remember the MGA Millenium and Mystique cards. Having one of those back then was like luxury.

caseyse - Thursday, September 4, 2014
When I purchased my MGA Millenium with its 4MB VRAM memory expansion board (6MB RAM total), it was the top performer at the time. A couple generations prior to that card, I was using their blazing fast Mach32 card. I remember the scandal Matrox found itself in when it was discovered they had written a routine in their firmware to recognize the leading graphics card benchmark at the time in order to produce favorable results.

Creig - Thursday, September 4, 2014
Pretty sure Nvidia had them beat: http://www.kickassgear.com/downloads/3dmark03_audi...