Understanding AMD's Roadmap & New Direction
by Anand Lal Shimpi on February 2, 2012 6:16 PM EST- Posted in
- CPUs
- AMD
- Trade Shows
- AMD FAD 2012
ARM & The Future
Thankfully, Rory isn't HPing the company. AMD will continue to build its own x86 CPUs and GCN (and future) GPUs. The difference is that AMD will now consider, where it makes sense, using other architectures. AMD didn't come out and say it, but it's clear that the other ISA under consideration is designed by ARM. In the markets where it makes sense, AMD might deliver an ARM based solution. In others it may deliver an x86 based solution. The choice is up to the market and customer, and AMD is willing to provide either.
What's most interesting is that AMD was very clear about not wanting to be in the smartphone market. It believes, at least today, that the smartphone SoC market is too low margin to make financial sense. With smartphone SoCs selling for under $20 and given how hard it has been for Intel and NVIDIA to break into that market, I don't blame AMD for wanting to sit this one out. However, smartphones have been a huge success for ARM. If AMD is to offer ARM based SoCs coupled with their own CPU/GPU IP in other markets, it's unclear what the reception will be. The flexibility is definitely appreciated and it's a far more defensible position than saying that all future products have to use x86, but simply embracing ARM isn't a guarantee for success.
Rory Read presented a vision of the future where a large, vertically integrated device manufacturer may want to deliver custom silicon for everything from tablets to notebooks to TVs. AMD's goal is to be able to provide silicon to companies like this, while differentiating based on its own internal IP (x86 CPUs, GPU cores). One current example would be Microsoft's Xbox 360: AMD designed much of the silicon for that console, although it uses third-party CPU IP. In other words, should a customer want an ARM based solution mated with an AMD GPU, they could have one. If a customer wanted a strange x86/ARM APU, that would be a possibility as well.
AMD did a good job outlining that it would be more agile and flexible; however, it didn't outline what specific products we'd see that implement this new architecture agnostic mentality. I suspect AMD's lack of specific examples is a result of the simple fact that the new management team has only been in place for a handful of months. It will take a while to develop outlines for the first products and a clear roadmap going forward. Until then, it's all about executing on the APU, GPU and server CPU fronts.
84 Comments
spidey81 - Thursday, February 2, 2012 - link
With AMD's focus going the direction of mobile/AIO or server parts, will the consumer market ever see anything directed at the desktop enthusiast market? I guess I'm still hopeful to see a trickle down from the server market, or desktop innovation trickling down to the mobile sector as has happened in the past. Maybe it's just time to jump ship to Intel for my next gaming/OC rig.

arjuna1 - Thursday, February 2, 2012 - link
You ninja'd my post, but, exactly the same feeling. But you know what? If they dare to give me the middle finger, I will give them the middle finger; no problem in making my next build Intel/NVIDIA.

spidey81 - Thursday, February 2, 2012 - link
I just upgraded from a PII X3 720 to an FX8120. It's frustrating to know that even with it clocked at 4.5GHz I'm still not going to get the performance I would have with a 2500K. I've never built with anything other than AMD and really don't plan on changing that. However, it's getting increasingly difficult to support them.

just4U - Thursday, February 2, 2012 - link
Hey Spidey.. you're not missing too much. I've built several i5/i7 setups and use one everyday. But I've also picked up an FX6100 and it's pretty good too. I don't mind switching back and forth, and while there may be a slowdown in some games.. some apps.. I don't notice it unless I am actually looking at the numbers. They all seem fast overall.

Sabresiberian - Thursday, February 2, 2012 - link
My experience is different. I have built two computers, one on the i7 920 and one on the Phenom II 955. The difference is clear and significant playing World of Warcraft, and any other MMORPG. My experience is mirrored by AnandTech and Tom's Hardware benchmarks.
Now, if you aren't a person who actually uses all the performance he can get, the Phenom II is fine, but even having been an AMD fan for years, I won't go back and cut my nose off to spite my face, as they say.
AMD has chosen a different path than I would like for them to have, but I'm not going to fault them for it. I'm disappointed as an enthusiast builder, but I certainly recognize there is a far wider market than CPUs for people like me. However, it also means they no longer are interested in supplying what I want, so we must part ways.
;)
Spoelie - Friday, February 3, 2012 - link
Have been an AMD/ATI loyalist for a long time, and have only built AMD/ATI setups *for my personal use*. But I always only upgraded to a product that was either very competitive (Tbird, A64, X2, PhII early on) or dirt cheap and very overclockable (Tbred, Barton for example), holding out the times AMD wasn't very competitive (kept my X2 pretty long, skipped PhI). The thing is that my 3-year-old DDR2 Deneb@3.3GHz has never felt inadequate at all, helped by an SSD and yearly GPU upgrades.
When the time comes, however, I'll have no qualms switching to Intel's latest and greatest, in the same spirit as Sabresiberian.
wumpus - Wednesday, June 27, 2012 - link
>The difference is clear and significant playing World of Warcraft, and any other MMORPG.

WoW was released in something like 2007. I very much doubt a modern CPU would notice the difference (I used to play Dungeons and Dragons Online (2008) with a 2GHz Sempy, and it ran just fine). Methinks you have different GPUs and that might just make the difference (WoW used to be famous for not stressing the GPU; I doubt they have changed it).
Still, as someone who has always liked AMD more than Intel, I suspect I will wind up buying more Intel processors in the future (the fact that every single Intel processor I've bought has been deliberately crippled annoys the bejesus out of me).
Face it, the desktop is "dying" (read becoming a mature tech that doesn't obsolete itself every Thursday). Don't expect every high tech company to want to swoop down and grab a piece of the pie anymore. Intel will have a hard enough time with every "tock" competing with the previous "tick".
GotThumbs - Friday, February 3, 2012 - link
I think one of the key factors you're leaving out... is the cost/price difference. I've built all AMD systems since my first Pentium II build, and been quite happy with the systems and the performance I've gotten, while still having some cash left in my pockets. I'm even looking at down-sizing my system to an APU on an ITX board with an SSD. Today's CPUs meet probably 95% of the market's needs. It's only a select few who need hard-core performance on an hourly basis and can justify spending huge amounts of money to have a high-powered system. Higher CPU speed is not the only focus in today's market. Battery life and user experience are what matter. If you can get the same experience with a lower-speed processor... then why pay more? Bragging rights only take you so far.

I think AMD has matured and is no longer concerned with competing with Intel on having the biggest and baddest CPUs. Most general consumers barely use 1/3 the capacity of their systems.
bill4 - Thursday, February 2, 2012 - link
It's funny how people hate AMD so much they automatically push this "AMD is getting out of the high end!" agenda in posts all over the internet. It's not commentary, it's your hope.

NVIDIA is the only one getting out of the high end, since they don't even have a competitor to Tahiti.
AMD's Bulldozer was definitely a play at the high end; it's a huge, ambitious chip, it just sucks.
Get your head out of your ass, NVIDIA fanboys. AMD is not going anywhere, no matter how hard you wish it.
arjuna1 - Thursday, February 2, 2012 - link
Hey bill, I've been building AMD/ATI since the K6/9800, so why don't you just stfu and learn to read before opening your mouth.