The RV770 Story: Documenting ATI's Road to Success
by Anand Lal Shimpi on December 2, 2008 12:00 AM EST
The Bet, Would NVIDIA Take It?
In the spring of 2005, ATI had the R480 on the market (Radeon X850 series), a 130nm chip that was a mild improvement over the R420, another 130nm chip (Radeon X800 series). The R420 to R480 transition is an important one because it’s these sorts of trends that NVIDIA would look at to predict ATI’s future actions.
ATI was still trying to work through execution on the R520, which became the Radeon X1800, but as you may remember, that part was delayed. ATI was having a problem with the chip at the time, with a particular piece of IP. The R520 delay ended up causing a ripple that affected everything in the pipeline, including the R600, which was itself delayed for other reasons as well.
When ATI looked at the R520 in particular, it saw a big chip that didn’t deliver good bang for the buck, so ATI made an unexpected architectural change going from the R520 to the R580: it broke the 1:1:1:1 ratio.
The R520 had a 1:1:1:1 ratio of ALUs:texture units:color units:z units; in the R580, ATI varied this relationship to 3:1:1:1, increasing arithmetic power without increasing texture/memory capabilities. ATI had noticed that the shading complexity of applications was going up while bandwidth requirements weren’t, which justified the architectural shift.
This made the R520 to R580 transition a much larger one than anyone would’ve expected, including NVIDIA. While the Radeon X1800 wasn’t really competitive (partially due to its delay, but also due to how good G70 was), the Radeon X1900 put ATI on top for a while. It was an unexpected move that undoubtedly ruffled feathers at NVIDIA. Used to being on top, NVIDIA doesn’t exactly like it when ATI takes its place.
Inside ATI, Carrell made a bet. He bet that NVIDIA would underestimate R580, that it would look at what ATI did with R480 and expect R580 to be a similarly mild update. He bet that NVIDIA would be surprised by R580, and that the chip to follow G70 would be huge; NVIDIA wouldn’t want to lose again, so G80 would be a monster.
ATI had hoped to ship the R520 in early summer 2005; it ended up shipping in October, almost six months later, and as I already mentioned, it delayed the whole stack. The negative ripple effect made it all the way into the R600 family. ATI speculated that NVIDIA would design its next part (G71, the 7900 GTX) to be around 20% faster than R520, not expecting much out of R580.
A comparison of die sizes for ATI and NVIDIA GPUs over the years; these boxes are to scale. Red is ATI, green is NVIDIA.
ATI was planning the R600 at the time and knew it was going to be big; it started at 18mm x 18mm, then grew to 19, then 20. Engineers kept asking Carrell, “do you think their chip is going to be bigger than this?” “Definitely! They aren’t going to lose; after the 580 they aren’t going to lose.” Whether or not G80’s size and power was a direct result of ATI getting too good with R580 is up for debate. I’m sure NVIDIA will argue that it was by design and had nothing to do with ATI, and obviously we know where ATI stands, but the fact of the matter is that Carrell’s prediction was correct: the next generation after G70 was going to be a huge chip.
If ATI was responsible, even in part, for NVIDIA’s G80 (GeForce 8800 GTX) being as good as it was, then ATI ensured its own demise. Not only was G80 good, but R600 was late, very late. Still impacted by the R520 delay, R600 had a serious problem with its AA resolve hardware that took a while to work through, and it ended up being a part that wasn’t very competitive. With G80 so strong and R600 without working AA resolve hardware, ATI had an even tougher time competing. ATI had lost the halo; its biggest chip ever couldn’t compete with NVIDIA’s big chip, and for the next year ATI’s revenues and market share would suffer. While this was going on, Carrell was still trying to convince everyone working on the RV770 that they were doing the right thing, that winning the halo didn’t matter...just as ATI was suffering from not winning the halo. He must’ve sounded like a lunatic at the time.
When Carrell and crew were spec’ing the RV770, the prediction was that not only would it be good against similarly sized chips, but it would also be competitive because NVIDIA would still be in overshoot mode after G80. Carrell believed that whatever followed G80 would be huge, and that RV770 would have an advantage because NVIDIA would have to charge a lot for that chip.
Carrell and the rest of ATI were in for the surprise of their lives...
116 Comments
MrSpadge - Saturday, December 6, 2008 - link
Exactly what I was thinking! That's why I got an 8500LE back then, when the GeForce 4 was not in (public) sight yet.
FireSnake - Wednesday, December 3, 2008 - link
... which one is Anand (in the picture at the beginning of the article)? I always wondered what he looks like ... I guess the one on the right.
3DoubleD - Wednesday, December 3, 2008 - link
I've had AnandTech as my home page for 5 years and I've read almost every article since (and even some of the older ones). This is by far one of your greatest works! Thanks
hellstrider - Wednesday, December 3, 2008 - link
Kudos to Anand for such a great article, extremely insightful. I may even go out and purchase AMD stock now :)
I love AMD even when it’s on the bottom. I own a 780G + X2 + HD 4850, in hopes that Deneb (or AM3 processors for that matter) will come in time to repeat the success of the RV770 launch, at which point I will upgrade my obsolete X2 and have a sweet midrange machine.
My only concern is that NVIDIA is looking at all this, smirking, and planning an onslaught with the 55nm refresh. There is a very “disturbing” article at Xbitlabs saying that NVIDIA is stockpiling 55nm GT200 parts; that seems like something they would do: start selling those soon and undercut the 4800 series badly.
I’m just a concerned HD 4850 owner and I don’t want to see my card made obsolete within a couple of months. I don’t really see AMD having an answer to the 55nm GT200 in such a short period of time?!?!
Any thoughts?
Goty - Wednesday, December 3, 2008 - link
I don't think you'll have to worry too badly about the 55nm G200s. NVIDIA won't drop prices much, if at all; they're already smarting from the price drops enacted after the RV770 launch. There's also the fact that the 4850 isn't in the same market space as any of the G200 cards, so they're not really competitive anyhow.
ltcommanderdata - Wednesday, December 3, 2008 - link
I always imagined designing GPUs would be very stressful given you're trying to guess things years in advance, but this inside look at how things are done was very informative.
On GDDR5, it's interesting to read that ATI was pushing so hard for this technology and felt it was their only hope for the RV770. What about GDDR4? I thought ATI was a big supporter of it too and was the first to implement it. I'm pretty sure Samsung announced GDDR4 that could run at 3.2Gbit/s in 2006, which isn't far from the 3.6Gbit/s GDDR5 used in the 4870, and 4Gbit/s GDDR4 was available in 2007. I guess there are still power savings to be had from GDDR5, but performance-wise I don't think it would have been a huge loss if GDDR5 had been delayed and ATI had had to stick with GDDR4.
And another interesting point in your article was definitely the fate of the 4850. You report that ATI felt the 4870 was perfectly specced and wasn't changed. I guess that means they were always targeting the 750MHz core frequency it launched with. Yet ATI was originally targeting a 500MHz clock for the 4850. With the 4870 clocked 50% faster, it should have been obvious to anyone just looking at the clock speeds that there would be a huge performance gap between the 4850 and 4870. I believe the X1800XL and X1800XT had a similarly large performance gap. Thankfully, Dave Baumann convinced them to clock the 4850 up to a more reasonable 625MHz core.
One thing that I feel was missing from the article was how the AMD acquisition affected the design of the RV770. Perhaps there wasn't much change, or the design was already set so AMD couldn't have changed things even if they wanted to, but they must have had an opinion. AMD was probably nervous that they bought ATI at its height, when the R580 was out and on top, only for the R600 to come out and underperform once the acquisition closed. It would be interesting to know what AMD's initial opinion of ATI's small-die, non-top-tier strategy was, although it now seems more consistent with AMD's CPU strategy, since they aren't targeting the high end there anymore either.
hooflung - Wednesday, December 3, 2008 - link
The final frontier, market share-wise, is to steal a major vendor like eVGA. If they can get an eVGA, BFG, or XFX to just sell boards with their warranties, AMD would be really dominant.
JonnyDough - Wednesday, December 3, 2008 - link
The best thing I've ever read on a tech site. This is why you're better than THG. Only one typo! It was a "to" when it should have been a "too."
Chalk one up for the red team. This makes my appreciation for AMD rise even more. Anyone willing to disclose internal perspectives about the market like this is a team with less secrecy, and one that I will support with my hard-earned cash. So many companies could stand to take a lesson from this (e.g., Apple, MS).
Keep articles like this coming, and I'll keep coming back for more.
Sincerely,
~Ryan
epyon96 - Wednesday, December 3, 2008 - link
I have been an avid reader of this site for close to 8 years. I used to read almost every CPU, GPU, and novelty gadget article page to page. But over the years my patience has gotten much lower, and I realize I get just as much enjoyment and information from just reading the first page and last page and skimming a few benchmarks.
However, this is the first article in a while that I read all the way through, and I thoroughly enjoyed it. These little back stories with a human element, set around one of the most interesting recent launches, provide a refreshing change from boring benchmark-oriented articles.
I hope to find an article of a similar nature on Nehalem and other Intel launches.
GFC - Wednesday, December 3, 2008 - link
Wow, all I can say is that I loved this review. It was really enjoyable to read, and I must give my thanks to Anandtech and Carrell!