Intel’s Dual-Core Xeon First Look
by Jason Clark & Ross Whitehead on December 16, 2005 12:05 AM EST - Posted in IT Computing
How does power consumption affect the bottom line?
We've illustrated each system's power consumption at different load levels. For the technical folks, those numbers matter on their own. But for the platform decision maker - the CTO, the Director of Technology, or anyone in a role that looks at the bigger picture - it's important to show how power consumption affects the bottom line. Obviously, the more power you use, the more it costs you. To illustrate this, we used a current rate of 14 cents per kWh, taken from a residential power bill in Connecticut. We then used that rate to extrapolate the cost of each platform at the various load levels.
At 40-60% load, it costs about $42 a month to run a Bensley system, while an Opteron system would cost about $25. Factor that over a year, and a Bensley system would cost $504 to run versus $300 for the Opteron. If that doesn't pique your interest, suppose that you have 40 servers in a datacenter with the same power characteristics: over a year, the Bensley systems would cost you $8,160 more to run.
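As a back-of-the-envelope check, here is a minimal sketch of that extrapolation in Python. The average wattages (roughly 411 W and 245 W) are assumptions back-calculated from the monthly dollar figures above, not measured values from the review:

```python
# Minimal sketch of the cost extrapolation above. The average wattages are
# assumptions, back-calculated from the article's $42 and $25 monthly
# figures; they are not measured values from the review.
RATE = 0.14            # $/kWh, the Connecticut residential rate cited above
HOURS_PER_MONTH = 730  # ~24 h/day * 30.4 days

def monthly_cost(avg_watts):
    """Dollar cost of running a constant load for one month."""
    kwh = avg_watts / 1000 * HOURS_PER_MONTH
    return kwh * RATE

bensley = monthly_cost(411)  # ~$42/month, matching the article
opteron = monthly_cost(245)  # ~$25/month, matching the article
fleet_delta = (bensley - opteron) * 12 * 40  # 40 servers, one year

print(f"Bensley: ${bensley:.2f}/mo, Opteron: ${opteron:.2f}/mo")
print(f"40-server yearly delta: ${fleet_delta:,.0f}")
# Prints ~$8,150 vs. the article's $8,160; the small gap is rounding
# in the assumed wattages.
```

At this rate, roughly every 60 W of sustained extra draw costs about $6 a month per server.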
Conclusion
So, you've read through the article and are waiting for us to tell you which platform is better. That depends entirely on what matters to you: performance, power, or both. If all you care about is raw performance, then Bensley is that platform. If you care most about how much power you consume, then Opteron is that platform. And if you want the best performance per Watt, then Opteron is again that platform - at least, that is what the database test results shown here indicate. Given Intel's roadmap, they are clearly developing platforms to address performance per Watt; Woodcrest, due in the second half of 2006, will be the first product focused on it. Until then, the choice is yours.
67 Comments
gjmck - Monday, December 19, 2005 - link
I'm curious why the numbers don't reflect the true difference between equivalently configured Intel vs. Opteron systems. The Dempsey processor's max TDP is 130W and the Opteron's is 95W; that difference is only 35W. The memory controller needed by Dempsey should only consume 60-80W. Using 80W, that gives a maximum total difference between two equivalently configured systems of 80 + 35 = 115W.
Yet in the max processor utilization tests, the difference was 214 Watts. So where are the extra 99 Watts being used? FBD? If so, then when Opteron uses similar memory technology, the delta will not be as great.
Gregg McKnight
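A minimal sketch of that power-budget bookkeeping; all figures come from the comment itself, and the 80 W memory-controller number is the commenter's own upper-bound assumption:

```python
# Reproduces the comment's power-budget arithmetic. All inputs are the
# comment's own figures; the 80 W MCH budget is its assumed upper bound.
dempsey_tdp = 130   # W, Intel's stated max TDP per Dempsey CPU
opteron_tdp = 95    # W, AMD's stated TDP per Opteron CPU
mch_budget = 80     # W, commenter's upper-bound guess for the MCH

cpu_delta = dempsey_tdp - opteron_tdp      # 35 W
expected_delta = cpu_delta + mch_budget    # 115 W max, per the comment
measured_delta = 214                       # W, from the review's load tests

print(f"Expected max delta: {expected_delta} W")
print(f"Unaccounted power:  {measured_delta - expected_delta} W")  # 99 W
```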
Furen - Thursday, December 22, 2005 - link
Intel's TDPs reflect "typical" power draw, while AMD's reflect worst-case power consumption, so they're not directly comparable. I very much doubt the memory controller uses anywhere close to 80W; something like 20-30W for the whole northbridge is reasonable. FB-DIMMs do use more power, but that shouldn't be more than 5-10W per DIMM. The rest is just the CPU being insanely power-inefficient.

dannybin1742 - Friday, December 16, 2005 - link
To keep anything at a constant temperature, the heat going into the system must equal the heat being taken away. So if one system uses 200W of power, first you have the cost of the 200W, then you have the cost of removing the 200W of heat that it gives off. On top of this, air conditioners are 20-25% efficient at best (if I remember correctly), so removing the heat generated would take 3-4X more energy. So in essence, you are looking at at LEAST 2X the amount of money calculated in the article (I took a year of thermodynamics in school, as an undergrad); in reality, you are probably looking at 4-6X to run the servers and remove their heat from the datacenter.

They should have looked at the Opteron 2.2GHz HE (low voltage); I'd be interested to see what power numbers those put up. Also, was Windows Server 2003 running in 64-bit, or were all the tests run in 32-bit? I just skimmed over the article. How about Linux?
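A quick sketch of that cooling-overhead argument; the 2X-6X multipliers are the commenter's estimates, not measured figures:

```python
# Illustrates the comment's point: total datacenter cost is the server's
# power cost times an overhead multiplier for cooling. The multipliers
# are the commenter's estimates, not measurements.
server_monthly_cost = {"Bensley": 42.0, "Opteron": 25.0}  # $, from the article

for overhead in (2, 4, 6):  # commenter's low / mid / high guesses
    bensley = server_monthly_cost["Bensley"] * overhead
    opteron = server_monthly_cost["Opteron"] * overhead
    print(f"{overhead}x overhead: Bensley ${bensley:.0f}/mo, "
          f"Opteron ${opteron:.0f}/mo, delta ${bensley - opteron:.0f}/mo")
```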
coldpower27 - Friday, December 16, 2005 - link
The Opteron 270 HE is the highest of the low-wattage 2-way Opterons, and it runs at 2.0GHz.
Viditor - Friday, December 16, 2005 - link
You mean 2-way dual-core...
The 250 HE is single-core at 2.4GHz...
coldpower27 - Friday, December 16, 2005 - link
Yes, I assume 2 dual-core vs. 2 dual-core.

haris - Friday, December 16, 2005 - link
One question that kept nagging me was: "How many threads were required to get the systems to each load level?" How much of a difference would it make to performance/Watt if you have to take into account that processor 1 is also handling x% more/fewer threads than processor 2?

Jason Clark - Friday, December 16, 2005 - link
That will teach me for just taking a $1,000 measurement device's reported figures :) It actually figures out the cost, which obviously was wrong. I've updated the numbers; they should be correct now. Again, sorry :)
coldpower27 - Friday, December 16, 2005 - link
Thanks a lot. :)

Biffa - Friday, December 16, 2005 - link
With over a 1GHz deficit (yes, I know) in processor speed, and with only 1MB of cache per core rather than 2MB, I think we can safely say that Intel is still clutching at straws at this level of the game. Good PR on their part (I've always admired them for that), but it's a crying shame that after all this time, this is the best they can do.