Investigating NVIDIA's BatteryBoost with MSI GT72
by Jarred Walton on October 23, 2014 9:00 AM EST

A Closer Look at Clock Speeds and Power
Wrapping things up, while we've shown that BatteryBoost can certainly improve battery life, there's still the question of what exactly NVIDIA is doing behind the scenes. We know they're playing with the maximum FPS of course, but a frame rate cap alone isn't (always) able to match what BatteryBoost can deliver. To try and shed some additional light on what's going on internally, I logged performance data while running our three BatteryBoost gaming tests. This time, however, the goal was not to fully drain the battery but rather to try and find out what's going on in terms of clock speeds and power draw at a lower level; that means the tests were shorter and there may be more variance, but the numbers are generally in agreement.
There are four tests for each game where I logged data: AC power is the baseline, then I tested DC power without BatteryBoost, with BatteryBoost and a 60FPS target, and finally with BatteryBoost and a 30FPS target. I also tested all four settings with and without VSYNC. I won't guarantee the numbers are 100% accurate, as I have to rely on a utility to report clock speeds and other items, so I won't create any potentially misleading charts; nonetheless, the results are rather interesting to discuss.
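As an aside, for anyone who wants to capture similar data on their own system, this kind of log can be approximated with a simple polling loop. The sketch below is illustrative only and assumes nvidia-smi is installed and reports these query fields for your GPU; the figures in this article came from a separate monitoring utility, not this script.

```python
# Illustrative only: a minimal polling loop that writes GPU clock, utilization,
# and power readings to a CSV file once per second. Assumes nvidia-smi is on the
# PATH and supports these query fields; this is not the utility used for the
# data discussed in the article.
import csv
import subprocess
import time

FIELDS = "clocks.gr,clocks.mem,utilization.gpu,power.draw"

def sample():
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
        text=True,
    )
    return [v.strip() for v in out.strip().split(",")]

with open("gpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s"] + FIELDS.split(","))
    start = time.time()
    while time.time() - start < 600:   # log for ten minutes
        writer.writerow([f"{time.time() - start:.1f}"] + sample())
        time.sleep(1.0)                # one sample per second
```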
First, under AC power the CPU is basically left to run free, and in most cases it will run near its maximum Turbo Boost clocks (3.2-3.3GHz); it also consumes quite a bit of power (25-35W) when VSYNC is off. The GTX 980M meanwhile is running basically full tilt (1100MHz plus or minus ~25MHz on the core thanks to GPU Boost 2.0, and 5000MHz RAM). Turning VSYNC on gives us a taste of things to come, however: average CPU clocks are typically much lower (1800-2000MHz, with spikes up to 3400MHz and lows of 800MHz) and average CPU package power is likewise substantially lower (10-30W). The GPU clocks don't change much, but GPU utilization drops from close to 100% (95-99%, depending on the game) to 32-55%. Switch to battery power and things start to get a bit interesting.
Let's discuss the three games I tested in turn, starting with Tomb Raider. The CPU clock speeds and power tend to vary substantially based on the game, and the GPU varies a bit as well though not as much as the CPU. Even without BatteryBoost, CPU clocks are often at their lowest level (800MHz), and turning on VSYNC actually resulted in higher average CPU clocks but lower average CPU power – the logging data may not be capturing fully accurate CPU clocks, though I suspect the power figures are pretty accurate. GPU clocks show some similarly odd behavior: without VSYNC the average GPU clock was 479MHz with 3200MHz GDDR5, but utilization is at 97%; with VSYNC the average GPU clocks are a bit higher (~950/3200 core/RAM) but utilization is just under 52%.
Enabling BatteryBoost with 60FPS and 30FPS targets continues to generate somewhat unexpected results. At 60FPS, the CPU is generally close to the base 800MHz, but it does average slightly higher when VSYNC is on; power draw from the CPU is pretty consistent at around 6.1-6.5W for the package. Average GPU clocks meanwhile make a bit more sense (they're slightly lower with VSYNC enabled), while average GPU utilization is slightly higher with VSYNC. Overall, however, system power use is much lower with BatteryBoost than without, which is what we'd expect from our earlier battery testing results. It looks like in Tomb Raider the GPU (plus the rest of the system except for the CPU) draws around 60-65W without BatteryBoost, and that drops to 50-55W with BatteryBoost at 60FPS. Our 30FPS BatteryBoost numbers meanwhile don't show a significant change in CPU clocks (still close to the minimum 800MHz), but with the lower FPS the CPU doesn't have to work as hard so CPU package power is now down to around 4.6-4.7W. On the GPU front, the core clocks are around 670-700MHz with close to 50% utilization, but the GDDR5 memory is now running at 1620MHz, so there are some definite power savings there. Average power draw from the GPU and system (again, minus the CPU) is around 35-40W.
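As a point of reference, the "GPU plus the rest of the system" figures are back-of-the-envelope estimates: one way to arrive at numbers like these is to take the total system draw implied by the drain rate of the 87Wh battery and subtract the logged CPU package power. The snippet below simply illustrates that arithmetic with placeholder values, not the exact logged data.

```python
# Rough estimate of non-CPU system draw (GPU, motherboard, RAM, LCD, etc.).
# All values below are illustrative placeholders, not the exact logged data.
battery_wh = 87.0        # GT72 battery capacity
runtime_min = 90.0       # hypothetical runtime for a full battery drain
cpu_package_w = 6.3      # average CPU package power reported by the logger

total_system_w = battery_wh / (runtime_min / 60.0)   # ~58W average total draw
non_cpu_w = total_system_w - cpu_package_w            # ~52W for GPU + rest
print(f"Total: {total_system_w:.1f}W, non-CPU: {non_cpu_w:.1f}W")
```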
Borderlands: The Pre-Sequel behaves quite differently on battery power. The AC results are about the same (CPU and GPU basically running as fast as they can), but now on DC power without BatteryBoost the CPU continues to run at relatively high clocks (3.0-3.4GHz), and as you'd expect power draw remains pretty high as well (20-25W). With BatteryBoost at 60FPS, enabling VSYNC actually resulted in substantially higher CPU clocks (and CPU power use – 14.6W with VSYNC compared to 11.2W without, though the test didn't last as long so there's more room for variance), but at 30FPS things start to look a lot more like Tomb Raider: the CPU runs at 800-1500MHz, with a 1.0GHz average with VSYNC and 1.125GHz average without; CPU power is 6-7W as well (slightly lower with VSYNC). As for the GPU, things aren't all that different; there's a hard cap of 3.2GHz on the GDDR5 when running off the battery, and while the 980M is frequently at that mark when striving for 60FPS, it's mostly at 1620MHz with the 30FPS setting. The GPU (and system other than CPU) draw close to 50W at 60FPS and 35W at 30FPS, while running without BatteryBoost puts things closer to 60W.
With GRID Autosport, the results on AC power and on DC without BatteryBoost are similar to the other two games, though the CPU apparently isn't working as hard as in Borderlands. On AC power the CPU uses 35W, and that drops to 23W with VSYNC; on DC without BatteryBoost the CPU is drawing 25W, or 15W with VSYNC. The GPU plus other system components meanwhile look to be drawing around 66W without BatteryBoost and 56W with VSYNC enabled. Turn on BatteryBoost and again at 60FPS we see higher CPU clocks (and higher CPU power use) when VSYNC is enabled, but we're talking about 10.7W without VSYNC and 13.7W with VSYNC, and apparently other factors can make up for the difference. The GPU and other components draw around 42W without VSYNC and 39W with VSYNC, so it balances out. Last but not least, at 30FPS the CPU package power averages ~7.3W without VSYNC and ~7.8W with VSYNC, while the GPU and remaining components use 35.7W without VSYNC and 31.8W with VSYNC.
Based on our testing of three different games, it appears BatteryBoost is most effective in games that don't hit the CPU as hard, though with caveats. Tomb Raider for example is known to be pretty easy on the CPU (i.e. a slower AMD APU would likely get close to the same frame rates as a fast Core i7 when paired with the same GPU). However, the type of calculations each game uses (including AI) mean that in some cases a game that doesn't appear to be very CPU intensive may still draw a fair amount of power from the CPU. In general, it looks like the GTX 980M under most gaming workloads will draw at least 25-30W of power (and another 5W or so for the motherboard, RAM, LCD, etc.), which means the lower the CPU load the better. In some cases it should be possible to get the entire GT72 notebook close to 35W while gaming, which would mean the 87Wh battery might last up to nearly 2.5 hours; more realistically, I'd expect most games will pull 40-45W even at the 30FPS target with BatteryBoost, which equates to 1.9 to 2.2 hours at most. Obviously if you have a game that's more taxing (e.g. Metro: Last Light), you'll get even less battery life.
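Those runtime estimates are simply the battery capacity divided by the estimated average draw; here's a quick sanity check of the arithmetic, using the rounded figures from above rather than any new measurements.

```python
# Estimated runtime (hours) = battery capacity (Wh) / average system draw (W).
# Draw figures are the rounded estimates discussed above, not new measurements.
battery_wh = 87.0
for label, draw_w in [("best case", 35.0), ("typical low", 40.0), ("typical high", 45.0)]:
    hours = battery_wh / draw_w
    print(f"{label}: {draw_w:.0f}W -> {hours:.1f} hours")   # 2.5, 2.2, 1.9 hours
```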
With that said, one other interesting piece of information is that in our Light battery test (Internet surfing) using the same Balanced power profile, with the GTX 980M enabled the GT72 manages around 220 minutes of mobility. (Our Heavy battery test drops it to 165 minutes, if you're wondering.) While two hours of gaming isn't going to be enough for a LAN party, it's still quite impressive to see the GTX 980M effectively drawing about as much power as a GT 750M when BatteryBoost is enabled – though in most cases it's also providing roughly the same level of performance as the GT 750M (under AC power).
26 Comments
WinterCharm - Thursday, October 23, 2014 - link
So it's essentially V-sync to 30 fps :P

III-V - Thursday, October 23, 2014 - link
It's a bit more than that. Read the article.

spencer_richter - Tuesday, November 25, 2014 - link
It's not as good as the top laptops on the market. For example the ASUS ROG G750JM-DS71 is a lot better for gaming.

nathanddrews - Thursday, October 23, 2014 - link
With regular, old, dumb v-sync, additional frames are still rendered by the GPU, but select frames are only delivered from the frame buffer when ready to be synchronized to the monitor - it's not very efficient. BatteryBoost attempts to render only 30fps (or whatever the target is) to save power, and appears to succeed... somewhat.

looncraz - Thursday, October 23, 2014 - link
Not on my system for the games I've tried. VSync reduces my GPU usage while reducing frame rate. Then again, I've only tried a few games...

But my own rendering engine accumulates (and even merges) changes until the next rendering time window, as directed by either the screen refresh or processing capability (i.e. the render control thread doesn't initiate a frame render until the monitor can show it if VSync is enabled, or immediately once the last frame is completed if it is disabled). There just isn't a logical reason to do it any other way.
nathanddrews - Thursday, October 23, 2014 - link
I wonder if power usage is at all related to the "pre-render max frames" setting?

OrphanageExplosion - Thursday, October 23, 2014 - link
Assuming there are battery boost profiles for each game, couldn't it simply be dialling down quality settings where you're not likely to be able to tell the difference between, say, high quality shadows and normal quality shadows?

JarredWalton - Thursday, October 23, 2014 - link
Note that I did not run with the "recommended" BatteryBoost settings for the various games; I ran with specific settings and kept those constant. GeForce Experience does have suggestions that sometimes match my settings... and sometimes not. :-)

inighthawki - Thursday, October 23, 2014 - link
By default, at least on Windows, this is not true. When vsync is enabled, frames are queued to be presented at a particular interval. They are never discarded. This queue has a max height - typically 3 frames, but normally configurable by the game. After three frames, any present calls by the game will be blocked on the thread until a VBlank occurs and a frame is consumed.

It is possible to get the behavior you're referring to if the target operating system supports it, and the game uses triple buffering. In this case, you can have a front buffer being displayed while the other two back buffers are used in a ping-pong fashion. At the vblank, the OS can choose to use the most recently fully rendered frame. Windows chooses not to do this for the exact power reasons described above. The advantage of doing it this way is you reduce a minor amount of latency in exchange for keeping your GPU pegged at 100% utilization.
nathanddrews - Friday, October 24, 2014 - link
Since I have v-sync usually set to 96Hz, 120Hz, or 144Hz, I guess I never realize the power-saving benefits.