Clevo P750ZM: GTX 980M Overclocking Investigated
by Jarred Walton on March 20, 2015 10:00 AM EST

Clevo P750ZM OC Test Setup
Armed with both a fully unlocked VBIOS and the 344.75 NVIDIA drivers, it’s time to investigate overclocking. That means we need some overclocking software, so we checked out MSI Afterburner 4.1.0 and NVIDIA Inspector 1.9.7.3. Both offer the ability to change GPU core/RAM clocks, but NVIDIA Inspector in this case allowed us to change the GPU voltage as well, which made it the winner for our testing.
Both pieces of software work with offsets, so you’re not directly changing the clock speeds but rather increasing (or potentially decreasing) the starting point. The GTX 980M by default runs at 1038MHz plus whatever Boost clock the hardware can “safely” reach; the GDDR5 by default runs at ~5000MHz. At idle, the 980M has a 135MHz core clock and 650MHz GDDR5 clock, and it’s important to note that overclocking did not change these idle clocks.
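The offset model described above can be sketched in a few lines. This is an illustrative calculation using the clocks quoted in the article, not an actual Afterburner or NVIDIA Inspector API:

```python
# Sketch of how offset-based overclocking tools compute the applied clocks.
# The base clocks are from the article; the function name is illustrative.

BASE_CORE_MHZ = 1038   # GTX 980M default core clock (before Boost)
BASE_MEM_MHZ = 5000    # effective GDDR5 data rate (~5000MHz)

def applied_clock(base_mhz, offset_mhz):
    """An offset shifts the starting point; Boost then adds headroom on top."""
    return base_mhz + offset_mhz

# e.g. the article's +135MHz core / +250MHz memory setting:
print(applied_clock(BASE_CORE_MHZ, 135))  # 1173
print(applied_clock(BASE_MEM_MHZ, 250))   # 5250
```

Note that because only the starting point shifts, the actual running clock still depends on how much Boost headroom the hardware finds, which is why stability varies between individual GPUs.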
It’s worth noting that besides shipping with a Prema modded BIOS/VBIOS, Eurocom also uses IC Diamond 7 Carat thermal paste for their P5 Pro (P750ZM) notebook. This comes installed from Eurocom and you don’t (directly) pay extra for the improved cooling capability. If you purchase a P750ZM from a different vendor, you will likely want to either pay the system integrator to use a better thermal paste or else plan on doing the upgrade on your own. It might be possible to improve the cooling by lapping the heatsinks or using a different thermal paste, but we didn’t investigate this.
For our benchmarks, we ended up testing five different settings: stock clocks, +135MHz core and +250MHz RAM, and +250MHz core and +400MHz RAM with a +50mV voltage bump as well. The +135MHz clock is what you’ll be limited to with a “normal” VBIOS while the +250MHz result was about where our GPU capped out before becoming unstable. In fact, for heavier loads, we found instances where +250/+400 would crash the NVIDIA drivers (or even crash the system), so for our full stress testing (see page four) we backed off to +225/+350. We also set the fan speed to 100% (Fn+1 is the keyboard shortcut on the P750ZM) for our maximum 250/400 and 225/350 overclocks (though we also tested 225/350 without bumping up the fan speed).
As far as our test software, we selected seven games, with several coming from our regular mobile gaming benchmarks along with a few recent releases. Instead of going with the native 4K (3840x2160) resolution of the laptop panel, we opted for a more moderate resolution of 2560x1440. This allows us to ratchet up the quality settings more than we’d be able to at 4K, which means the CPU has to work harder and the GPU should still have plenty of work going on as well.
Finally, we also conducted stress testing where we attempted to “max out” the total system power draw. To accomplish this we ran the second pass of the x264 HD 5.0 test on four threads (cores 4-7) with Tomb Raider running at 2560x1440 Ultimate settings on four threads (cores 0-3). We found that using more cores/threads for x264 ended up reducing the total system load as the game would end up running slower due to fighting for resources. Note that certain synthetic tests (e.g. FurMark) will throttle the clock speeds automatically, which is why we opted for real-world applications.
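The core split used in the stress test maps to CPU affinity bitmasks, the kind accepted by the Windows `start /affinity` command or Task Manager. A minimal sketch (the helper function is illustrative, not part of any tool used in the article):

```python
# Sketch: deriving the affinity bitmasks for pinning x264 to cores 4-7
# and the game to cores 0-3, as described in the stress-test setup.

def affinity_mask(cores):
    """Build a bitmask with one bit set per logical core index."""
    mask = 0
    for c in cores:
        mask |= 1 << c
    return mask

game_mask = affinity_mask(range(0, 4))  # cores 0-3 -> 0x0F
x264_mask = affinity_mask(range(4, 8))  # cores 4-7 -> 0xF0

print(hex(game_mask), hex(x264_mask))
```

With masks in hand, something like `start /affinity F0 x264.exe ...` would launch the encoder on cores 4-7 only, keeping the game's cores free.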
34 Comments
zodiacfml - Saturday, March 21, 2015 - link
Nah, this is in no way advisable, as longevity and reliability will be sacrificed. It's rare that notebook cooling has been over-engineered, and mobile GPUs are notorious for deteriorating solder joints, which can't be permanently fixed cheaply.

smilingcrow - Saturday, March 21, 2015 - link
"Assuming 85% efficiency on the power bricks the 230W AC adapter would be very close to 100% load."

I imagine you mean that it's rated for 230W DC output, otherwise it would be out of spec, as in testing you hit 265W AC. That's ~225W DC output.
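The arithmetic in this comment checks out and can be verified directly (the 85% efficiency figure is the commenter's assumption):

```python
# Converting measured AC draw at the wall to estimated DC output,
# at an assumed 85% adapter efficiency (the commenter's figure).

ac_draw_w = 265     # peak AC draw measured in testing
efficiency = 0.85   # assumed adapter efficiency

dc_output_w = ac_draw_w * efficiency
print(round(dc_output_w))  # ~225W, just under the adapter's 230W DC rating
```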
waldojim42 - Tuesday, March 24, 2015 - link
I seem to be missing something: who rates power supplies based on input wattage and not output? EVERY laptop power supply I own is rated on the power supplied to the machine. It makes no sense to me to rate the input.

radeonex - Saturday, April 11, 2015 - link
I noticed that in this article, for the stock configuration, the GPU temp for the P750ZM stayed roughly below 70C. However, in the full review of the P750ZM, the GPU temp shown hovered around 75C. Can you please comment on the discrepancy?