GPUs grow more powerful with each generation, but that performance comes at the cost of a larger footprint, higher power draw, and greater thermal output. On top of that, mid-tier and higher-end cards cost a small fortune, and keeping them cool means spending even more on a larger PC case or powerful 140mm case fans.
So, PowerColor decided to integrate a Neural Processing Unit (NPU) into its GPUs, aiming to lower power draw by 20%. But you can't have your cake and eat it too: much like configurable TDP on CPUs, the NPU can also lower the performance output of your hardware.
How would the AI NPU chip work in PowerColor GPUs?
PowerColor's parent company, TUL, decided to integrate AI NPU chips developed by Kneron into the Hellhound and Red Devil GPU series. Future GPUs from PowerColor will include this technology, which Kneron claims uses an AI model to adjust GPU performance, delivering high performance at a significantly reduced power draw.
This should also help preserve the components, since sustained high temperatures can damage a GPU and shorten its lifespan. The technology gives the GPU two modes: ECO and Boost. In ECO mode, the chip detects the temperature and adjusts the fan curve accordingly, which, Kneron claims, results in 20% less power consumption.
Boost mode follows the same concept, adjusting energy consumption while prioritizing high performance and stability, as the sketch below illustrates.
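To make the description concrete, here is a minimal sketch of what temperature-driven control like this could look like. Kneron's actual AI model and PowerColor's driver interfaces are proprietary and unpublished, so every name and number below (apart from the claimed 20% figure) is an illustrative assumption, not their implementation.

```python
# Hypothetical sketch of ECO-mode logic as described in the article:
# read the GPU temperature, pick a fan speed from a curve, and scale
# the power limit down. All names and curve points here are invented
# for illustration; the real control model is Kneron's proprietary AI.

BASE_POWER_LIMIT_W = 338          # example figure from the demo below
ECO_POWER_SCALE = 0.80            # Kneron's claimed ~20% reduction

# (temperature in degrees C, fan duty in %) points of a simple fan curve
FAN_CURVE = [(40, 30), (50, 45), (60, 60), (70, 80), (80, 100)]

def fan_duty_for(temp_c: float) -> int:
    """Return the fan duty of the first curve point at or above temp_c."""
    for threshold, duty in FAN_CURVE:
        if temp_c <= threshold:
            return duty
    return 100  # above the last point, run the fans flat out

def eco_mode_step(temp_c: float) -> tuple[int, float]:
    """One control step: fan duty from the curve, power limit scaled down."""
    return fan_duty_for(temp_c), BASE_POWER_LIMIT_W * ECO_POWER_SCALE

if __name__ == "__main__":
    duty, limit = eco_mode_step(51.0)   # temperature seen in the demo
    print(f"fan duty: {duty}%, power limit: {limit:.0f} W")
```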
Is this something groundbreaking, and what is the catch?
The concept of reducing power consumption while delivering top-notch performance isn't new, and we've seen other implementations of it. AMD Radeon Chill is a prime example: it adjusts the frame rate based on in-game motion, lowering it when nothing is moving and thereby reducing the energy the GPU consumes.
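For comparison, here is a rough sketch of that Radeon Chill idea: cap the frame rate low when the scene is static and raise the cap when motion is detected. It mirrors the concept only; AMD's real implementation lives inside the driver, and the caps and motion signal below are made up for illustration.

```python
# Conceptual sketch of motion-aware frame-rate capping, in the spirit
# of Radeon Chill. The caps and the motion_detected callback are
# illustrative assumptions, not AMD's actual mechanism.

import time

IDLE_FPS_CAP = 40     # low cap while the scene is static
ACTIVE_FPS_CAP = 144  # full cap while there is motion or input

def render_loop(motion_detected, frames: int = 5) -> None:
    """Pace frames against the cap chosen by the motion signal."""
    for _ in range(frames):
        cap = ACTIVE_FPS_CAP if motion_detected() else IDLE_FPS_CAP
        frame_start = time.perf_counter()
        # ... render the frame here ...
        elapsed = time.perf_counter() - frame_start
        budget = 1.0 / cap
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle out the rest of the frame

if __name__ == "__main__":
    render_loop(lambda: False)  # static scene: paced at the idle cap
```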
PowerColor, however, implements the idea at the hardware level with a dedicated AI NPU chip. We saw some real-world demonstrations of this, and the results raised more questions.
In Final Fantasy XV, with the NPU technology disabled, the GPU ran at 51°C with a power draw of 338W and delivered 118 FPS.
Upon enabling the NPU technology, the power draw dropped to 261W, a difference of 77W, which is huge. However, the temperature increased by 10°C, and the FPS fell to 107.
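Running the quick numbers on those demo figures puts the results in perspective: the power savings exceed the claimed 20%, and performance per watt actually improves even though raw FPS drops.

```python
# Quick arithmetic on the demo figures above: power savings, FPS loss,
# and performance per watt with the NPU off vs. on. The numbers come
# straight from the Final Fantasy XV demonstration; nothing else is assumed.

npu_off = {"power_w": 338, "fps": 118}
npu_on = {"power_w": 261, "fps": 107}

savings_w = npu_off["power_w"] - npu_on["power_w"]
savings_pct = savings_w / npu_off["power_w"] * 100
fps_loss_pct = (npu_off["fps"] - npu_on["fps"]) / npu_off["fps"] * 100
eff_off = npu_off["fps"] / npu_off["power_w"]
eff_on = npu_on["fps"] / npu_on["power_w"]

print(f"power saved: {savings_w} W ({savings_pct:.1f}%)")   # 77 W (22.8%)
print(f"FPS lost: {fps_loss_pct:.1f}%")                      # 9.3%
print(f"efficiency: {eff_off:.2f} -> {eff_on:.2f} FPS/W")    # 0.35 -> 0.41
```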
So, the claim of lower power consumption held up, and we were prepared to see a hit to the FPS. What caught us by surprise was the temperature increase, because if the card draws less energy, it should also produce less heat, unless, for example, ECO mode's adjusted fan curve is trading cooling for further power savings.
We can't say for sure why this happened, so we'll have to wait for the official release from PowerColor. Maybe this is a bug that they'll iron out before launch, or maybe it's too soon to release this technology, as it could benefit from more optimization.