Technological feat aside:
Revolutionary heat dissipating coating effectively reduces temperatures by more than 10%
78.5 °C -> 70 °C: (78.5 - 70) / 78.5 ≈ 0.108 ≈ 10.8%. More than 10%, right?!
Well, not really. Celsius has an arbitrary zero point. The same temperatures in Kelvin give:
351.65 K -> 343.15 K: (351.65 - 343.15) / 351.65 ≈ 0.024 ≈ 2.4% (???)
So that's why you shouldn't put percentages on temperature changes. A more entertaining version: https://www.youtube.com/watch?v=vhkYcO1VxOk&t=374s
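To make the scale-dependence concrete, here's a quick sketch (plain Python, using the numbers above) of how the very same 8.5-degree drop turns into a different "percentage" depending on where each scale happens to put its zero:

```python
# Same 8.5-degree drop, different "percent reduction" depending on the scale,
# because the zero point of each scale is arbitrary.

def percent_drop(before, after):
    """Naive percentage reduction, i.e. the calculation in the quoted claim."""
    return (before - after) / before * 100

t_before_c, t_after_c = 78.5, 70.0           # the figures from the claim, in Celsius

def c_to_k(c):
    return c + 273.15                        # Kelvin: zero at absolute zero

def c_to_f(c):
    return c * 9 / 5 + 32                    # Fahrenheit: zero somewhere else again

print(f"Celsius:    {percent_drop(t_before_c, t_after_c):.1f}%")                   # ~10.8%
print(f"Kelvin:     {percent_drop(c_to_k(t_before_c), c_to_k(t_after_c)):.1f}%")   # ~2.4%
print(f"Fahrenheit: {percent_drop(c_to_f(t_before_c), c_to_f(t_after_c)):.1f}%")   # ~8.8%
```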
GamersNexus presents their temperature testing in terms of difference from room temperature, so this is probably how they’d do this comparison.
I'm not sure they'd see a reason to cover RAM temperature unless it approached actual risk of damage or enabled higher clocks, though. Comparing cases or CPU coolers by temperature makes sense. Comparing GPUs when they're using the same chip and cooling performance is a big part of the difference between models? Sure. But RAM? Who cares.
I mean, I also don't really care about the temperature of my RAM unless it stops it from working. RAM overclocking isn't that useful, and unstable RAM sucks ass.
However, it doesn't matter what the component is: the original rise over ambient reflects the heat that operating the component generated, and the reduction in that rise is essentially the portion of that heat the specific cooling solution was able to deal with. Dividing the reduction by the original rise tells you how much cooling the solution provided relative to the heat the component generated, and that works for any component and any cooling solution. Cooling below ambient can be desirable for some use cases (that's why chillers exist), and in that case the percentage simply comes out above 100%.
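A minimal sketch of that calculation, assuming an ambient of 25 °C; the original claim doesn't state one, so that number is only for illustration:

```python
# Delta-over-ambient version of the comparison. The 25 C ambient is an
# assumed value for illustration; the original figures didn't include one.

ambient = 25.0
t_uncoated = 78.5    # temperature without the coating
t_coated = 70.0      # temperature with the coating

rise_uncoated = t_uncoated - ambient    # heat-driven rise with no extra cooling: 53.5
reduction = t_uncoated - t_coated       # rise removed by the coating: 8.5

# Share of the heat-driven rise that the cooling solution removed. Ratios of
# temperature *differences* are the same in Celsius, Kelvin, or Fahrenheit,
# so this number doesn't depend on the arbitrary zero of the scale.
print(f"{reduction / rise_uncoated:.1%}")   # ~15.9%

# Cooling below ambient (chillers) makes reduction exceed rise_uncoated,
# so the ratio comes out above 100%.
```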