Well, if the card is rated up to 105°C, you probably won't start to see artifacting or glitches until it approaches 100°C, I'd imagine. 93°C is 12° below the stated maximum, so in theory the card
should be okay, but the rest of the computer might not be.
Quote:
Originally Posted by Arclight
14A? That's not enough. 14 × 12 = 168 Watts on the 12V line. The card on its own, according to Nvidia, draws 150W max. That leaves 18W for everything else.
At the very least, that unit will die soon since it's running at 100% all the time. If it's a cheap unit (it is if that amperage is accurate), it's possible it'll take out everything else along with it. Might even burn down the house.
Sorry, I'm taking the thread a little off-topic, but this got me thinking... I'm looking at getting a GeForce GT 430 (49W max power according to nVidia, ~85W max measured usage according to Guru3D), paired with
this 300W power supply, which, if I'm reading the specs right, has 32 amps total across its two 12V rails (16A on each). Using your equation: (16 × 12) + (16 × 12) = 384W across both rails, and 384W − 85W ≈ 300W left over for everything else. I'm running an HP Slimline, stock parts except for upgraded RAM, and I'm not sure what the rest of the system draws. nVidia recommends a minimum system power of 300W, but according to the Guru3D review, their (by their own account hefty) test system maxed out at 242W with the GPU at full load.
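In case it helps to see the arithmetic in one place, here's a quick Python sketch of the amps-times-volts math both of us are doing. The wattage figures are just the ones quoted above, and it simply multiplies the rail ratings by 12V, so the PSU's actual combined 12V output could be lower than the per-rail sum; treat it as a rough upper bound, not a real measurement.

[CODE]
# Rough 12V headroom check: rail amps * 12V, summed over rails, minus GPU draw.
# Figures below are the ones quoted in this thread, not measurements of my own.
def rail_headroom(amps_per_rail, rails, gpu_watts):
    total_12v_watts = amps_per_rail * 12 * rails
    return total_12v_watts - gpu_watts

# Arclight's case: a single 14A rail against a 150W card
print(rail_headroom(14, 1, 150))   # 18 (W left for everything else)

# My case: two 16A rails against the GT 430 at Guru3D's ~85W peak
print(rail_headroom(16, 2, 85))    # 299 (roughly 300W left over)
[/CODE]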

Just wanted to see what you thought.
/hijack