ToMI likes to make my video card overheat
Something about the way the engine handles transparencies makes my video card temperature go straight into the stratosphere. On the island map screens in either game especially, the GPU temperature rockets straight to 80 degrees C and then climbs slowly toward a fatal overheat. Reducing the resolution doesn't help at all. Reducing detail works, but it has to be cranked clear down to 3 to avoid overheating.
The only other game that does this is STALKER, which I can only play in the winter with the heat off. Every other game I have consistently stays under 80 degrees, even Fallout 3 with every single graphics setting maxed (of course, the framerate is unplayable then).
I'm running Windows XP x64, and I've got an Intel Core 2 Duo 6320 at 1.86 GHz, 2 GB of RAM, and a GeForce 9800 GT. I was running version 185.85 of the drivers and just updated to the latest ones (190.62). Still having this problem.
That sounds extremely weird. Maybe you should think about the airflow in your computer case, and about cleaning the dust out of the inside from time to time, especially all the fans.
What kind of cooler do you have on your card? The stock one? Please don't tell me it's completely passive with some Zalman thingy or whatever...
And like I said, it's game-dependent, not hardware. At one extreme, Fallout 3 with every single thing maxed out runs at about 15 fps but stays around 73-74 degrees in indoor scenes and 75-77 outdoors. Oblivion is about the same. Whatever the settings, the temperatures are constant; only the frame rate changes.
Then at the other extreme, STALKER, like ToMI, has to be set down to really horrible graphics to avoid overheating.
Source engine games (HL2, L4D, etc.) are in the middle. If I max out everything, it does about 50 frames per second, but overheats occasionally. If I leave everything maxed except anisotropic filtering and antialiasing (2x anisotropic, 2x MSAA), it runs at about 100 fps, and doesn't overheat at all.
Anyway, it would be interesting to see how your GPU and its thermal solution handle a high load. There's a stress test you could run:
Download FurMark to stress test your graphics card. Start it, select "Stability Test", "Xtreme Burning Mode" and "Log GPU Temperature". Now start the test by pressing "Go!".
While it's running, you should be able to monitor the temperature at the bottom of the screen. Run it for 10 minutes, or until the temperature has been stable for a few minutes. If it exceeds 80°C, abort with Escape.
The log file created is called "gpu-temperature.xml". You have two options for making it accessible to us; pick the one that works best for you.
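If you want to eyeball the log before posting it, here's a rough scanner that prints the min and max temperatures it finds. I'm guessing at the entry format (the real FurMark XML schema may differ from the <gpu_temperature value="..."/> lines assumed here), so treat it as a sketch:

```cpp
#include <algorithm>
#include <cstdlib>
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream log("gpu-temperature.xml");
    std::string line;
    int lo = 999, hi = -999;
    while (std::getline(log, line)) {
        // Assumed entry format: <gpu_temperature value="78"/>
        std::string::size_type pos = line.find("value=\"");
        if (pos == std::string::npos) continue;
        int t = std::atoi(line.c_str() + pos + 7);  // skip past value="
        lo = std::min(lo, t);
        hi = std::max(hi, t);
    }
    if (hi < lo) {
        std::cout << "no temperature entries found\n";
        return 1;
    }
    std::cout << "min " << lo << "C, max " << hi << "C\n";
    return 0;
}
```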
That was set to windowed mode, a custom 1600x900 resolution, and 2x MSAA, since that's what I set all my games to.
Apparently I need more fans or something. There's a spot for an 80mm fan to blow right on the video card, and another in the front for the hard drives. Maybe I can have one blowing in from the side and the front fan blowing out, since that's the direction the video card's fan seems to blow.
I think the heatsink is not sitting properly on the GPU. You should reseat it; perhaps get a professional to do it for you. There'd be no reason for the rise in temperature if the heatsink were seated properly on the GPU.
I forgot to mention, this card isn't overclocked at all, not by XFX, and not by me. 100% nVidia clock speeds.
But anyway, it is still odd that ToMI should stress the GPU more than Fallout 3 and Oblivion.
That's definitely bad. I have a passively cooled Radeon 4870, and it takes 3 minutes to go from 53°C to 80°C with FurMark, even with my system fans at a very low noise level.
That sounds reasonable.
At least when it's just showing the maps.
Then I fire up Fallout 3, and the highest temperature I can make it get to, by fighting stuff near water (at 10 fps), is 81... Crazy. I guess Bethesda specifically programmed their engine to not stress video cards. Might explain why there are Oblivion hacks that seem to increase FPS for "free" (i.e., without changing any of the graphics).
I assume 3D action game developers focus their graphics engine development much more on efficiency.
The opening of ToMI chapter one is probably the most graphically demanding scene in the series so far. In this kind of game it would be reasonable to focus on getting scenes like that working smoothly, from a development-effort point of view.
From the bit of OpenGL programming I've done, I've learned one thing: it's easy to get 3D graphics working at all. Making them efficient is a whole different story.
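Transparency is a good example. Here's a minimal sketch (assuming GLFW and legacy OpenGL for brevity; this is an illustration, not how Telltale's engine actually works): every alpha-blended layer re-reads and re-writes each pixel it covers, so the fill-rate cost, and with it the GPU load and heat, scales with the number of overlapping transparent layers, not with how complex the scene looks.

```cpp
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* win = glfwCreateWindow(800, 600, "overdraw", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(0);  // vsync off: render as many frames as possible

    glEnable(GL_BLEND);   // alpha blending: each layer reads back the framebuffer
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    const int LAYERS = 50;  // 50 full-screen transparent quads per frame
    while (!glfwWindowShouldClose(win)) {
        glClear(GL_COLOR_BUFFER_BIT);
        glColor4f(0.2f, 0.4f, 0.8f, 0.05f);  // faint blue "smoke" layer
        for (int i = 0; i < LAYERS; ++i) {
            glBegin(GL_QUADS);  // legacy immediate mode, for brevity
            glVertex2f(-1, -1); glVertex2f(1, -1);
            glVertex2f(1, 1);   glVertex2f(-1, 1);
            glEnd();
        }
        glfwSwapBuffers(win);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```

With the swap interval at 0, a loop like this keeps the GPU at full tilt no matter how simple the scene looks on screen.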
That still must be one inefficient transparency engine, though. The opening with the voodoo lady and all the smoke raises the temperature, too.
That still sounds really high.
New FurMark results would be more interesting for comparison.
Do you have a link to the exact model of your graphics card on the manufacturer's website?
I think Fallout 3 and Oblivion simply need a lot more CPU time than the comparatively simple ToMI, so the maximum framerate is limited by your CPU. ToMI, on the other hand, doesn't do as much on the CPU, so there's time to calculate additional frames.
(Just a theory)
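The theory is easy to picture in code. In the overdraw sketch above, one line decides whether the GPU ever gets to idle (again just an illustration, not what either engine actually does):

```cpp
// A CPU-bound game makes the GPU wait on the CPU every frame anyway.
// A GPU-bound, uncapped game keeps it at 100% duty cycle instead:
glfwSwapInterval(1);  // vsync on: driver blocks until refresh, GPU idles and cools
glfwSwapInterval(0);  // vsync off: GPU renders extra frames nobody sees
```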
Now I'm going to have to wait like a week to see if I can work past the achievement bug.
Also, the CPU thing is an interesting theory, but nVidia monitor shows the CPU load at about 70% when playing Fallout 3.
But geeze, I still think there's some kind of huge problem with the transparency engine in ToMI. I'll post some temp logs from nVidia monitor in a few minutes.
First, ToMI. Started out with the GPU at 50 degrees at the desktop, fired up Siege of Spinner Cay, loaded a save game next to the raft, got on the raft, and let it sit for a while. The screensaver turned on at 16:33:08, oops, so I halted testing then. It ended up being close to 10 minutes. The GPU was about 55 when I got to the map screen.
Settings: 1600x900 windowed, detail 9.
Next up is Fallout 3. I'm going to find a spot with some nice water, save, quit, let the GPU cool to 50 again, and repeat the test.
Settings: 1680x1050 full screen (for some reason, it won't let you select 1600x900 windowed!), ultra detail, vertical sync, HDR, 4x AA (I could've sworn I selected 8, but it was apparently 4), and 15x anisotropic filtering.
Okay, here's what I'm going to be staring at for about 10 minutes.
http://img.photobucket.com/albums/v116/syldssuf/ScreenShot1.jpg
Should be a pretty graphics-intensive scene. Water, ripples, architecture, trees, full reflections. Hopefully god mode keeps you from dying due to radiation poisoning.
Must have read the clock wrong; that was a little short. Still, it's fairly conclusive data.
ToMI's map screen took the GPU from 60 to 70 degrees C in 72 seconds; Fallout 3 took 160 seconds for the same temperature change. That works out to roughly 0.14 degrees per second versus 0.06, so ToMI heats the card more than twice as fast.