Like I said on page 1, the best thing to do is run FRAPS and see if the framerate bottleneck is between quality 6 and 7. If so, it appears (to me anyway) to be the dynamic shadow generation.
I turned them off and deactivated the option that lets other programs turn them on or off.
Nvidia or ATI?
I've never heard of such a function (Nvidia), just of being able to specify it for each program individually (which is more efficient than the other way around, which is seemingly what you did). Is that what you did?
Still, wouldn't it be more efficient to use the per-program options to override the setting *only* for Sam & Max? (Not sure they are included yet, but ToMI certainly is.)
Setting its CPU affinity to one core via Task Manager is your friend, and if that's not enough I'm pretty sure you can use BCDEdit to tell Windows Vista/7 to use only a certain number of cores...
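For reference, both approaches look roughly like this (the command names and switches are real on Vista/7, but `game.exe` is just a placeholder for whatever the Sam & Max executable is called, and the BCDEdit change needs an elevated prompt plus a reboot):

```
:: Per-process: launch the game pinned to the first core (affinity mask 0x1).
start /affinity 1 game.exe

:: System-wide: limit Windows Vista/7 to one core at the next boot
:: (run from an elevated command prompt; takes effect after a reboot).
bcdedit /set numproc 1

:: Undo it afterwards, or Windows keeps booting with one core.
bcdedit /deletevalue numproc
```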
If I want an accurate benchmark, then neither is my friend - I'm not even sure if the second is possible.
If I don't want an accurate benchmark, then I don't want a benchmark.
But still - I'm very sure TPZ wouldn't even use 25% of my quad core most of the time if it weren't waiting for vsync; that wait is a loop inside Direct3D that polls the graphics hardware and thus produces 100% load on the core it's running on.
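The polling claim is easy to illustrate with a small sketch (plain Python, not Telltale's or Direct3D's actual code; `FRAME` and the wait helpers are made up for the demo): a frame limiter that spins on the clock caps the frame rate just like a blocking wait does, but it burns 100% of one core for the whole wait.

```python
import time

FRAME = 1.0 / 60.0  # one 60 Hz vsync interval, ~16.7 ms

def busy_wait_until(deadline):
    # Spin-polling, like a driver loop that polls the GPU for vblank:
    # the core runs flat out even though no useful work is done.
    while time.perf_counter() < deadline:
        pass

def sleep_until(deadline):
    # Blocking wait: the scheduler can run other threads on this core.
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)

def run_frames(wait, frames=5):
    """Simulate a vsync-capped render loop; return per-frame durations."""
    durations = []
    for _ in range(frames):
        start = time.perf_counter()
        wait(start + FRAME)  # stand-in for "wait for the next vblank"
        durations.append(time.perf_counter() - start)
    return durations

spin = run_frames(busy_wait_until)
slept = run_frames(sleep_until)
# Both loops are capped near 60 fps, but only the spinning one keeps a
# core at 100% while it waits - which is why a CPU meter can show a
# fully loaded core even though the game is mostly idle.
print([round(d * 1000, 1) for d in spin])
```

In Task Manager the difference is exactly what the post describes: the busy-wait version shows one core pegged, the sleeping version shows the core mostly idle, at the same frame rate.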
Most probably true - but that's a modern CPU; I wonder what would happen on ages-old single cores.
Read this thread with interest, as I did the ones here and here.
I've a Quad Core Q6600, 4GB of DDR2 RAM and an Nvidia 768MB 8800GTX all running under x64 Windows 7.
My GFX card is a little dated but I wouldn't expect it to struggle with Devil's Playhouse on max settings. Having said that, it does, and it's impairing my enjoyment of the game.
At random points between scenes (normally when switching to a dialogue scene), the framerate will drop from 50-60 fps to around 20 fps. The action then continues like this for some time before eventually returning to 50-60 fps.
I have tried lowering the resolution to no avail; whether I'm running at 1680x1050, 1360x768 or 1280x768, the problem is still evident.
One strange thing I've noticed is that I can normally force the framerate back up to 50-60 fps by switching to my desktop using Alt+Tab and then switching back to Devil's Playhouse. The play then continues at 50-60 fps until the next time I encounter the problem and it drops back down to 20 fps.
As observed by Hollow Man, the problem ceases to exist if I turn off shadows/shading by dropping the GFX quality to '6'.
Whilst this presents a workaround, I'm struggling to understand why my rig struggles so much at the higher graphical settings. I suppose it may just be inadequacies with my GFX card but the Alt+Tab thing makes me suspicious and I can't help but think some aspect of the game engine may be contributing to the problem.
I notice this thread is devoid of any official statement from Telltale, and it's also pretty old. Is there a more recent thread where these discussions were continued?
Have searched around but this seems to be the most relevant thread I can find!
I'll have to try the alt-tab trick. If that temporarily solves the problems, and it doesn't somehow disable the shadows, then I can't purely point to the graphics card and say, "It's having trouble rendering dynamic shadows."
For what it's worth, bpullen, I have almost the same rig. Only difference is a dual core instead of quad.
At random points between scenes (normally when switching to a dialogue scene), the framerate will drop from 50-60 fps to around 20 fps. The action then continues like this for some time before eventually returning to 50-60 fps.
This may well be a driver issue. The latest Forceware drivers haven't been particularly kind to older nVidia cards, among other possible causes.
Hmmm, you know, every time the dialogue wheel pops up the framerate drops significantly. There is something in that pop-up animation that is very demanding. Even worse if it has a memory leak...
Conclusion: it probably isn't an issue with your drivers or PC; the game just has an issue there.
It doesn't happen on my machine, so it may not be a game issue.
There's so much more that can be going on it's hard to even know where to start. You can't diagnose something like performance purely based on a hardware profile.
Comments
I turned them off and deactivated the option that lets other programs turn them on or off.
Somehow that didn't work.
-HM
I've never heard of such a function (Nvidia), just of being able to specify it for each program individually (which is more efficient than the other way around, which is seemingly what you did). Is that what you did?
I've never altered any settings with my hardware, so the only way they could be switched off is if they're off by default.
If I want an accurate benchmark, then neither is my friend - I'm not even sure if the second is possible.
If I don't want an accurate benchmark, then I don't want a benchmark.
Most probably true - but that's a modern CPU; I wonder what would happen on ages-old single cores.
np: Brock Van Wey/Bvdub - Will You Know Where To Find Me (Pop Ambient 2010)
Well, I found this - surely I wouldn't use this switch for analysis purposes :eek: