PC Lag!

Do you find that the lag in this episode is a bit... unusual? I can play GTA San Andreas, Spore, or The Sims 3 at maximum settings, 1080p with anti-aliasing enabled, but anything above setting 6 in this game begins to lag! My computer should be more than powerful enough to run this game (quad-core CPU, 1GB GeForce 9600GT, 4GB RAM)! Or is it just lagging on Windows 7 64-bit?

Apart from the lag and a few sound issues, this was one of the best episodes ever! (Second only to Night Of The Raving Dead)

Comments

  • edited April 2010
    Ehm, I got a GeForce FX9600GT too, and it's only 512MB, not 1GB as you state.

    And it's kind of a weak point, I suggest getting a more modern video-card because it's definitely holding back (bottlenecking) your PC.
  • It is? Bummer... guess it's something for the 'Save Up And Get ASAP List' then.
  • edited April 2010
    I was surprised by that too.
    The game runs very smoothly up to quality level 6, but begins to lag at level 7. That's when the advanced lighting effects kick in. It really makes a visible difference.
    I'm playing on a laptop, so lag wouldn't be such a big surprise,
    but I didn't experience issues like that in Tales of Monkey Island.
    (I can also play BioShock 2, for example.)

    Oh well, I guess there's nothing I can do about it (except play the game on a better PC).
    I would have bought it for PS3 instead if I had expected that.
    But still, the game looks good on level 6.
  • edited April 2010
    I don't think the problem is your video card, OP.
    Since I have two monitors hooked up to my PC, I can have Task Manager open on one screen and play the game on the other. I noticed the game doesn't use more than 50% CPU, and since I have a dual-core CPU, that would mean the Telltale engine doesn't support multithreading. This would mean the game only uses one of your four cores, which would be a good explanation of why this game runs a bit slow while other games don't.
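The 50%-on-a-dual-core observation above is just arithmetic, and can be sketched as follows (a back-of-the-envelope illustration only; nothing here is from the Telltale engine):

```python
# Back-of-the-envelope: the highest total CPU % Task Manager can show
# for a process that keeps `busy_threads` cores fully loaded.
# (Plain arithmetic - not anything from the Telltale engine.)
def max_taskmanager_percent(busy_threads: int, cores: int) -> float:
    return 100.0 * min(busy_threads, cores) / cores

print(max_taskmanager_percent(1, 2))  # one busy thread, dual core -> 50.0
print(max_taskmanager_percent(1, 4))  # one busy thread, quad core -> 25.0
```

So a game that never exceeds 50% on a dual core is consistent with a single busy thread, and the same game would top out near 25% on the OP's quad core.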
  • edited April 2010
    You're right about that.
    I just tried it while running the game in windowed mode.
    It's never more than 50%.
    Are you having the lag problem too?
    However, I'd still think that advanced graphics options (like shaders and so on) depend more on the video card.
  • I figured out that it's the small shadow details, starting at visual level 7, that cause the lag. It doesn't matter what resolution; the shaders just make it lag that much.

    You're definitely right about the multithreading support thing. On my old computer it almost maxes out the CPU and runs slightly better (WTH!), with an 8600GT card and a Pentium 4 CPU.

    Weird.
  • edited April 2010
    The OP's card is definitely contributing to it. 9600GT's are low end now.
  • Yeah I know it's pretty low end, but it was the best I could afford right now.
  • edited April 2010
    <shrug> I didn't notice any lag, and I had it on 1900 x 1200 with the graphics set to no.9.
  • edited April 2010
    stemot wrote: »
    The OP's card is definitely contributing to it. 9600GT's are low end now.

    It should still be able to run the game at level 9. Maybe not at 60fps all the time, but it should work comfortably.

    And btw. Hassat Hunter, there are indeed 1GB 9600GTs around.
  • edited April 2010
    I have a Geforce GTX-275 and am able to run the game flawlessly at 1680x1050 resolution graphic quality 9, for what it's worth.
  • edited April 2010
    It should still be able to run the game at level 9. Maybe not at 60fps all the time, but it should work comfortably.

    And btw. Hassat Hunter, there are indeed 1GB 9600GTs around.

    Well, my old 9600GT had framerate problems with ToMI on 9, and with TDP having more advanced lighting and shadowing shaders, it stands to reason that he's going to suffer lag.
  • edited April 2010
    I just tested the multithreading thing on Wallace & Gromit and on Tales of Monkey Island.
    None of them ever uses more than 50% CPU.
    However, there isn't any lag while playing those games, even on the highest quality level.
    (I've got an ATI Mobility Radeon HD 4570 btw.)
  • edited April 2010
    stemot wrote: »
    Well, my old 9600GT had framerate problems with ToMI on 9, and with TDP having more advanced lighting and shadowing shaders, it stands to reason that he's going to suffer lag.

    It's funny, because when Tales came out I also had a 9600GT - the game ran like a breeze on level 9. Wait. Wasn't your card equipped with DDR2 memory? That could be the problem for the OP too; I almost forgot such variants existed.
  • edited April 2010
    It should still be able to run the game at level 9. Maybe not at 60fps all the time, but it should work comfortably.
    Yeah, I know... because I do just that with the same card, a single core, and 2GB RAM (of the old DDR1 kind).
    But it does go noticeably slower on the wide shots of the street... though I can't be too bothered, since that's just two screens max that aren't smooth.
    Then again, I use Win XP, which is less memory-intensive than 7, from what I hear...
    And btw. Hassat Hunter, there are indeed 1GB 9600GTs around.
    There are?
  • Yep, there are 1GB versions of this card. That's what it says under Performance Information and Tools.

    It's starting to work better though - the more I play it, the slightly smoother it gets. But I just turned the detail down to 6 and can't see any huge difference, so I'm fine with it.
  • edited April 2010
    What resolution are you running at? I'm using an NVIDIA 8800 GTS 320 MB, and have no problem at quality 9 at 1680x1050. I can tell the framerate drags a bit at times, but it's nothing so awful that I feel the need to turn the graphics quality down.

    Your card should come within 5-10 FPS of mine at the same resolution, if I had to make a guess at it.

    EDIT: I just ran Fraps and it can dip below 30 FPS for me at times, but outside it tends to hover at 30-35 FPS, and inside around 45-60 FPS (it appears to be capped at 60 FPS).

    Putting the quality down to 6 turns off the dynamic shadows, and that makes the game peg at 60 FPS for me no matter where I go.

    -HM
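For reference, the FPS figures quoted above translate into per-frame time budgets like this (plain arithmetic, not game-specific):

```python
# Converting the FPS figures above into per-frame time budgets.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(frame_time_ms(60), 1))  # 60 FPS cap -> 16.7 ms per frame
print(round(frame_time_ms(30), 1))  # 30 FPS -> 33.3 ms per frame
```

A dip from 60 to 30 FPS means each frame is taking twice its budget, which is why the dynamic shadows at quality 7+ are so noticeable.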
  • I am running at 1920x1080.
  • edited April 2010
    Yeah, so the higher resolution and slightly less powerful video card are going to make it drop into the 20s a bit, if I had to guess. The best way to see what's going on is to install and run Fraps and check your FPS.

    Or run it at quality 6 and be done with it. A game like this doesn't need uber-fancy dynamic shadows, especially ones that are poorly implemented. Much older FPS games with dynamic shadows, such as Doom 3, run and look better. And that game has dynamic shadows from tons of objects, moving and static. In S&M, they seem to come only from the characters (but I haven't stared at it enough to be sure).

    -HM
  • edited April 2010
    Yeah, I know... because I do just that with the same card and a single core and 2GB RAM (of the old DDR1 kind).
    But it does go noticeably slower on the wide shots of the street...

    It may be your CPU then - or the RAM, depending on the speed. 400MHz DDR1 RAM is just fine :)
    Yep, there are 1GB versions of this card. That's what it says under Performance Information and Tools.

    I'm curious now - can you check whether it has DDR2 or GDDR3 memory?
  • edited April 2010
    Performance in this game is a little dodgy. My computer is well equipped to run this game at max detail, but there are some parts with frame-rate issues - usually when turning the camera around to reveal another piece of landscape.

    CPU: Intel i5-750 quad-core
    RAM: 4GB G.Skill DDR3-1600
    Video: HD5850 with 10.4 preview drivers
    OS: Windows 7 64-bit Ultimate

    I'm not sure if it's a driver issue, but I can run every other game I own maxed out @1080p, usually with no frame-rate issues.
  • Pretty sure it has GDDR3 memory.
  • edited April 2010
    Pretty sure it has GDDR3 memory.

    Then beats me why you're having slowdowns :confused:
  • I know, it's just weird. Maybe the Telltale engine is just poorly coded for quad-core CPUs or something like that.
  • edited April 2010
    The TTT is coded much like a web browser: i.e. there's a scene DOM that's dynamically modified using the JavaScript-sibling Lua (the two languages are very similar under the hood), with garbage collection dynamically unloading resources based on the current system load. It's not really the most efficient approach and can't compare to direct native code, but it allows TTG to develop games a lot faster and cheaper and to focus on artwork and story instead.
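The "scene DOM" idea described above can be sketched roughly like this (a toy illustration only; Python stands in for the Lua layer, and every name here is made up, not Telltale's):

```python
# Toy sketch of the "scene DOM + scripting" idea described above.
# All names are hypothetical; Python stands in for the Lua layer.

class SceneNode:
    def __init__(self, name):
        self.name = name
        self.children = []
        self.visible = True

    def find(self, name):
        """Walk the tree looking for a node by name, DOM-style."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit:
                return hit
        return None

# "Script" code mutates the scene tree instead of touching renderer internals:
scene = SceneNode("street")
sam = SceneNode("sam")
scene.children.append(sam)
scene.find("sam").visible = False  # e.g. hide a character from a script
```

The point of the analogy is that scripts only edit a tree of nodes; the engine walks the tree each frame and does the actual rendering natively.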
  • edited April 2010
    I know, it's just weird. Maybe the Telltale engine is just poorly coded for quad-core CPUs or something like that.

    Nah. I played ToMI on a 9600GT and a Q9550. Now the 9600GT is out and a GTS250 is in. Both cards ran ToMI without glitches, and TPZ is just as fast on the current card. It may be a Win7 driver issue, though.
    The TTT is coded much like a web browser: i.e. there's a scene DOM that's dynamically modified using the JavaScript-sibling Lua

    You mean the 3D objects themselves? Or even the individual vertices? That would be some surprise.
  • edited April 2010
    Not the vertices... well, at least I hope not: usually you'd issue a playAnimation command through Lua and then do the actual loading and processing in C. Much like you wouldn't implement a JPEG decoder in JavaScript (yes, that's possible).
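The script/native split described here might look something like the following (hypothetical names throughout; Python stands in for both layers just to show the shape of the API):

```python
# Sketch of the split described above: the script layer only issues
# high-level commands; the heavy per-frame work lives in native code.
# All names are hypothetical.

def native_play_animation(actor: str, anim: str) -> str:
    # In a real engine this would be a C function doing the decoding,
    # skinning, etc. Here it just records what was requested.
    return f"{actor}:{anim}"

class ScriptAPI:
    """What the Lua-like layer sees: one cheap call per request."""
    def __init__(self):
        self.log = []

    def playAnimation(self, actor: str, anim: str) -> None:
        self.log.append(native_play_animation(actor, anim))

api = ScriptAPI()
api.playAnimation("max", "wave")  # one script call; native code does the rest
```

The script call itself costs almost nothing; it's the native side that touches vertices, which is why the scripting layer alone shouldn't be a performance problem.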
  • edited April 2010
    Nah. I played ToMI on a 9600GT and a Q9550. Now the 9600GT is out and a GTS250 is in. Both cards ran ToMI without glitches, and TPZ is just as fast on the current card. It may be a Win7 driver issue, though.
    Usually the "uses a whole core" behavior comes from a game that's not really multithreaded, where Direct3D is busy-waiting for the vertical blank to happen (so that you won't get tearing when something moves on screen) - I'm pretty sure running the animations and the interface wouldn't cause even half that much CPU usage otherwise.

    And it's not like Telltale's games NEED to be multithreaded - there are no heavy calculations (like enemy AI and such) going on in the background... and also, with Intel's i5 and i7 CPUs (and AFAIK AMD's future offerings), using only one or two of the four cores actually lets the CPU run several hundred MHz faster than when all four cores are in use - google "Turbo Boost" and skip over any Knight Rider-related results that come up... :D

    np: To Rococo Rot - Friday (Speculation)
  • edited April 2010
    Not the vertices... well, at least I hope not: usually you'd issue a playAnimation command through Lua and then do the actual loading and processing in C. Much like you wouldn't implement a JPEG decoder in JavaScript (yes, that's possible).

    OK, but if only the game management and the top-level parts of the sub-areas are in Lua, then it shouldn't be too hard on the system, even if it's not super efficient. At least theoretically :)
    Leak wrote: »
    Usually the "uses a whole core" comes from a game that's not really multithreaded (...)

    Yeah, I know all that - it would be interesting to run a benchmark of TPZ on a single-core and a dual-core CPU, to verify whether the theory is correct.
  • edited April 2010
    Yeah, I know all that - it would be interesting to run a benchmark of TPZ on a single-core and a dual-core CPU, to verify whether the theory is correct.
    Setting its CPU affinity to one core via Task Manager is your friend, and if that's not enough, I'm pretty sure you can use BCDEdit to tell Windows Vista/7 to use only a certain number of cores...

    But still - I'm very sure TPZ wouldn't even use 25% of my quad core most of the time if it weren't waiting for the vsync to happen, as that's a loop inside Direct3D that polls the graphics hardware and thus produces 100% load on the core it runs on.
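The affinity experiment suggested above can also be scripted. As a hedged sketch: on Linux, Python's standard library exposes the same idea directly (on Windows you'd use Task Manager as described, or `start /affinity`):

```python
# Pinning a process to one core, as suggested above. Task Manager does
# this on Windows; on Linux the stdlib exposes it directly.
import os

if hasattr(os, "sched_setaffinity"):          # available on Linux only
    os.sched_setaffinity(0, {0})              # pin current process to core 0
    print(sorted(os.sched_getaffinity(0)))    # -> [0]
```

Running the game pinned to one core and comparing FPS against an unpinned run is exactly the single-core-vs-dual-core benchmark proposed above.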
  • edited April 2010
    My video card is only 256mb and I get no lag whatsoever on graphics setting 9.
  • My video card is only 256mb and I get no lag whatsoever on graphics setting 9.


    May I ask what card you have?

    EDIT: And setting it to a single core helps. Barely. Adds like one or two FPS which doesn't really seem worth it.
  • edited April 2010
    May I ask what card you have?

    EDIT: And setting it to a single core helps. Barely. Adds like one or two FPS which doesn't really seem worth it.

    ATI Radeon 3650 (AGP)

    Other useful specs...
    E4600 Core 2 Duo processor, 2.40GHz, 800MHz FSB, 2MB cache
    3GB RAM (800MHz DDR2)
  • ATI Radeon 3650 (AGP)

    Other useful specs...
    E4600 Core 2 Duo processor, 2.40GHz, 800MHz FSB, 2MB cache
    3GB RAM (800MHz DDR2)

    Odd - my computer tops everything you have, yet you say you can run the game at level 9 visual detail without lag.
  • edited April 2010
    Could be he has AF/AA shut off via an override in the control center.

    That makes a huge difference... in looks, and performance.
    Personally I can't do without AA+AF.
    Could be he has AF/AA shut off via an override in the control center.

    That makes a huge difference... in looks, and performance.
    Personally I can't do without AA+AF.

    I agree with the AA+AF thing. I always activate it when I can.
  • edited April 2010
    OK,
    could you please tell me what you're talking about when you mention AF/AA?
  • edited April 2010
    Anisotropic filtering and anti-aliasing.

    -HM
  • edited April 2010
    ah OK
    makes sense.
    But how can I deactivate them on level 9?
  • edited April 2010
    The game itself does not allow you to change these options, so you'd have to do it in the video driver. But chances are they're already off.

    -HM