
15FPS Nvidia 770 + AMD 965 Quad

Discussion in 'Troubleshooting: Bugs, Questions and Support' started by JDMClark, Dec 15, 2013.

  1. moosedks


    Joined:
    Nov 4, 2012
    Messages:
    1,113
    I don't know if that debug screen gives the FPS you were looking for. Try Ctrl+F.
     
  2. SixSixSevenSeven


    Joined:
    Sep 13, 2013
    Messages:
    6,958
    It doesn't, but you can calculate it from the frame delays in the text above, which are always correct; the graphics step works out to about 26 FPS in his case.

    If you factor in physicsStep and debugStep too, then it's only 9 FPS. I haven't done any tests myself to see whether these need to be factored in to get the real FPS from the times shown on the debug screen.
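    The arithmetic above can be sketched quickly. This is a hedged illustration, not game code: the step names and the sample millisecond values are assumptions chosen so the results match the figures in this thread (the debug screen shows per-frame step times in milliseconds, and FPS is just their reciprocal).

```python
# Sketch: deriving FPS from per-frame step times in milliseconds.
# Sample values are hypothetical, picked to reproduce the ~26 and ~9 FPS
# figures discussed above.

def fps_from_step_ms(step_ms: float) -> float:
    """Convert a per-frame step time in milliseconds to frames per second."""
    return 1000.0 / step_ms

# Graphics step alone: ~38.5 ms per frame -> roughly 26 FPS.
print(round(fps_from_step_ms(38.5)))

# If physicsStep and debugStep ran serially with the graphics step,
# the frame time would be their sum, giving a much lower FPS:
graphics_ms, physics_ms, debug_ms = 38.5, 55.0, 17.6  # hypothetical
total_ms = graphics_ms + physics_ms + debug_ms
print(round(fps_from_step_ms(total_ms)))  # much lower if steps are serial
```

    Whether summing the steps is correct depends on whether the engine runs them serially or in parallel, which is exactly the open question in the post above.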
     
  3. DaZack


    Joined:
    Aug 10, 2013
    Messages:
    5
    Yes, I was only speculating; I didn't mean to claim the values were accurate at all. From my own observation, the FPS displayed on the Ctrl+F screen and the first debug screen are negligibly different. I just used the debug screen in the screenshots because it pushes all the information to the edges of the screen, whereas the Ctrl+F menu is a white overlay that makes it harder to fit what you want in a shot. A purely aesthetic decision; stupid, I know.

    Regardless, whether players are getting 26 FPS, 25 due to rounding error (or vice versa), or whatever, the difference is negligible. All I wanted to get across was that the old CPU is capable of running this new game smoothly.

    Off topic: I am interested in what the value on each screen is calculated from. On the little bar at the bottom of the first debug screen, the FPS value usually reads somewhere near 2000. Once slow-mo is engaged, this value decreases drastically as the Graphics Step ms increases. The final FPS goes back up to around 1200.
    What does this mean? How does a program like Fraps (which, once again, displays a negligible difference in FPS) calculate its own value?
    Thank you for your kind responses.
     