Real FPS doesn't match fps_max value
#1
I've used the tickrate enabler to enable 100 tick for my server. And according to fpsmeter.org, one of the golden fps values for 100 tick is 100.

But it seems I can never get the real fps to 100.

When I set fps_max to 100, the real fps will be stable at around 85. When I set it to anything from 101 to 127, the real fps will be stable at around 102. When I set it to anything from 128 to 163, the real fps will be stable at around 164.

How should I make the real fps stable at 100?

My dedi runs Windows Server 2003.
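
For reference, here is a minimal Python sketch of one way to log the real server fps from outside the game, by polling the stats console command over Source RCON. The host, port and password are placeholders, and the column layout of the stats output varies between engine builds, so treat the parsing as an assumption rather than a guarantee.

Code:
import socket
import struct
import time

# All three values below are placeholders for this sketch.
HOST, PORT, PASSWORD = "203.0.113.10", 27015, "my_rcon_password"

SERVERDATA_AUTH = 3            # packet types of the Source RCON protocol
SERVERDATA_EXECCOMMAND = 2
SERVERDATA_AUTH_RESPONSE = 2

def recv_exact(sock, n):
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("connection closed by server")
        data += chunk
    return data

def send_packet(sock, req_id, ptype, body):
    payload = struct.pack("<ii", req_id, ptype) + body.encode("ascii") + b"\x00\x00"
    sock.sendall(struct.pack("<i", len(payload)) + payload)

def read_packet(sock):
    (size,) = struct.unpack("<i", recv_exact(sock, 4))
    data = recv_exact(sock, size)
    req_id, ptype = struct.unpack("<ii", data[:8])
    return req_id, ptype, data[8:-2].decode("ascii", "replace")

with socket.create_connection((HOST, PORT)) as sock:
    send_packet(sock, 1, SERVERDATA_AUTH, PASSWORD)
    req_id, ptype, _ = read_packet(sock)
    while ptype != SERVERDATA_AUTH_RESPONSE:   # servers send an empty packet before the auth reply
        req_id, ptype, _ = read_packet(sock)
    if req_id == -1:
        raise SystemExit("RCON authentication failed")

    for _ in range(30):                        # sample the real fps once a second for 30 seconds
        send_packet(sock, 2, SERVERDATA_EXECCOMMAND, "stats")
        _, _, body = read_packet(sock)
        rows = [line.split() for line in body.splitlines() if line.strip()]
        if len(rows) >= 2 and "FPS" in rows[0]:          # assumed layout: header row, then values
            print("real fps:", rows[1][rows[0].index("FPS")])
        else:
            print(body.strip())                          # unknown layout: dump it as-is
        time.sleep(1.0)

Running something like this while changing fps_max gives a log of what the server actually settles at, which is the number that matters.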
#2
The fps_max value sets the actual fps a bit off compared to the value you enter. The only thing you can do is play with it a bit. I'd say that within a margin of 5 there should be no noticeable difference.
(fps_max 112)
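
For illustration, a tiny Python sketch of that trial-and-error idea, using only the measurements reported in post #1; the target of 100 and the margin of 5 come from this thread, the script itself is just hypothetical glue.

Code:
# The measurements below are the ones reported in post #1 (fps_max -> real fps);
# the target of 100 and the margin of 5 are from this thread as well.
measured = {100: 85, 101: 102, 112: 102, 127: 102, 128: 164, 163: 164}
target, margin = 100, 5

best = min(measured, key=lambda fps_max: abs(measured[fps_max] - target))
diff = abs(measured[best] - target)
print("best candidate: fps_max %d -> about %d real fps (%s the %d fps margin)"
      % (best, measured[best], "within" if diff <= margin else "outside", margin))

With these numbers it simply picks fps_max 101, i.e. the setting whose measured fps (about 102) lands closest to the target.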
#3
The golden values? LOL... Just try sticking with the stock settings; nobody I know of has any problems running stock. As long as your tickrate and fps are about 1:1, it's fine.

http://leaf.dragonflybsd.org/~gary

“The two most common elements in the universe are hydrogen and stupidity.”
#4
No, actually having fps only slightly higher than your tickrate is bad. In that case it might be better to have considerably higher fps, like 300; then the actual fps is more or less unimportant.

And sorry, I have no experience with srcds on Windows, so I cannot really tell you how to make the fps more stable. The only thing I can suggest: have you enabled HPET in the BIOS?
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#5
Maybe you can clarify why it is bad. Nobody has mentioned this on hlds_linux@, and most of the users on that mailing list have a clue, so if there were an issue with it, it probably would have been mentioned.
http://leaf.dragonflybsd.org/~gary

“The two most common elements in the universe are hydrogen and stupidity.”
#6
Because of the aliasing effect. Imagine server fps and tickrate are both 66.666... Hz; that means every 15 ms a frame and a tick will be "rendered". Remember, a tick can only be rendered during a frame. If both rates match exactly, then during every frame the next tick will be rendered. That's perfectly fine.

Now, the tickrate is fixed at 66.666... Hz, so the engine will always try to keep the average time between two ticks at 15 ms. If the time between two frames is now a little shorter than 15 ms (i.e. the fps is higher than 66.666...), say 14 ms, the engine must omit the calculation of a tick during some frames to keep the tickrate at the desired value. So in this example, no tick is calculated during every 15th frame, only during the other 14. That means the time between two ticks is 14 ms thirteen times in a row and then 28 ms once (13 × 14 ms + 1 × 28 ms = 210 ms over 14 tick intervals, which averages out to 15 ms).
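
Here is a toy Python simulation of that arithmetic (a sketch only, not engine code; the rule "run a tick on the first frame at or after its scheduled time" is an assumption that matches the description above):

Code:
# Toy model: frames arrive on a fixed grid, a tick may only run during a frame,
# and a tick runs on the first frame at or after its scheduled time.  The
# schedule advances by exactly 15 ms per tick, so the *average* tickrate stays
# at 66.666... Hz.  Assumes fps >= tickrate (otherwise ticks would pile up).
from collections import Counter

TICK_MS = 15.0   # 66.666... Hz tickrate -> one tick scheduled every 15 ms

def tick_gap_histogram(frame_ms, n_frames=100000):
    gaps = Counter()
    next_tick = 0.0          # scheduled time of the next tick
    last_tick = None         # frame time at which the previous tick actually ran
    t = 0.0
    for _ in range(n_frames):
        if t >= next_tick - 1e-9:        # a tick is due: run it during this frame
            if last_tick is not None:
                gaps[round(t - last_tick, 2)] += 1
            last_tick = t
            next_tick += TICK_MS
        t += frame_ms
    return dict(sorted(gaps.items()))

for frame_ms in (15.0, 14.0, 1000.0 / 300):    # ~66.7 fps, ~71.4 fps, 300 fps
    print("%5.2f ms frames (~%3.0f fps): %s"
          % (frame_ms, 1000.0 / frame_ms, tick_gap_histogram(frame_ms)))

With 14 ms frames the histogram comes out as thirteen 14 ms gaps for every one 28 ms gap, exactly the pattern above; at 300 fps the gaps only wobble between roughly 13.3 ms and 16.7 ms, which is why fps far above the tickrate is harmless.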

If you are lucky and have a large enough lerp value, you will probably not really notice that (though it's still probably not optimal). But if your lerp value is tight to the lower limit, your client will run out of snapshots every 15 ticks and has to extrapolate. That is most certainly bad, as extrapolation cannot be exact. It might still go unnoticed on public servers, where people usually aren't paying that much attention. But it can mean the difference between a headshot and a miss...
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!