Revile Wrote:What relation does server FPS have to tickrate?
You'll soon get lots of answers from people who don't know the relation. One says good FPS is good for hit registration. Another says 1000 FPS is good for hit registration. A third says a stable 500 FPS is better than 1000 FPS for good hit registration. Then someone claims that tickrate 66 matters more than good FPS. Nobody actually knows the *relation*. Some might offer general blabber about kernel HZ.
I've been trying to work out the FPS/tickrate relation. I'm still in the process, but I'll post some preliminary thoughts.
Every "tick" consists of certain amount of frames. The more frames there are, the more accurately the server can calculate where bullets fly and where players are. The more ticks there are, the more information players get from the server, thus having more precise perception of the game situation.
The FPS *will fluctuate* no matter what. Here are two graphs generated by the srcds plugin for Munin. In the first picture the FPS is limited to ~500 (fps_max 600); in the second it's not limited at all (fps_max 0).
Limited to 500 FPS (fps_max 600): fps_week.png
Unlimited FPS (fps_max 0): srcds_fps-week2.png
It's the same server in both graphs.
In the lower graph you'll see that the FPS fluctuates roughly between 600 and 900. In the upper graph you'll see that on the same server, only limited by fps_max 600, the FPS fluctuates roughly between 350 and 500.
The big question is why the FPS drops so far below 500 when it is limited by fps_max. Without the limit, the FPS stays above 500 almost all the time, so why doesn't it hold a stable 500 when capped there? Common sense says the server should have even more CPU power to spare when limited, because it doesn't even try to go above 500 FPS. Something's not right in the server, and I believe this happens on *every* server in the world.
I hope I can get volunteers from this forum to gather statistics from their 1000 FPS servers. With more statistics we could determine whether stable FPS is achievable on any server. I've got only one server, and it runs all kinds of other stuff that might be affecting the test results. PM me on this forum if you want to gather (FPS) statistics from your server.
--
About the 1000 Hz thing.
It seems the 1000 Hz thing has very little to do with server FPS. I'm running a 100 Hz kernel now, yet I get close to 999 FPS.
There are patches to the Linux kernel that enable high-resolution timers on non-1000 Hz kernels. The Hz value determines how the kernel distributes computing time in the system.
I haven't completely finished my research into what the kernel Hz is and how kernel timers work. The problem (at least for some programs) is that it's impossible to sleep for less than one timer tick (1/Hz seconds) - unless the kernel has high-resolution timers... or something like that.
I think the original idea behind the 1000 Hz thing is that on a 100 Hz kernel, the scheduler might let some other program run for a full 10 ms while the game server needed processing time. A 1000 Hz kernel makes it possible for the server to interrupt any other program at 1 ms intervals and start processing by itself. Anyway, I don't think this is true on new kernels.
Note: The latter part of this post is partly speculation.