Server FPS and Tickrate
#1
Hi,

This may be a silly question as I'm new to all of this.

I see a lot of talk about server FPS, but I'm wondering what this number actually refers to. Obviously for a program running in a text console it isn't referring to frames per second in the usual rendering sense.

What relation does server FPS have to tickrate?

I have a server (TF2) that has wild fluctuations in FPS - it is mostly constant at 256, but sometimes it drops as low as 30.

What kind of effect would this huge drop have for players? Bad hit registration?

Any help would be appreciated... Cheers!
Reply
#2
Revile Wrote:What relation does server FPS have to tickrate?
You'll soon get lots of answers from people who don't know the relation. They say good FPS is good for hit registration. Someone says 1000 FPS is good for hit registration. Someone says stable 500 FPS is better than 1000 FPS for good hit registration. Then someone claims that tickrate 66 is better than good FPS. Nobody actually knows the *relation*. Some might offer some general blabber about kernel Hz.

I've been trying to solve the FPS/tickrate relation. I'm still in the process.

I'll post some preliminary thoughts about the thing.

Every "tick" consists of a certain number of frames. The more frames there are, the more accurately the server can calculate where bullets fly and where players are. The more ticks there are, the more information players get from the server, and thus the more precise their perception of the game situation.
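To make that relation concrete, here's a small back-of-the-envelope sketch (Python, with illustrative FPS values of my own; a 66 tick server is assumed):

```python
# Back-of-the-envelope sketch: how many server frames land inside one
# tick, for a few illustrative FPS values (assumed, not measured).
TICKRATE = 66  # ticks per second (typical for TF2/CS:S servers)

def frames_per_tick(fps, tickrate=TICKRATE):
    """Average number of frames the server computes per tick."""
    return fps / tickrate

for fps in (66, 256, 500, 1000):
    print(f"{fps:4d} FPS -> {frames_per_tick(fps):.1f} frames per tick")
```

More frames per tick means more intermediate calculation steps between the updates that actually go out to players.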

The FPS *will fluctuate* no matter what. Here are two graphs generated by srcds plugin for Munin. In the first picture the FPS is limited to ~500 (fps_max 600) and in the second it's not limited at all (fps_max 0).

Limited to 500 FPS (fps_max 600)

[Attachment: fps_week.png]

Unlimited FPS (fps_max 0)

[Attachment: srcds_fps-week2.png]

It's the same server in both graphs.

In the second graph you'll see that the FPS fluctuates roughly between 600 and 900. In the first graph you'll see that the FPS on the same server, when limited by fps_max 600, fluctuates roughly between 350 and 500.

The big question is why the FPS drops so far below 500 when it's limited by fps_max, even though without the limit the FPS stays above 500 almost all the time. Common sense says the server should have even more CPU power to spare when limited to 500 FPS, because it isn't even trying to go over 500. Something's not right in the server, and I believe this happens on *every* server in the world.

I hope I can get volunteers from this forum to gather statistics from their 1000 FPS servers. With more statistics we could determine whether it is possible to achieve stable FPS on any server. I've got only one server, and it runs all kinds of other stuff that might affect the test results. PM me on this forum if you want to get (FPS) statistics from your server.


--

About the 1000 Hz thing.

It seems the 1000 Hz thing has very little to do with server FPS. I'm running a 100 Hz kernel now, but I get nearly 999 FPS.

There are patches for the Linux kernel which allow high-resolution timers with non-1000 Hz kernels. The Hz value determines how often the kernel can redistribute computing time in the system.

I haven't completely finished my research on what kernel Hz is and how timers work in the kernel. The problem (at least in some programs) is that it's impossible to sleep for a shorter time than one timer tick (1/Hz seconds) - unless the kernel has high-resolution timers... or something like that.
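One way to see the timer granularity on your own machine is to measure how long a short sleep actually takes (a rough sketch in Python just for illustration; srcds itself is obviously not Python, and the numbers you get depend entirely on your kernel and timer setup):

```python
import time

def average_sleep(requested=0.001, samples=20):
    """Measure how long a requested sleep actually takes, on average.
    On a low-Hz kernel without high-resolution timers, a 1 ms sleep
    can be stretched to a whole timer tick (e.g. ~10 ms at 100 Hz)."""
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(requested)
        total += time.perf_counter() - start
    return total / samples

avg = average_sleep()
print(f"requested 1.00 ms, got on average {avg * 1000:.2f} ms")
```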

I think the initial idea behind the 1000 Hz thing is that a lower-Hz kernel might keep running some other program for a full 10 ms while the game server needed processing. A 1000 Hz kernel makes it possible for the server to interrupt other programs at 1 ms intervals and start processing itself. Anyway, I don't think this is true in new kernels.

Note: The latter part of this post is partly speculation.
Reply
#3
One thing I know is that unstable servers are where most of the players complain about bad registration. Could it be a coincidence, though? Sure thing. I can get a decently stable 512 FPS server, and what seems to matter greatly (in Windows only, I presume) is to set affinities. Without affinities, my "512 server" would go down to 476 and jump back up. With affinities, it drops to 511.9... close enough :P

CSS, what statistics program do you use? The one you made?
realchamp Wrote:
Hazz Wrote:Has someone helped you on these forums? If so, help someone else
Mooga Wrote:OrangeBox is a WHORE.
Reply
#4
Spartanfrog Wrote:One thing I know is that unstable servers are where most of the players complain about bad registration. Could it be a coincidence, though? Sure thing. I can get a decently stable 512 FPS server, and what seems to matter greatly (in Windows only, I presume) is to set affinities. Without affinities, my "512 server" would go down to 476 and jump back up. With affinities, it drops to 511.9... close enough :P
Could it be just affinities? I bet you also have some über fast quad-core server processor machine which could do 512 FPS upside down. In that light it doesn't seem right that it's just the affinity that stops the fluctuation. How much have you tested this affinity thing?

(Possibly unrelated, but maybe not...)
I've noticed that many players nowadays are not capable of rendering a stable 66+ FPS. That's why they will "lag" on a 66 tick server: they fail to send 66 updates per second, and they fail to make use of the 66 updates per second. They're like rate-hackers, but they can't help it. Could this "coincidence" be behind it when players complain about bad hitreg? Players would complain about bad hitreg because they can't hit the low-FPS players due to their low update rates (a client can't send more updates per second than its FPS). Then the server admin takes a look at the server FPS and draws some hasty conclusions... coincidence... maybe...

Spartanfrog Wrote:CSS what statistic program do you use? The one you made?
Exactly.

This one http://forums.srcds.com/viewtopic/7026
Reply
#5
I doubt bad reg could be caused by client-side FPS, because then reg would be bad pretty much all the time. I personally think most of the bad reg is in those uptight CAL players' heads. But who really knows for sure.

Actually, I can get three stable 512 FPS servers running on an X2 2.2 GHz Athlon 64. So far the only "tests" I have done were just noticing the FPS is slightly more stable.
Reply
#6
Spartanfrog Wrote:I doubt bad reg could be caused by client-side FPS, because then reg would be bad pretty much all the time. I personally think most of the bad reg is in those uptight CAL players' heads. But who really knows for sure.
Have you ever seen someone with bad rates? The movement is jerky and the player is difficult to hit. That's bad hitreg. The same effect can be achieved with bad FPS, even if the rate settings are OK.

The low FPS theory works like this.

The server is 66 tick. The player's FPS goes up and down between 20 and 60 because the map is too heavy for his computer. The server has somewhat strict rules: "sv_minupdaterate 50" and "sv_mincmdrate 50". Now, the client (i.e. the player) says when joining the server: "okay, I'll do 50 updates per second". Obviously the client doesn't know that the computer isn't capable of doing 50 updates per second. The client and the server now both assume that there will be 50 updates per second. In reality the client's FPS swings from 20 to 60, and during that time it sends updates at varying intervals. The server tries to compensate for the "lost" packets all the time, but it does a bad job because the client keeps changing the rate at which it sends updates (because its FPS is fluctuating). The server can't do an accurate simulation, and thus, bad hitreg.
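A toy model of that mismatch (all numbers invented for illustration; this is not how the engine counts packets, just the arithmetic of the argument):

```python
import random

random.seed(1)  # fixed seed so the toy run is repeatable

AGREED_RATE = 50  # updates/s the client promised (sv_minupdaterate)
SECONDS = 10

# The client's FPS swings between 20 and 60 each second; it can never
# send more updates per second than frames it actually renders.
sent = [min(AGREED_RATE, random.randint(20, 60)) for _ in range(SECONDS)]

promised = AGREED_RATE * SECONDS
print(f"promised {promised} updates, actually sent {sum(sent)}")
```

Every second the FPS dips under 50, the client silently falls short of the rate it agreed to, and the server has to guess.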

I've been working on a Linux iptables filter (i.e. a firewall rule set) which will drop low-FPS players. The filter will first measure how many packets each player sends to the server per second. Then, if the filter doesn't see, for example, 50 packets per second from a certain host, it'll add the host to a ban list. That way it's impossible for any player to play on the server with bad rates, which should make *everybody* on the server play with reasonably high rates. The current game server has options to "force" the client to use the requested rate limits, but they don't do anything if the player's computer is the limiting factor. The server settings only change the player's rate settings accordingly; they do nothing if the player still sends only 20 updates per second because of low FPS.
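The counting logic could be sketched like this (plain Python, not the actual iptables rules; the threshold and host addresses are made-up examples):

```python
from collections import Counter

MIN_PACKETS_PER_SEC = 50  # example threshold from the post

def low_rate_hosts(packet_log, duration):
    """packet_log: one source-host entry per packet seen during
    `duration` seconds. Returns the hosts that fall under the
    threshold, i.e. the candidates for the ban list."""
    counts = Counter(packet_log)
    return {host for host, n in counts.items()
            if n / duration < MIN_PACKETS_PER_SEC}

# One second of traffic: one player sends 60 packets, another only 20.
log = ["10.0.0.1"] * 60 + ["10.0.0.2"] * 20
print(low_rate_hosts(log, duration=1.0))  # only the 20-packet host
```

In real iptables the same idea would map onto rate-matching rules rather than a script, but the decision per host is the same.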
Reply
#7
Well, I'll move my issue to this thread.

I'm doing a lot of experiments to see why FPS drops on my server.

One thing I have noticed is that when I'm running only one server, the FPS is stable. Once I start more servers, even though they're empty, the FPS jumps around too much.

SourceTV:
I tried turning SourceTV on and off on all servers. It looks like it helps, but I don't think it's the main cause.

EDIT:

I think I might be getting somewhere. I set fps_max to 60 on all servers, then set one to 500+.
There was very little jumping... 60 might be unplayable, though, so I might write a script that sets all empty servers to 60 FPS and, for each server with players in it, sets fps_max to (500 / number of populated servers).

This way the total FPS across all my servers would be around 600-800.
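The budget split described above could be sketched roughly like this (Python pseudologic with assumed numbers; `fps_for_server` is a hypothetical helper, not an srcds command):

```python
def fps_for_server(has_players, populated_count,
                   total_budget=500, idle_fps=60):
    """fps_max for one server under the scheme described above:
    empty servers idle at idle_fps, populated servers split the
    FPS budget evenly between them."""
    if not has_players:
        return idle_fps
    return total_budget // max(populated_count, 1)

# Example: 3 of the servers currently have players in them.
print(fps_for_server(True, 3))   # each populated server's fps_max
print(fps_for_server(False, 3))  # each empty server's fps_max
```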
Reply
#8
Hmm, interesting, thanks for all the replies.

The machine we're using is (afaik) a quad-core Xeon. We're running about 15 servers on it (not all TF2). The fps_max hasn't been changed from the default of 300, so I might try raising that and then see what minimum FPS we hit. It seems to me that dropping to 30 would cause bad hit reg even if all clients have (and can handle) high rates. If the server is only calculating 30 FPS, then on a 66 tick server doesn't that mean things will only be updated (roughly) every 2nd tick?
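A quick check of that arithmetic (Python, using the numbers from this post):

```python
TICKRATE = 66  # ticks per second
fps = 30       # the low point seen on the server

frame_interval_ms = 1000 / fps       # time between server frames
tick_interval_ms = 1000 / TICKRATE   # time between ticks

ticks_per_frame = frame_interval_ms / tick_interval_ms
print(f"one frame every {frame_interval_ms:.1f} ms, "
      f"one tick every {tick_interval_ms:.1f} ms -> "
      f"about {ticks_per_frame:.1f} ticks pass per frame")
```

So at 30 FPS roughly 2.2 ticks go by between frames, which matches the "every 2nd tick" intuition.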
Reply
#9
As long as the FPS is above the tickrate you won't notice any lag; it'll work perfectly fine. The higher the FPS, though, the "smoother" the game will run.
Join the Source Dedicated Server Support Group on Steam Community!
Source Dedicated Server (SRCDS)
Free to join, Live support! (When available)

http://forums.srcds.com/viewtopic/5114
Reply
#10
Drocona Wrote:As long as the FPS is above the tickrate you won't notice any lag; it'll work perfectly fine. The higher the FPS, though, the "smoother" the game will run.
I have to disagree a little.

On a 66 tick server running at 66 FPS, the game simulation update interval is 15 ms (1000 ms / 66). However, the players can still get more detail than that, because the players' updates are not synchronized to arrive at the same time. They can be "interleaved", or however you'd describe it.


Here's an example I just wrote. Assume there are two players and two servers. The players take a quick scrim on both servers to see which one is better.

| = Input from player
F = Server calculates frame
. = Nothing happens

Code:
Player 1: |..............|..............|..............|..............|..............|.....
Player 2: .......|..............|..............|..............|..............|.............

Server 1: F..............F..............F..............F..............F..............F.....
Server 2: F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F

Time    : 123456789012345678901234567890123456789012345678901234567890123456789012345678901
                   1         2         3         4         5         6         7         8
See what happens at time 8. The servers have just received updates from P1 (time 1) and P2 (time 8). However, Server 1 has not yet updated the world simulation at time 8; it will next update it at time 16. Server 2 has more frames, so it updates the world simulation right after it receives the update from P2 at time 8. Thus, Server 2 keeps the world simulation up to date more accurately.

Server updates sent to players are excluded from the ASCII graph, because the server most likely keeps a separate update interval for each player. That way the server sends one update every 15 ms to each player, but not at the same moment for every player.
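The diagram can be turned into a toy simulation that measures how long each input waits for the next server frame (a sketch with assumed arrival times; frames are simplified to land at exact multiples of the frame interval):

```python
def wait_times(inputs, frame_interval):
    """For each input arrival time, how long it waits until the next
    server frame. Frames are simplified to land at exact multiples
    of frame_interval (0, frame_interval, 2*frame_interval, ...)."""
    waits = []
    for t in inputs:
        if t % frame_interval == 0:
            waits.append(0)  # a frame runs at the same instant
        else:
            next_frame = (t // frame_interval + 1) * frame_interval
            waits.append(next_frame - t)
    return waits

inputs = [1, 8, 15, 22]        # assumed arrival times, as in the diagram
print(wait_times(inputs, 15))  # ~66 FPS server: long, uneven waits
print(wait_times(inputs, 1))   # ~1000 FPS server: picked up immediately
```

With 15 ms frames the inputs wait up to 14 time units before being processed; with 1 ms frames every input is picked up on the very next frame, which is the whole point of high server FPS in this argument.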
Reply
#11
Thanks for writing that down. As I said, higher FPS will make things smoother, but as long as the FPS is higher than the tickrate, lag won't be seen anywhere (actually you won't notice FPS lag until the FPS drops below about 20... depending on how you define lag).

I didn't feel like explaining that whole thing again :P
Reply

