06-13-2008, 03:05 AM
iliketohostservers Wrote:At 250 FPS the game server is able to process 1 packet every 4ms. At 500 FPS it is able to process 1 packet every 2ms.
I think you've slightly misinterpreted the articles.
A 250 FPS server means the server checks 250 times per second what network data has arrived. In other words, the server only knows when a command packet arrived with 1/250 second accuracy. That's the 4 ms. In-game it means the server can't tell apart command packets that arrive within the same 4 ms window. For the server those commands happen at exactly the same time, even though in reality there could be up to a 4 ms difference between them. With more FPS the server knows more accurately when the event happened. On each frame it "processes", i.e. timestamps, all packets the ethernet card has received since the last frame.
On a "frame" the server doesn't do anything with the packets except mark them with a timestamp. On a "tick" it processes the data received on those frames. That's when it decides who shot first, and whether the enemy dodged the bullet in time. If the server FPS is low, the "fire" and "dodge" commands may get the same timestamp, and that's what causes "omfg I was already behind the box".
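To make the frame/tick split concrete, here's a minimal sketch of the idea. This is not Source engine code; the names (stampOnFrame, runTick, the 1 ms example timings) are made up purely to illustrate how frame-rate-limited timestamping blurs the order of commands:

Code:
// Sketch only: frames attach timestamps, ticks do the game logic with
// whatever resolution those timestamps have.
#include <cstdio>
#include <string>
#include <vector>

struct Command {
    std::string name;     // e.g. "fire" or "dodge"
    double realArrival;   // true arrival time at the NIC, in seconds
    double frameStamp;    // time the server assigns on the frame it notices it
};

// On a "frame": stamp every packet received since the last frame with the
// current frame time. Resolution is therefore 1 / serverFps.
void stampOnFrame(std::vector<Command>& cmds, double serverFps) {
    double frameLen = 1.0 / serverFps;
    for (auto& c : cmds)
        c.frameStamp = (static_cast<long>(c.realArrival / frameLen) + 1) * frameLen;
}

// On a "tick": process the stamped commands. If two commands share a stamp,
// the server has no way to know which one really happened first.
void runTick(const std::vector<Command>& cmds) {
    for (const auto& c : cmds)
        std::printf("  %-6s real %.4f s  stamped %.4f s\n",
                    c.name.c_str(), c.realArrival, c.frameStamp);
}

int main() {
    // "fire" and "dodge" arrive 1 ms apart in reality
    std::vector<Command> cmds = {{"fire", 0.0051, 0}, {"dodge", 0.0061, 0}};

    for (double fps : {250.0, 500.0}) {
        std::printf("server at %.0f FPS:\n", fps);
        stampOnFrame(cmds, fps);
        runTick(cmds);
    }
    // At 250 FPS both get stamped 0.0080 s (same frame, so "simultaneous");
    // at 500 FPS they get 0.0060 s and 0.0080 s and the real order survives.
    return 0;
}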

I've pondered this issue earlier in my post at http://forums.srcds.com/showthread.php?tid=7165&pid=39471#pid39471
The thing I don't get is why the Source server is so inaccurate about its network timestamps. If I run "tcpdump -n 'dst port 27015'" on my server, I get timestamps with microsecond accuracy. Here's a dump:
Code:
(IPs obfuscated)
19:48:56.511568 IP xy.152.233.83 > server: UDP, length 35
19:48:56.512224 IP xy.221.15.159 > server: UDP, length 64
19:48:56.512226 IP xy.124.53.17 > server: UDP, length 41
19:48:56.512838 IP xy.210.48.236 > server: UDP, length 41
19:48:56.513924 IP xy.52.189.27 > server: UDP, length 52
19:48:56.515868 IP xy.57.33.2 > server: UDP, length 35
19:48:56.517015 IP xy.29.183.175 > server: UDP, length 49
19:48:56.517018 IP xy.174.63.1 > server: UDP, length 40
In a timestamp like 19:48:56.511568, "56" is the second, "511" is the millisecond part and "568" is the microsecond part.
The server runs at 66 tickrate with 24 players.
A game server running at 100 FPS would see these commands as happening at the same time. My ~500 FPS server keeps them reasonably well apart, because it can tell packets apart down to 2 ms intervals.
I don't get why the Source server doesn't reach microsecond accuracy. I could probably pipe the output of tcpdump into the game server and still get better accuracy.
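To show what I mean, here's a small sketch that buckets the arrival times from the dump above into frame windows. It assumes frame boundaries aligned to whole seconds, which the real engine won't match exactly; it's only meant to illustrate the grouping:

Code:
#include <cstdio>
#include <vector>

int main() {
    // fractional seconds of the eight packets from the dump (xx:48:56.xxxxxx)
    std::vector<double> arrivals = {0.511568, 0.512224, 0.512226, 0.512838,
                                    0.513924, 0.515868, 0.517015, 0.517018};

    for (int fps : {100, 500}) {
        double frameLen = 1.0 / fps;
        std::printf("%d FPS (%.1f ms frames): frame indices ->", fps, frameLen * 1000);
        for (double t : arrivals)
            std::printf(" %d", static_cast<int>(t / frameLen));
        std::printf("\n");
    }
    // 100 FPS: every packet lands in frame 51 -> "all at the same time"
    // 500 FPS: frames 255, 256, 256, 256, 256, 257, 258, 258 -> distinguishable
    return 0;
}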

@iliketohostservers.
Great post. I also like the one-core-one-server concept.
If you're interested I can add server FPS tracking for some of your servers. I've already set up a neat 1000 FPS assurance for Ryan's servers.
