2000 & 3000 FPS?
#46
Well, I have been browsing this forum for a while, and when I saw this I had to reply, because I myself am curious to see how a 2000 fps server is possible.
Here is the technical explanation, as I understand it, of why a 1000 fps server is superior to a 100 fps server. While a 100 tick server only takes calculations once per tick, imagine this if you will: we have two servers, one running at 100 fps and the other at 1000 fps. The 100 tick server takes a calculation every tick, which is basically every frame; however, from the time the frame is taken to the time the tick calculation is made, the two can be out of sync. Say a tick is taken each second but the frame is taken at the half of each second: you now have half a second between the time a frame is taken and the time a tick is taken. In the worst case, the tick is taken each second and the frame is taken a millisecond after the tick, so almost an entire second passes between the frame being taken and the tick that uses it. (Please note ticks aren't really taken each second; that's just for example purposes.)

Now take our 1000 fps server, which gets ten frames every tick. Even if they are out of sync, the time since the last calculation is still more accurate than on the 100 fps server; at most it could be about 10 milliseconds off.

As we all know, a tick calculation is nothing but data on a player's position, his aim, his direction and what he is doing, but it also does one more thing: it predicts where the player will be by the next tick. This is known as interpolation. A 1000 fps server will be able to give more accurate data for that prediction than a 100 fps server could. This is where the so-called pro gamer says that 1000 fps is better than 100 fps: if the server has more accurate data, then in theory it can more accurately predict where the client will be before the next tick. This is by no means a total fact, because this may not be how srcds actually does interpolation; it may only use the very last frame before the tick. Since pro gamers use tweaked/improved configs and "proper" rates, they benefit the most from higher-fps servers, since they send data more often than someone with, let's say, "not so great" settings.
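To put rough real-world numbers on that, here's a minimal back-of-the-envelope sketch. It is not srcds code; it simply assumes perfectly periodic frames and ticks that are out of phase, and shows how stale the most recent frame can be at the moment a tick fires.

Code:
/* Back-of-the-envelope sketch (not srcds code): how stale the most recent
 * frame can be at the moment a tick fires, for a 100 fps vs a 1000 fps
 * server. Assumes perfectly periodic frames and ticks that are out of phase. */
#include <stdio.h>

static double worst_case_staleness_ms(double frames_per_sec)
{
    /* With periodic frames, the last frame before a tick is at most
     * one full frame interval old. */
    return 1000.0 / frames_per_sec;
}

int main(void)
{
    double rates[] = { 100.0, 1000.0 };
    for (int i = 0; i < 2; i++)
        printf("%6.0f fps server: frame data is at most %.1f ms old at tick time\n",
               rates[i], worst_case_staleness_ms(rates[i]));
    return 0;
}

With real rates (100 ticks/s), that works out to at most about 10 ms of stale data on a 100 fps server versus about 1 ms on a 1000 fps server.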

Now, as I read on a thread somewhere (I forget where), a srcds server CAN'T run at more than 1000 fps, and not because of the hardware or the machine it's running on. I read that it's because after each frame is taken there is a sleepwaittime variable that forces the server to wait for a bit before taking another frame, thus limiting its ability to surpass 1000 fps without modifying the game itself. The person who did modify the game was actually able to produce a server fps of around 34,000, but it was unplayable. Please correct me if I'm wrong, but has anyone here ever produced a 2000 fps server? If so, then let's get this thread back on the original topic.
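For illustration, here's a tiny mock-up of the claimed mechanism, not anything taken from srcds: a frame loop that sleeps roughly 1 ms per frame can never exceed about 1000 frames per second, and usually manages less because the real sleep depends on the kernel timer granularity.

Code:
/* Mock-up of the claimed mechanism, not actual srcds code: a frame loop that
 * sleeps ~1 ms per frame can never exceed ~1000 frames per second, and usually
 * manages less because the real sleep depends on the kernel timer granularity. */
#include <stdio.h>
#include <sys/time.h>
#include <unistd.h>

static double now_seconds(void)
{
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec + tv.tv_usec / 1e6;
}

int main(void)
{
    double start = now_seconds();
    long frames = 0;

    while (now_seconds() - start < 1.0) {
        /* ... one server frame would run here ... */
        usleep(1000);   /* ask for a 1 ms sleep; the real pause is often longer */
        frames++;
    }
    printf("frames completed in one second: %ld\n", frames);
    return 0;
}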
*Windows lack of output*
You: Hey, I want to run this program!
Windows: Ok.. It crashed... Now what? Give up?
You:...wtf...
*linux output helpful?*
You: ./My_program
Linux:...Failed!...oo kitties!
You:...wtf...
#47
That's a rather confusing argument which I'm not sure I agree with, personally, as we don't really know what happens in a "frame" and a "tick" with regard to where the user input goes and when the processing is done.

Something you need to consider is that clients are typically going to be at 100 ticks (and frames, but this is largely irrelevant) per second. Now, obviously, due to network links and timing, all of the clients' ticks aren't going to arrive at the same time. With a 100 tick system you should theoretically have 10 ms to get your tick in before the next tick comes up, right?

That means you've got 9 milliseconds to get your tick in, so it can be 0 to 9 milliseconds away from another client's tick, right?

That'd lead me to say that the average tick distance would be around 5-ish milliseconds in relation to the other clients, making calculations only accurate to every 5 ms, or 200 Hz. Here's a bad ASCII representation of what I'm trying to say.

Code:
0         1         2         3         4         5 // Ticks
012345678901234567890123456789012345678901234567890 // Frames, or milliseconds (assuming 100 ticks/s)
 +-       ±         +        -    +    -     +-     // Player updates, + for client 1, - for client 2, ± for both on the same frame

I guess what I'm really thinking is "is there any point running over 200fps on a 100tick server?"
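One way to play with that question: assume a client packet arrives at a uniformly random moment and the server's frames are periodic, then estimate how long the packet sits before the next frame processes it at different server frame rates. This is only a toy Monte Carlo under those assumptions, not a claim about what srcds actually does.

Code:
/* Toy Monte Carlo for the "over 200 fps" question: a client packet arrives at
 * a uniformly random moment; how long does it wait for the next server frame
 * at different server frame rates? Purely illustrative assumptions. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    double fps_list[] = { 100.0, 200.0, 500.0, 1000.0 };
    const int trials = 1000000;

    srand(12345);
    for (int i = 0; i < 4; i++) {
        double frame_ms = 1000.0 / fps_list[i];
        double total_wait = 0.0;
        for (int t = 0; t < trials; t++) {
            double offset = ((double)rand() / RAND_MAX) * frame_ms; /* arrival within a frame */
            total_wait += frame_ms - offset;                        /* time until next frame */
        }
        printf("%6.0f fps: packet waits %.2f ms on average before being processed\n",
               fps_list[i], total_wait / trials);
    }
    return 0;
}

On average the wait comes out to half a frame interval: about 5 ms at 100 fps, 2.5 ms at 200 fps and 0.5 ms at 1000 fps.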

Anyway, that probably doesn't make any sense; I'm going to go with the lack-of-sleep excuse :)


I believe you have to modify the C libraries (notably the gettimeofday() function and sleep()) to hit over 1000 fps, but it's certainly possible on FreeBSD, and most probably Linux too if you put enough time into it.
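To see why those two functions matter, here's a small test in plain POSIX C, nothing engine-specific: ask for a 0.5 ms sleep and measure how long it really takes. Depending on the kernel timer granularity and the libc, the measured sleep may come out well above 0.5 ms, which is the kind of ceiling being discussed.

Code:
/* Plain POSIX C, nothing engine-specific: ask for a 0.5 ms sleep and measure
 * how long it really takes. Whether the result is anywhere near 0.5 ms depends
 * on the kernel/libc, which is why people patch gettimeofday()/sleep(). */
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec req = { 0, 500000 };   /* request 0.5 ms */
    struct timespec before, after;

    clock_gettime(CLOCK_MONOTONIC, &before);
    nanosleep(&req, NULL);
    clock_gettime(CLOCK_MONOTONIC, &after);

    double slept_ms = (after.tv_sec - before.tv_sec) * 1000.0 +
                      (after.tv_nsec - before.tv_nsec) / 1e6;
    printf("asked for 0.5 ms, actually slept %.3f ms\n", slept_ms);
    printf("fps ceiling if every frame slept this long: %.0f\n", 1000.0 / slept_ms);
    return 0;
}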
#48
I have a hard time buying into any of this when I see or feel no difference in game with server fps as low as 50 and as high as 990.

I think the only reason to do any tweaking to get over 300 fps, or 1000 fps and higher, is if you're selling servers and want to take advantage of the suckers with money who think that 1000 fps will make their server sheer bliss.
#49
As a player myself, I do see benefits above 500 fps. I used to run my servers on just a stock kernel at about 120 fps, and registration was notably poor even with good bandwidth, ping and rates. It's noticeable even more often with servers that don't hold a constant fps.

Disco, I see your argument about clients' upload, speaking in terms of real-world updates. Mine was meant primarily for a perfect world or a LAN environment. Of course the weakest point in all of this will always be the network; there is no amount of mathematics that will solve that, it's just a hurdle we have to deal with.

Also, what about interpolation? Short of being someone who actually developed this system, is there any documentation out there with facts to support our theories on "ticks" vs "frames", and the same for interpolation? If interpolation is produced the way I mentioned, then yes, we can say there is a difference in having a high-fps server.

I do agree in part about these so-called pros being brainwashed into thinking 1000 fps is the only way to go. At the same time I believe that 100 or 200 simply doesn't compare to something like, let's say, 500 or 600; I can notice the difference myself. Given that I don't care whether I play on 1000 or 100, I can't say I'll go out and purchase a 1000 fps server simply because I want to play competitively.

Now, about modifying gettimeofday(): I can see where the CPU would benefit from that, as well as from the sleep function, but then why was it put there in the first place?
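Since the thread keeps using the word without a concrete definition, here's a generic example of what interpolating between two snapshots looks like. To be clear, this is NOT taken from srcds or any Valve documentation; whether the engine interpolates, extrapolates (predicts), or does something else entirely is exactly the open question here.

Code:
/* Generic linear interpolation between two snapshots, only to make the
 * terminology concrete. NOT taken from srcds or any Valve documentation;
 * whether the engine interpolates, extrapolates (predicts), or does something
 * else entirely is exactly the open question in this thread. */
#include <stdio.h>

struct snapshot {
    double time;      /* seconds */
    double x, y, z;   /* player position */
};

/* Estimate a position at time 'when' from two known snapshots (a.time < b.time).
 * A 'when' between the two snapshot times is interpolation; a 'when' beyond
 * b.time would be extrapolation, i.e. prediction. */
static struct snapshot lerp_snapshot(struct snapshot a, struct snapshot b, double when)
{
    double t = (when - a.time) / (b.time - a.time);
    struct snapshot out = { when,
                            a.x + (b.x - a.x) * t,
                            a.y + (b.y - a.y) * t,
                            a.z + (b.z - a.z) * t };
    return out;
}

int main(void)
{
    struct snapshot s1 = { 0.00, 0.0, 0.0, 0.0 };
    struct snapshot s2 = { 0.01, 5.0, 0.0, 0.0 };   /* 10 ms later, moved 5 units */
    struct snapshot mid = lerp_snapshot(s1, s2, 0.005);
    printf("estimated position at t = 5 ms: (%.2f, %.2f, %.2f)\n", mid.x, mid.y, mid.z);
    return 0;
}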
#50
Disco, your ASCII graph reminds me of my own shot at it :)

Here is my explanation of why a high-fps server is more accurate than a low-fps server, even though both servers are running at a higher fps than the tickrate.

http://forums.srcds.com/viewpost/39523#pid39523

You made one assumption that doesn't hold true: clients (players) quite often don't achieve the tickrate. There are statistics from a 66 tick server at http://css.setti.info/rates.html. The statistics are generated with a player rate tracking system using Linux's firewall (iptables) (IIRC the file there isn't the most up to date, so if anybody's interested, PM me for the latest version). There are usually about 3-5 players out of 24 with a rate around ~66. A few weeks ago I was also one of the ~40 raters.

You've got to give the game server credit for handling all these "variable rate players" so well. The game server can't know whether it's going to receive 40 or 60 updates from any given player in the next second. Because of these "variable rate players" I've been wondering whether it would be better for them to set the CVAR "rate" to match their average FPS. Then the game server would at least receive a predictable number of updates from all players all the time. Currently the server sometimes receives an update every 10 ms and sometimes every 25 ms, and so on.
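For anyone curious what "variable rate players" look like in numbers, here's a tiny sketch that derives an effective update rate from packet arrival times. The timestamps are made up, and this is not the iptables-based tracker mentioned above; it only illustrates the 10 ms vs 25 ms gaps.

Code:
/* Sketch of measuring a client's effective update rate from packet arrival
 * times. This is not the iptables-based tracker mentioned above, and the
 * timestamps are made up; it only illustrates the 10 ms vs 25 ms gaps. */
#include <stdio.h>

int main(void)
{
    /* hypothetical arrival times (seconds) of one client's update packets */
    double arrivals[] = { 0.000, 0.010, 0.035, 0.045, 0.070, 0.080, 0.105 };
    int n = sizeof(arrivals) / sizeof(arrivals[0]);

    double avg_interval = (arrivals[n - 1] - arrivals[0]) / (n - 1);
    printf("average interval: %.1f ms -> effective rate ~%.0f updates/s\n",
           avg_interval * 1000.0, 1.0 / avg_interval);

    for (int i = 1; i < n; i++)   /* the irregular gaps are the "variable" part */
        printf("gap %d: %.0f ms\n", i, (arrivals[i] - arrivals[i - 1]) * 1000.0);
    return 0;
}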
#51
True, that's another issue, but really we're discussing a professional gaming environment where most clients do put out 100 ticks per second (which realistically is very easy these days).

I can't see that cvar system you're describing working in any way at all; it seems to be cl_cmdrate, with the difference that it's set to what the client *should* be sending, as opposed to the limit. It'd have to be user-definable and is totally open to abuse, not to mention that FPS, and therefore rates, can vary wildly between different maps and scenarios.
#52
Yes, there will always be limitations to this in terms of networks; not everyone has the "perfect" connection. So for argument's sake, let's pretend everyone has a decent ISP and rates.
#53
Well, my work always concerns a public 24 slot server. Players want perfection there too :)

