12-16-2008, 06:03 PM
(This post was last modified: 12-16-2008, 08:17 PM by afterhoursgaming.)
Well, I have been browsing this forum for a while, and when I saw this I had to reply, because I myself am curious to see how a 2000fps server is possible.
Here is the technical explanation, as I understand it, of why a 1000fps server is superior to a 100fps server.

Imagine two servers, both running at 100 tick: one runs at 100fps, the other at 1000fps. The 100fps server does a calculation every tick, which is basically every frame; however, from the time a frame is run to the time the tick calculation happens, the two can be out of sync. Say, just for example purposes, that a tick is taken each second but the frame is taken at the half of each second: you now have half a second between the time a tick is taken and the time a frame is taken. Worst case, the tick is taken each second and the frame is taken a millisecond after the tick, so the next tick is working from data that is almost an entire second old. Our 1000fps server, on the other hand, runs about 10 frames for every tick, so even if frames and ticks are out of sync the data feeding each tick is fresher; at real rates it is at most about a frame's worth, roughly 1 millisecond, old instead of up to 10 milliseconds on the 100fps server. (Also note that ticks aren't really taken each second; that was just for example purposes.)

Now, as we all know, a tick calculation is nothing but data on a player's position, his aim, his direction, and what he is doing, but it also does one more thing: it predicts where the player will be by the next tick. This is known as interpolation. A 1000fps server will be able to give more accurate data for that prediction than a 100fps server could, and this is where the so-called pro gamer says that 1000fps is better than 100fps: if the server has more accurate data, then in theory it can more accurately predict where the client will be before the next tick. This is by no means a total fact, because this may not be how srcds actually does interpolation; it may only use the very last frame before the tick. Since pro gamers use tweaked configs and "proper" rates, they benefit the most from higher-fps servers, since they are sending data more often than someone with, let's say, not-so-great settings.
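To put that timing argument in concrete numbers, here is a quick back-of-the-envelope sketch in Python. It's a toy model, not actual srcds code; the tickrate and fps values are just the ones from the example above.

# Toy model: how stale can the most recent frame's data be
# when a tick calculation runs? (Not actual srcds code.)

TICKRATE = 100                      # ticks per second -> one tick every 10 ms
TICK_INTERVAL_MS = 1000.0 / TICKRATE

for server_fps in (100, 1000):
    frame_interval_ms = 1000.0 / server_fps
    # Worst case: the last frame ran a full frame interval before the tick,
    # so the data the tick uses is at most one frame interval old
    # (it can never be older than a whole tick).
    worst_staleness_ms = min(frame_interval_ms, TICK_INTERVAL_MS)
    print(f"{server_fps:>5} fps server: data feeding a tick is at most "
          f"~{worst_staleness_ms:.1f} ms old")

# Output:
#   100 fps server: data feeding a tick is at most ~10.0 ms old
#  1000 fps server: data feeding a tick is at most ~1.0 ms old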
Now, I read on a thread somewhere (I forget the place) that an srcds server CAN'T run at more than 1000fps, and not because of the hardware or the machine it's running on. I read that it's because after each frame is run there is a sleepwaittime variable that forces the server to wait a bit before running another frame, which limits its ability to surpass 1000fps without modifying the game itself. The person who did modify the game was actually able to produce a server fps of around 34,000, but it was unplayable. Please correct me if I'm wrong, but has anyone here ever actually produced a 2000fps server? If so, then let's get this thread back on the original topic.
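Just to illustrate the idea of that cap, here is a toy Python loop. This is purely hypothetical (the loop shape and the 1 ms value are assumptions for illustration, not how the engine is actually written), but it shows why a forced wait of about 1 ms per frame keeps you at roughly 1000fps no matter how fast the box is.

# Toy illustration: a fixed minimum sleep between frames caps FPS.
# (Hypothetical loop shape; real srcds internals may differ.)
import time

MIN_SLEEP_S = 0.001   # suppose the engine always waits >= 1 ms per frame

def run_frames(duration_s=1.0):
    frames = 0
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        # per-frame work would go here (player data, physics, etc.)
        time.sleep(MIN_SLEEP_S)   # the forced wait before the next frame
        frames += 1
    return frames

# Even if the per-frame work were free, 1 ms of forced sleep per frame
# means at most ~1000 frames fit into each second (usually fewer,
# since sleep() tends to overshoot slightly):
print(run_frames(), "frames in one second")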
*Windows lack of output*
You: Hey, I want to run this program!
Windows: Ok.. It crashed... Now what? Give up?
You:...wtf...
*linux output helpful?*
You: ./My_program
Linux:...Failed!...oo kitties!
You:...wtf...