

fps_max: best setting?
#1
Simple question about the "fps_max" setting Smile

I want to know what the best setting is for this cvar. I'm not asking for a 1000 FPS setting of course; I just want to know the best fps_max to avoid adding extra milliseconds of latency on top of the player's ping.

Here is the basic explanation of this setting:

https://support.steampowered.com/kb_article.php?ref=5386-HMJI-5162 Wrote: WHY HIGHER FPS?

The key reason to run higher FPS is the render time. At 1000FPS, the server is rendering one frame every 1 millisecond (ms). This means that the worst-case adder to the player ping is only 1ms, IE: the player gets more accurate data and can get it more often.
At 300FPS it's only 3ms which is perfectly acceptable, but at 100FPS it's 10ms, which is a significant percentage of a 100 ping (10%). A player with a 100 ping would actually be getting 110ms response time from the server. Many AMD systems will only run 60FPS without the ping booster which is 17ms.
This is still not too significant but it can change the feel and response time of your server for players.
Without the FPS Boost your server will use significantly less CPU but accuracy may suffer.

So my question is, given these conditions:

War server, tickrate 66, max 100 ms player ping, max 12 players on.

I think fps_max 300 can be fair, as mentioned in the article: that results in a 103 ms final response time for a player with a 100 ms ping.

But at fps_max 100 that results in a 110 ms final response time for a player with a 100 ms ping, so I'm not sure I understand how to calculate this final figure ^^'

What would this final figure be at 200 FPS for a player with a 100 ms ping?
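For reference, here is a minimal sketch of the arithmetic the Steam article implies (worst-case added delay = 1000 / fps in milliseconds, on top of the player's ping). This is just the article's simple additive model, not a measurement of srcds itself:

Code:
# Worst-case response time implied by the Steam KB article:
# the server renders one frame every 1000/fps ms, so in the worst
# case a packet waits one full frame before it is processed.
def worst_case_response_ms(ping_ms: float, server_fps: float) -> float:
    frame_time_ms = 1000.0 / server_fps  # time between two server frames
    return ping_ms + frame_time_ms

for fps in (100, 200, 300, 1000):
    print(f"fps_max {fps:4d}: {worst_case_response_ms(100, fps):.1f} ms for a 100 ms ping")

# fps_max  100: 110.0 ms
# fps_max  200: 105.0 ms
# fps_max  300: 103.3 ms
# fps_max 1000: 101.0 ms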
#2
First of all, that article relates to Windows servers. You can easily see that they are talking about Windows because they write "Many AMD systems will only run 60FPS without the ping booster which is 17ms."

Also, I guess you have not checked whether that information is still correct, or whether it even applies to Linux at all. Connect to your server, look at the net_graph, and change the fps_max values. You won't notice any difference...
Interactive web based config creator for CS, CSS, TF2 and DODS
Creates server and client configs in an explained dialog.

You'll also find precompiled Debian gameserver kernels for download.
#3
(11-21-2010, 01:57 AM)Terrorkarotte Wrote:  First of all, that article relates to Windows servers. You can easily see that they are talking about Windows because they write "Many AMD systems will only run 60FPS without the ping booster which is 17ms."

First of all, regarding network settings, I don't think there is any difference between Linux and Windows servers...

(11-21-2010, 01:57 AM)Terrorkarotte Wrote:  Also, I guess you have not checked whether that information is still correct, or whether it even applies to Linux at all. Connect to your server, look at the net_graph, and change the fps_max values. You won't notice any difference...

Hmm, perhaps that's because the information shown in the net_graph is measured on the client, and not the server's FPS?

http://developer.valvesoftware.com/wiki/TF2_Network_Graph Wrote: Areas of the Net Graph Display

Area 3
The local connection's frames per second and round trip ping to the server are shown in area 3.
#4
Linux is not Windows...

In the right hands, and with some knowledge, the different net_graph displays can show you much more than just simple client data...

Even though many people have written to you that it is useless to increase fps_max above the tickrate, you still think it will reduce latency. At least that is what the text you posted says.

Guess what the net_graph ping displays.

BTW: http://forums.srcds.com/viewpost/93672#pid93672

I guess you will disagree with the Valve developers as well, because you can dig out some outdated texts somewhere...
Interactive web based config creator for CS, CSS, TF2 and DODS
Creates server and client configs in an explained dialog.

You'll also find precompiled Debian gameserver kernels for download.
#5
Blablabla

Every time, you come up with another point to argue about.

Since this is not the first time we have spoken in the same topic, my conclusion is that 30% of the time you post an answer to the question that was asked, and the other 70% of the time you post something that is not really related to the original question.

Talking with you is a waste of time, because you so often turn it into a debate.

I'm not here to talk about the net_graph output, about the differences between Windows and Linux in srcds network settings, about the color of the sky, or about whether or not you went to the toilet this morning...

My question is simple: what is the best "fps_max" setting?

So now I don't care what you talk about, and I will simply wait for someone who wants to answer my question Smile
#6
Simply put: that argument (the link) is outdated. Since the OrangeBox update, the ping is no longer affected by the FPS. The optimal fps_max for OrangeBox-based games is a setting that results in FPS exactly equal to the tickrate (which is fixed at 66 for CSS, but AFAIK still variable, e.g. for DoD:S).

This has (probably - I did not actually test this on Windows) nothing to do with Windows or Linux, as the fundamental architecture of the server is the same on both platforms. With the OrangeBox, Valve introduced a dedicated thread for receiving and processing network packets, so there is no longer any delay between a packet arriving at the server and srcds processing it. Before the OB update this delay added an arbitrary time to the (effective) ping, different for each packet, so it introduced an uncertainty on the order of the time between two frames of the engine.

But that was before the OB update; now there is no longer such a delay. The FPS only affect when packets are sent from the server to the clients. So the FPS still need to be (at least) as high as the tickrate, otherwise the updaterate (= the effective tickrate) is reduced. Since the updaterate does not get any higher when raising the FPS above the tickrate, higher FPS will not change anything except increase the CPU usage.
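To make that last point concrete, here is a small sketch of my own (an illustration, not srcds code): the effective updaterate is limited by whichever of server FPS and tickrate is lower, so anything above the tickrate is wasted.

Code:
# Effective updaterate of an OrangeBox server, as described above:
# the server cannot send more updates per second than it renders
# frames, and it never produces more updates than ticks per second.
def effective_updaterate(server_fps: float, tickrate: float) -> float:
    return min(server_fps, tickrate)

for fps in (50, 66, 300, 1000):
    print(f"fps_max {fps:4d} at tickrate 66 -> updaterate {effective_updaterate(fps, 66):.0f}")

# fps_max   50 -> updaterate 50  (FPS below the tickrate hurts)
# fps_max   66 -> updaterate 66  (optimal)
# fps_max  300 -> updaterate 66  (no gain, only more CPU usage)
# fps_max 1000 -> updaterate 66  (no gain, only more CPU usage)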

Does this answer your question sufficiently? :-)
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#7
(11-21-2010, 10:28 PM)BehaartesEtwas Wrote:  Simply put: that argument (the link) is outdated. Since the OrangeBox update, the ping is no longer affected by the FPS. The optimal fps_max for OrangeBox-based games is a setting that results in FPS exactly equal to the tickrate (which is fixed at 66 for CSS, but AFAIK still variable, e.g. for DoD:S).

This has (probably - I did not actually test this on Windows) nothing to do with Windows or Linux, as the fundamental architecture of the server is the same on both platforms. With the OrangeBox, Valve introduced a dedicated thread for receiving and processing network packets, so there is no longer any delay between a packet arriving at the server and srcds processing it. Before the OB update this delay added an arbitrary time to the (effective) ping, different for each packet, so it introduced an uncertainty on the order of the time between two frames of the engine.

But that was before the OB update; now there is no longer such a delay. The FPS only affect when packets are sent from the server to the clients. So the FPS still need to be (at least) as high as the tickrate, otherwise the updaterate (= the effective tickrate) is reduced. Since the updaterate does not get any higher when raising the FPS above the tickrate, higher FPS will not change anything except increase the CPU usage.

Does this answer your question sufficiently? :-)

Perfect answer, BehaartesEtwas Smile

That's why I already gave you +1 reputation.

You are helpful, and you don't just give an answer but help me understand it, and always with respect...

I like to understand things, not simply apply them, so I love answers like yours Big Grin
#8
So perhaps now we can make fps meter obsolete Smile

or

achieve stable high (1000) FPS with that meter, and then reduce it to something close to or above the tickrate?
#9
(11-22-2010, 01:26 AM)lhffan Wrote:  So perhaps now we can make fps meter obsolete Smile

Yes, kind of ;-) I plan to adjust the QI calculation and the display so that higher FPS no longer look like an advantage. Still, if the FPS drop below the tickrate the server quality is affected, so the fps meter is not completely obsolete ;-)
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!

