03-20-2009, 11:28 AM
Are server FPS actually important? Judging from my graph above, is that normal server behavior, or should I switch to a different host?
Thanks for the help.
03-21-2009, 05:00 PM
That's a graph from HLstatsX, which, by the way, IS a bad reference for server FPS. I have a server that sits empty most of the day and has remarkably stable FPS, but my HLstatsX graph doesn't show it correctly. If you want an accurate gauge, wait until the server has some players, visit http://fpsmeter.fragaholics.de, enter your server info, wait about ten minutes, and check the graph.
*Windows' lack of output*
You: Hey, I want to run this program!
Windows: OK... It crashed... Now what? Give up?
You: ...wtf...
*Is Linux output helpful?*
You: ./My_program
Linux: ...Failed! ...ooh, kitties!
You: ...wtf...
04-21-2009, 12:31 PM
WHY HIGHER FPS?
The key reason to run higher FPS is the frame (render) time. At 1000 FPS, the server renders one frame every 1 millisecond (ms), so the worst-case addition to a player's ping is only 1 ms, i.e., the player gets more accurate data and gets it more often. At 300 FPS it's about 3 ms, which is perfectly acceptable, but at 100 FPS it's 10 ms, a significant fraction of a 100 ms ping (10%): a player with a 100 ping would actually see 110 ms of response time from the server. Many AMD systems will only run at 60 FPS without the ping booster, which is 17 ms. That is still not dramatic, but it can change the feel and responsiveness of your server for players. Without the FPS boost your server will use significantly less CPU, but accuracy may suffer. From http://supportwiki.steampowered.com/wiki/Optimizing_a_Dedicated_Server
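The arithmetic above is just 1000 / FPS. A quick sketch of those numbers (the `frame_time_ms` helper name is mine, not from any SRCDS tool):

```python
# Worst-case extra latency the server adds at a given frame rate:
# the server handles input once per frame, so a packet can wait up
# to one full frame interval (1000 / fps milliseconds) for processing.

def frame_time_ms(fps: float) -> float:
    """Worst-case per-frame delay in milliseconds."""
    return 1000.0 / fps

for fps in (1000, 300, 100, 60):
    extra = frame_time_ms(fps)
    # relative to a 100 ms ping, the added delay is `extra` percent
    print(f"{fps:>5} FPS -> {extra:.1f} ms worst case "
          f"({extra:.0f}% of a 100 ms ping)")
```

This reproduces the post's figures: 1 ms at 1000 FPS, roughly 3 ms at 300, 10 ms at 100, and about 17 ms at 60.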
04-21-2009, 01:47 PM
These figures are not based on solid evidence, though. No one except Valve really knows how the net code and srcds operate. Besides, anything above 1,000 FPS is useless, because the gain is still only 1 ms; even a million FPS wouldn't be any different. In the end it all comes down to interpolation and client-side rates. Unless you're running on a LAN, let's be honest: 1000 FPS isn't going to give you much benefit over 500 FPS, or over something slightly more stable. Especially when you factor in that the server's tickrate is only 100 at best.
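The diminishing-returns point is easy to check numerically. A small sketch under the post's assumptions (a 100-tick server; `latency_gain_ms` is a name I made up for illustration):

```python
# The server simulates the world only once per tick; at tickrate 100
# that is a fixed 10 ms step, which dwarfs the sub-millisecond
# savings from pushing the frame rate past 1000 FPS.

TICK_INTERVAL_MS = 1000.0 / 100  # 100-tick server -> 10 ms per tick

def latency_gain_ms(fps_low: float, fps_high: float) -> float:
    """Worst-case latency saved by raising the frame rate."""
    return 1000.0 / fps_low - 1000.0 / fps_high

print(latency_gain_ms(500, 1000))        # 1.0 ms saved going 500 -> 1000 FPS
print(latency_gain_ms(1000, 1_000_000))  # 0.999 ms saved even at a million FPS
```

In other words, going from 1000 FPS to a million FPS saves less than 1 ms, while the simulation itself still only advances every 10 ms.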
04-24-2009, 02:59 AM
(This post was last modified: 04-24-2009, 03:01 AM by BrutalGoerge.)
If you weren't watching the graph all the time, I'm pretty sure no one would notice 300 vs. 500 vs. 1000. I've tried my hardest to 'feel' any difference from 333 to 1000 in-game, but it hasn't happened.
I even got 12 people together on a 100-tick CS:S dust2 server, played for an hour at 300 FPS, then played at 1000 FPS for an hour. I didn't tell them I was changing it, and no one seemed to notice any difference. I asked them right after I changed it, and again after the hour was over. Nothing. And the FPS graph was a pretty stable 980 and 300, respectively. So I say all that matters is what happens in-game, and if people can't feel/see any difference between 300/500/1000 FPS, why bother with it? GoldSrc games are different, though; hit registration seemed to suffer at anything less than 500, at least with my setup.