SRCDS Steam group


happy x-mas: more than 1000 fps :-)
#46
I know about the sense (or lack thereof) of the lib, but most gamers want these senseless 10k fps. My problem is reaching it on newer AMD cores (Phenom X6).
#47
(01-14-2011, 05:46 PM)Peter_Pan123 Wrote:  I know about the sense (or lack thereof) of the lib, but most gamers want these senseless 10k fps.
Time to change that.
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#48
Hi,


You say that with OB it's pointless having more FPS than the tickrate. I have read your article, and I must say that as a gamer playing competitively, I have noticed the difference between a 350 fps and a 1000 fps server, both on the same dedicated server etc., and 1000 fps always had better reg.


I did a little poll among some friends without letting them know which server was which, and in most cases (80-90%) they noticed which one was the 1000 fps server.


Still I have doubts about this article of yours that says it's pointless having more fps than the tickrate, but I respect it given your knowledge in this matter.

Could you clarify how you got to that conclusion?

Thanks
#49
I don't actually know how you did this test. I've seen servers getting strong fps variations when they run with 66 fps instead of 1000. Keep in mind, the frame-time difference between 500 and 1000 fps (1 millisecond) is of the same order as the difference between 64 and 66 fps (roughly half a millisecond). Also - I am currently in contact with a Valve developer to clarify this - it is possible that there is still some effect if the fps do not exactly match the tickrate. So maybe 350 fps or 100 fps are worse than 66 fps. I only compared 66 and 1000 fps.
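Just to put rough numbers on that, here is a tiny Python sketch of the frame-time arithmetic (plain math, nothing server-specific; the rates are just the examples from above):
Code:
# frame time in milliseconds for a few server fps values
for lo, hi in [(500.0, 1000.0), (64.0, 66.0)]:
    dt_lo = 1000.0 / lo   # ms per frame at the lower rate
    dt_hi = 1000.0 / hi   # ms per frame at the higher rate
    print(f"{lo:g} fps = {dt_lo:.2f} ms, {hi:g} fps = {dt_hi:.2f} ms, "
          f"delta = {dt_lo - dt_hi:.2f} ms")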

Also make very sure that you do not change anything else. And to do this test right, you need to verify that your players cannot derive the state of your server from any other source. Especially the net_graph might show the server fps, so install e.g. zBlock, which does not allow those net_graph settings. Also you will want them not to know *when* you made a change: e.g. make a poll every day, but make changes at random intervals (e.g. by really rolling a die every morning to decide whether you make the change or not). If you then find a statistically significant difference, I will believe you and revise my statement :-) (ok, I will try to reproduce your result first :-D)
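If you want to automate the die roll, a minimal sketch of how that could look (Python; the file names and the way fps_max gets applied are assumptions, adjust to however your server reads its config):
Code:
# roll the die once per day (e.g. from cron), log the secret choice,
# and write an fps_max override that the server config picks up
import random
from datetime import date

choice = random.choice([70, 1000])            # fps_max 70 ~ 66 fps, or 1000
with open("fps_schedule.log", "a") as log:    # secret schedule for the later comparison
    log.write(f"{date.today()} fps_max {choice}\n")
with open("fps_override.cfg", "w") as cfg:    # assumed to be exec'd from server.cfg
    cfg.write(f"fps_max {choice}\n")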

I didn't do such an extensive test, but I did several blind switches and asked my players a couple of days later if they noticed any difference. I also did not notice any difference myself, even though I knew the state of the server.
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#50
I did a new test last night.


What people noticed when I changed the fps (I did not say to which): when it was fps_max 70, the ragdolls (game models) moved a bit chunky, and when it was 1000 fps they moved more smoothly.


Also, I would even say that the reg was better on fps_max 70, with a constant 69.x fps, than at 950-1000 fps.

I'm very surprised.
#51
How do you get something like 69.x fps? fps_max 70 is supposed to produce a steady 66.67 fps (well, typically a tiny bit below that). I suspect there is something odd with your system clock. I've seen this somewhere else, but I don't know a solution yet. Can you try setting a different clock source (see the "playing around" section of my howto)? Maybe the "idler" can also help, I don't know...
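For reference, a quick way to see which clock source the kernel is using (a minimal Python sketch; assumes the usual Linux sysfs layout):
Code:
# print the current and the available kernel clock sources
base = "/sys/devices/system/clocksource/clocksource0"
with open(f"{base}/current_clocksource") as f:
    print("current:  ", f.read().strip())
with open(f"{base}/available_clocksource") as f:
    print("available:", f.read().strip())
# switching (as root) means writing one of the available names back, e.g.
#   echo hpet > /sys/devices/system/clocksource/clocksource0/current_clocksource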
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#52
Sorry for the epic bump I guess, but I just have to say: when I run this with FPS=67 it gets 66.5 fps all the time. It's really great!
#53
I think that's no different from fps_max 70 or so without any lib...
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#54
But if you believe the sourcejunkies thread about 10000 fps: normally, at 66 fps you use every frame because of the 66 tickrate, right? There is a high risk that if one frame is missing, lag follows. The risk is lower if you use 333 or 500 fps, because there you get more frames and use fewer of them. OK, the extra frames are waste, but it's not really more CPU load to produce them.
#55
Frames do not simply get lost. They might get delayed, but then it does not matter whether you are running at 66 fps or at 10 billion fps. What can happen is the so-called aliasing effect: it becomes a problem if you run with slightly higher fps than the tickrate (like tick 66 and 68 fps) and results in a gap of almost twice the usual time between ticks once in a while - otherwise the average tickrate would be too high. But that effect becomes completely negligible once the time between two frames is much lower than the time between two ticks. 200-300 fps are very much enough; anything higher has no positive effect.
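To illustrate the aliasing effect with numbers, here is a small toy simulation (Python, pure math, no srcds involved): each tick can only be handled on the next frame boundary, and the worst-case gap between handled ticks shrinks quickly as the fps rise above the tickrate.
Code:
import math

def worst_tick_gap(tickrate, fps, seconds=10):
    # each tick is handled on the first frame boundary at or after its due time
    frame = 1.0 / fps
    ticks = [i / tickrate for i in range(int(seconds * tickrate))]
    handled = [math.ceil(t / frame - 1e-9) * frame for t in ticks]
    return max(b - a for a, b in zip(handled, handled[1:]))

for fps in (68, 100, 300, 1000):
    gap = worst_tick_gap(66.67, fps) * 1000
    print(f"{fps:5d} fps -> worst gap between handled ticks: {gap:.1f} ms "
          f"(nominal tick spacing {1000 / 66.67:.1f} ms)")
At 68 fps the worst gap comes out at almost double the ~15 ms tick spacing; by 300 fps it is already within about 2 ms of it, which matches the point that a few hundred fps is plenty.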
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#56
So 66.67 fps is the best, but 200-300 is "good" too? By "good" I mean that the so-called aliasing effect doesn't become a problem.
#57
Correct. What's important is that the fps stay that high all the time. If they drop, e.g. during a heavy battle, that's not good of course.
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#58
I have a different experience! I think there is no difference: if you start the server normally with fps_max 0 and get around 940 fps with hres on newer machines, and you then slow the fps down to fps_max 70 for around 66.66 fps, the CPU load doesn't change. Next I thought, OK, that could be, but then I started the fpsmeter for both conditions, and every time the server was full with 24 players and had around 40-50% CPU load (remember, the CPU load doesn't change!), I got fps drops to 10 and 0 fps with both conditions and got lag. Normally, when I think about it, I would say: if you have fewer fps, you will have less CPU load... but I can't accept that because of my test. If you disable hres, in my tests you get less CPU load and more stable fps.
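If you want to put numbers on the "CPU load doesn't change" part, here is a rough Python sketch that samples the CPU time of the srcds process from /proc (the PID argument and the 30-second window are just placeholders; run it once per fps_max setting under comparable player load and compare):
Code:
import os, sys, time

pid = int(sys.argv[1])                 # PID of the running srcds process
hz = os.sysconf("SC_CLK_TCK")          # kernel clock ticks per second

def cpu_seconds(p):
    # utime + stime of the process, in seconds, from /proc/<pid>/stat
    with open(f"/proc/{p}/stat") as f:
        fields = f.read().split()
    return (int(fields[13]) + int(fields[14])) / hz

t0, c0 = time.time(), cpu_seconds(pid)
time.sleep(30)                         # measurement window
t1, c1 = time.time(), cpu_seconds(pid)
print(f"average CPU usage: {100 * (c1 - c0) / (t1 - t0):.1f}% of one core")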
#59
When I'm using fps_max 300 the fps is stable at around 250; when the server is full it drops to 177 according to the server console, but via rcon it shows 240.
#60
(06-17-2011, 05:10 PM)Peter_Pan123 Wrote:  I have a different experience! I think there is no difference: if you start the server normally with fps_max 0 and get around 940 fps with hres on newer machines, and you then slow the fps down to fps_max 70 for around 66.66 fps, the CPU load doesn't change. Next I thought, OK, that could be, but then I started the fpsmeter for both conditions, and every time the server was full with 24 players and had around 40-50% CPU load (remember, the CPU load doesn't change!), I got fps drops to 10 and 0 fps with both conditions and got lag. Normally, when I think about it, I would say: if you have fewer fps, you will have less CPU load... but I can't accept that because of my test. If you disable hres, in my tests you get less CPU load and more stable fps.

HiRes timers are pure evil. I've always disabled them on any Linux boxes I use. (I only have one left; I'm ditching Linux because it's simply awful.)
http://leaf.dragonflybsd.org/~gary

“The two most common elements in the universe are hydrogen and stupidity.”