fps issues
#1
Hi. I'm running an idler on my dedicated server, and when I set fps_max to 69 I get 65 - 66.5 fps on the server. Is that just the sweet spot, or is it too low?
[Image: 205t4id.png]
#2
The sweet spot would be exactly the tickrate (66.67 usually). I would recommend using the fps meter to get a more accurate value. What matters here is the average (strictly speaking, the average over a small time period, but that's difficult to measure and should not differ from the average the fps meter spits out).
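
If you want to see what that averaging amounts to, here is a minimal Python sketch (my own illustration with made-up sample values, not fpsmeter.org's actual code):

```python
# Average fps over a measurement window: frames counted divided by
# elapsed time. The samples below are hypothetical frame times in ms.
frame_times_ms = [15.2, 14.9, 15.3, 15.1, 15.0]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
print(f"average fps over window: {avg_fps:.2f}")  # ~66.23 for these samples
```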
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#3
Here's what I got on it: http://www.fpsmeter.org/p,view;127652.html
#4
66.4 looks fine; it's slightly below 66.67, but below is better than above (-> aliasing effect). Your time rms is a bit high, though, not sure if anybody can actually feel that. Maybe you can try running an idler (see the howto linked in my signature); sometimes that helps with those specific fluctuations.
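
In case it helps, this is how I would compute that number (a sketch assuming "time rms" means the RMS deviation of the individual frame times from their mean, i.e. the frame-time jitter):

```python
import math

# Hypothetical frame-time samples in ms; "time rms" is assumed here to be
# the root-mean-square deviation of the frame times from their mean.
frame_times_ms = [15.1, 14.6, 15.8, 14.9, 15.4]

mean = sum(frame_times_ms) / len(frame_times_ms)
rms = math.sqrt(sum((t - mean) ** 2 for t in frame_times_ms)
                / len(frame_times_ms))
print(f"mean frame time: {mean:.2f} ms, time rms: {rms:.3f} ms")
```

The lower that rms, the steadier the frame times.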
#5
I'm running an idler. I used the howto in your signature to set it up. Anything more I can do? I haven't installed a custom kernel yet, since the machine I'm running the game servers on also hosts some web pages.
#6
Here is the new one, now running idlers on both cores: http://www.fpsmeter.org/p,view;127734.html
#7
I am confused. I learned that fps_max 66 is better than 69/70/75 or whatever, because at 66 fps there is one frame every 15 ms, so every frame gets used. With fps_max 75 you get a frame every 13 ms, and that doesn't match 66 tick: the world is refreshed at the wrong moments and you get shots that go through.

Normally you are supposed to use 200, 333 or 1000, but is that also wrong, with the world refreshing every 3.03 ms, 5.05 ms or 15.15 ms? Wouldn't it be better to recalculate fps_max, e.g. 1000 fps -> 990 fps, so it divides evenly into the 66 tick? I would be pleased if you, BehaartesEtwas, could give some info about that. I am really confused.

-peter
#8
(06-08-2011, 03:58 PM)Peter_Pan123 Wrote:  I am confused. I learned that fps_max 66 is better than 69/70/75 or whatever [...]

http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide#Defining_your_goals

Here's some information that might be helpful.
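
To put numbers on the aliasing question, here is a rough Python sketch (my own illustration, not engine code). The server wants to simulate a tick every 15 ms, but a tick can only run when a frame happens, so a tick that falls between frames waits for the next frame boundary:

```python
# How late can a tick run, at worst, for a given (idealized, perfectly
# constant) frame time? Ticks are due every 15 ms (66.67 tick).
TICK_MS = 15.0

def worst_tick_delay(frame_ms, n_ticks=1000):
    """Worst-case wait between a tick being due and the next frame."""
    worst = 0.0
    for i in range(1, n_ticks + 1):
        due = i * TICK_MS
        frames = -(-due // frame_ms)  # ceiling division: next frame boundary
        worst = max(worst, frames * frame_ms - due)
    return worst

for frame_ms in (15.0, 13.33, 10.0, 5.0, 3.0, 1.0):
    print(f"frame every {frame_ms:5.2f} ms (~{1000 / frame_ms:7.2f} fps): "
          f"worst tick delay {worst_tick_delay(frame_ms):5.2f} ms")
```

Frame times that divide 15 ms evenly (15, 5, 3 and 1 ms, i.e. roughly 66.67, 200, 333 and 1000 fps, the "golden values") add no delay at all, while 13.33 ms (fps_max 75) can delay a tick by almost a full frame.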
#9
Hm... Whenever I set fps_max to e.g. 67, 68 or 69, it shows a constant fps of 65, but when I change it to e.g. 71 it shows 70 fps. It's as if it doesn't care about the values in between; any solution? Does fps_max have to be set at startup?
#10
fps_max does not define the fps the server "wants" to run with; it defines the maximum fps the server is allowed to run with (maybe easier to understand as the minimum time between two frames). It is not averaged, and the server still sleeps 1 ms at a time: after each 1 ms sleep it checks whether the fps are higher than fps_max, and sleeps another 1 ms if they are.

This automatically means only certain real fps values are possible. It depends a bit on how long the actual frame processing takes and how much time is lost with each fps_max check. As a rule of thumb, only integer-millisecond frame times are possible, e.g.:
15 ms = 1/66.67 fps
14 ms = 1/71.43 fps
16 ms = 1/62.50 fps
(Remember, fps is a frequency and thus the inverse of the time between frames.)

In reality the fps will typically be a bit lower, as the actual processing of the frame takes time.
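
A little Python sketch of that loop (assuming, purely for illustration, a fixed 0.2 ms of actual frame processing; the real value varies):

```python
# Simulate the limiter described above: sleep in 1 ms steps until the
# frame time is long enough that the fps no longer exceed fps_max.
def simulated_fps(fps_max, processing_ms=0.2):
    frame_ms = processing_ms
    while 1000.0 / frame_ms > fps_max:
        frame_ms += 1.0  # one more 1 ms sleep
    return 1000.0 / frame_ms

for limit in (67, 68, 69, 71, 1000):
    print(f"fps_max {limit:4d} -> ~{simulated_fps(limit):6.2f} real fps")
```

That reproduces what you are seeing: 67, 68 and 69 all end up on a 15.2 ms frame (~65.8 fps), while 71 allows a 14.2 ms frame (~70.4 fps).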
#11
http://www.sourcejunkies.de/forum/thread/1176/page/1/

I read this article. It is in German; the relevant passage translates as:

"Why these golden values and not e.g. fps_max 75 (which would have to be held constant)? Because a frame no longer arrives every 15 ms but every 13 ms, so the calculation of the ticks gets out of step (66 updates per second at 15 ms is not 75 updates per second at 13 ms), i.e. a tick is delayed until the next frame, and that is what creates this 'my shots don't register' feeling."

That is exactly the problem I was describing.

-peter
#12
So the fps I'm getting is acceptable? It's pretty stable when the server is full.
#13
Hey, that article quotes me, I am famous xD

(06-08-2011, 08:45 PM)michael_sj123 Wrote:  So the fps I'm getting is acceptable? It's pretty stable when the server is full.
Both the article and I say so, yes ;-) A somewhat more precise (and English) explanation of mine can also be found here: http://www.fpsmeter.org/p,fps.html
#14
http://www.fpsmeter.org/p,view;129051.html <- new one. 14 of 24 players online. Should I run with higher fps?

