1000FPS Server: FPS Not Stable
#1
Hey,

Looking for a bit of help here. I am running a Team Fortress 2 server on a Debian 5.0 (Lenny) Linux machine. The machine specs are:

CPU: P4 2.8 GHz
RAM: 2 GB DDR2-667
NET: 100 Mbps

I followed the Fragaholics Linux Kernel Optimization guide: http://wiki.fragaholics.de/index.php/EN:Linux_Kernel_Optimization

I'm getting really inconsistent, unstable results when testing the server's FPS. See the graph below.
[Image: server_fps.gif]

It's in the upper 900s, but it's not flat and stable like the graphs I see from a lot of other people.

Can anyone offer any advice or tips that might improve this?
I'm also unsure whether I've set up the realtime scheduling correctly. I followed what the guide said, but the note about the sirq-hrtimer threads has me a little confused; I'm not sure what to do with that.
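
For reference, here's roughly what I tried for the realtime part. Treat it as a sketch of my setup rather than something I know is correct: the thread names assume a -rt patched kernel where the timer softirqs show up as sirq-hrtimer/N, and the priority values are my own guess, not taken from the guide.

Code:
# show the timer softirq threads with their current scheduling class and rt priority
ps -eo pid,class,rtprio,comm | grep sirq-hrtimer

# give them SCHED_FIFO priority (99 is my guess at what the guide intends)
for pid in $(pgrep sirq-hrtimer); do chrt -f -p 99 $pid; done

# the game server itself gets a slightly lower realtime priority
chrt -f -p 98 $(pidof srcds_linux)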

I'm also unclear on whether the idler is working correctly, and if it is, how do I keep it running alongside the server? Should I use screen?
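
This is how I'm currently trying to run the idler and the server together; the binary name, paths, and map are just placeholders for my install, so adjust as needed:

Code:
# idler from the guide, run at the lowest priority so it only soaks up idle time
screen -dmS idler nice -n 19 ./idler

# the server in its own detached screen session
screen -dmS tf2 ./srcds_run -game tf +map ctf_2fort +maxplayers 24

# reattach later with:  screen -r idler   or   screen -r tf2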

Thanks in advance!
#2
I wouldn't say those are actually inconsistent FPS readings. You have to understand that the FPS gets more and more unstable the higher it goes, so for a 1000FPS server those drops are nothing and you can ignore them. They certainly don't cause any lag on the server either. I'm not a real pro at this (I'm having problems with it myself as well), but I think I'm right. ^^
#3
OK, so after some more testing I opened the server up and let the public play on it. Here are the results:

[Image: server_fps_2.gif]

This is bad... does anyone have tips for this?
#4
Hi there!

Try posting your CPU usage while the FPS drops happen; I agree this is bad.
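
Something simple like this works for logging it, one sample per second (the process name is the usual srcds binary; adjust if yours differs):

Code:
# log the server process's CPU usage once per second while you watch the FPS graph
top -b -d 1 -p $(pidof srcds_linux) | grep --line-buffered srcds >> srcds_cpu.log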
#5
A P4 is a very old CPU. Do you have other servers running on it? It might just be at its limits...

(Your first graph shows a perfectly running server; if you can reach this with players on the server, you are done!)
#6
Well, that's just it... when no one is playing on the server it's fine, but as soon as people get on, it looks like the second graph. Is this a CPU limitation? At the time these graphs were taken, this was the only server running on the machine.

BehaartesEtwas,
Is there any chance you could also graph the CPU usage on that FPS graph?
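
In the meantime I'll check whether the CPU is simply maxing out when players join; I think something like this should show it (mpstat needs the sysstat package):

Code:
# watch overall CPU while players are on; a P4 is a single physical core
# (possibly with HyperThreading), so sustained ~100% would mean the CPU is the bottleneck
vmstat 1

# per-CPU view, if sysstat is installed
mpstat -P ALL 1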
#7
BehaartesEtwas,

I just found something very interesting... I set both "fps_max" and "host_framerate" to 1000 and I have a 1000FPS server. Locked and steady. No fluctuation at all, and minimal CPU usage, maybe 5% at most. However, your FPS meter sends an rcon command that changes "host_framerate" to 0, and when that happens I get all sorts of fluctuations. Is there any specific reason for this?
#8
This is intentional: host_framerate just makes "stats" always report the given value, so the improvement is completely fake...
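
So make sure host_framerate is not set anywhere in your configs and let fps_max do the real limiting. A quick way to check (the path is just an example, use your own tf/cfg directory):

Code:
# host_framerate should not appear anywhere; fps_max 1000 is the real limiter
grep -rn "host_framerate" ~/orangebox/tf/cfg/
grep -rn "fps_max" ~/orangebox/tf/cfg/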