
1000FPS Server: FPS Not Stable

Looking for a bit of help here. I am running a Team Fortress 2 server on a Linux Debian 5.0 (Lenny) machine. The machine specs are:

CPU: P4 2.8 GHz
RAM: 2GB DDR2-667
NET: 100Mbps

I followed the Fragaholics guide to Linux kernel optimization.

I'm getting really inconsistent, unstable results when testing the server's FPS. See the graph below.
[Image: server_fps.gif]

It's in the upper 900s, but it's not flat and stable like a lot of the graphs I see from other people.

Can anyone offer any advice or tips that might improve this?
I'm also unsure whether I've set up the realtime scheduling correctly. I followed what the guide said, but the note about sirq-hrtimer has me a little confused; I'm not sure what to do with it.
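For what it's worth, on the -rt kernels that guide targets, the high-resolution timer softirq runs as its own kernel thread, and the point of the sirq-hrtimer note is to give that thread a realtime priority above the game server's. A sketch of how to check and set it, assuming the thread shows up under a name containing "hrtimer" (the exact name varies by -rt kernel build, and the priority value 98 is just an illustration):

```shell
# List kernel threads related to the hrtimer softirq and their
# current realtime priorities (RTPRIO column; "-" means none)
ps -eo pid,rtprio,comm | grep -i hrtimer

# Raise the first matching thread to SCHED_FIFO priority 98
# (requires root; 98 is an example value, pick one above srcds's priority)
pid=$(pgrep -f hrtimer | head -n 1)
[ -n "$pid" ] && chrt -f -p 98 "$pid"

# Confirm the scheduling policy and priority took effect
[ -n "$pid" ] && chrt -p "$pid"
```

If `ps` shows no hrtimer thread at all, the kernel probably isn't an -rt build and that part of the guide doesn't apply.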

I'm also unclear on whether the idler is working correctly, and if it is, how do I keep it running alongside the server? Should I use screen?
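If the question is just how to keep the idler alive after you log out, GNU screen works fine; a sketch (the idler binary name here is a placeholder, substitute whatever your guide's idler is actually called):

```shell
# Start the idler in a detached screen session named "idler"
# ("./idler" is a placeholder for your actual idler binary/path)
screen -dmS idler ./idler

# List sessions to confirm it is running
screen -ls

# Reattach to watch its output; press Ctrl-a then d to detach again
screen -r idler
```

You can run srcds itself the same way in a second session (e.g. `screen -dmS tf2 ./srcds_run ...`), so both survive the SSH session closing.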

Thanks in advance!
I wouldn't say those are actually inconsistent FPS. You have to understand that the FPS reading gets more and more unstable as the frame rate increases, so for a 1000FPS server your drops are nothing and you should ignore them. They surely also don't cause any lag on the server. I'm not a real pro at this (I'm having problems with this myself as well), but I think I'm right^^
OK, so after some more testing I opened it up and let the public play on it. Here's the results:

[Image: server_fps_2.gif]

This is bad... anyone have tips for this?
Hi there!

Try posting your CPU usage at the moment the FPS drops happen; I agree this is bad.
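If you don't have a monitoring tool handy, something like this minimal /proc/stat sampler (a sketch, not part of any guide) logs overall CPU usage once per second so you can line it up with the timestamps on the FPS graph:

```shell
#!/bin/sh
# Log total CPU usage once per second by diffing /proc/stat counters.
# Run this while the FPS graph is recording, then compare timestamps.
prev_idle=0; prev_total=0
i=0
while [ "$i" -lt 5 ]; do            # 5 samples; raise for a longer capture
    # /proc/stat "cpu" line fields: user nice system idle iowait irq softirq ...
    set -- $(grep '^cpu ' /proc/stat)
    shift                           # drop the "cpu" label
    idle=$4
    total=0
    for v in "$@"; do total=$((total + v)); done
    if [ "$i" -gt 0 ]; then
        d_total=$((total - prev_total))
        d_idle=$((idle - prev_idle))
        [ "$d_total" -gt 0 ] || d_total=1   # guard against division by zero
        busy=$((100 * (d_total - d_idle) / d_total))
        echo "$(date +%T) cpu=${busy}%"
    fi
    prev_idle=$idle; prev_total=$total
    i=$((i + 1))
    sleep 1
done
```

Run it on the box while players join; if the busy percentage pegs near 100% exactly when the FPS graph collapses, the P4 is the bottleneck.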
A P4 is a very old CPU. Do you have other servers running on it? It might just be at its limits...

(Your first graph shows a perfectly running server; if you can keep that with players on the server, you are done!) (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
Well that's just it... when no one is playing on the server it's fine, but as soon as people get on the server it looks like the 2nd graph. Is this a CPU limitation? At the time these graphs were taken, this was the only server running on this machine.

Is there any chance you could also graph the CPU usage on that FPS graph?

I just found something very interesting... I set my "fps_max" and "host_framerate" to 1000 and I have a 1000FPS server. Locked and steady, no fluctuation at all, and minimal CPU usage, maybe 5% at most. However, your FPS meter sends an rcon command to set "host_framerate" back to 0, and when that happens I get all sorts of fluctuations. Is there a specific reason for this?
That's intentional: "host_framerate" just makes the stats always report the given value, so the improvement is completely fake...
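For reference, here are the two cvars discussed above as they would appear in server.cfg (values are just the ones from this thread). Keeping host_framerate at 0 is what gives real measurements; any nonzero value pins the reported frame time, which makes the graph flat but meaningless:

```cfg
// server.cfg: server-side frame rate settings (example values)
fps_max 1000        // cap the server at ~1000 FPS (needs a suitable kernel HZ)
host_framerate 0    // 0 = measure real frame times; nonzero fakes a fixed rate
```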
