12-26-2008, 02:01 PM
I'm running a Left 4 Dead srcds and it's echoing a near-constant 29 fps on the console screen. It doesn't seem to change much at all, whether the server is full or empty.
I've been reading a few posts about how people get 300 fps or more on their servers. I skimmed through Whisper's article on tickrate and I've been trying to understand it... not too much luck as of yet.
I don't have any server performance settings in my cfg, so whatever the defaults are is what they're set to.
My server computer has a dual-core 2.5 GHz Athlon, and with the server full (8 people) CPU utilization stays around 8%. I also have 4 GB of dual-channel RAM.
I was thinking the server fps wasn't normal, and if it isn't, what could be some possible reasons? I was also a bit curious about tickrate and what most people set their servers to.
I've been playing around with a couple of settings from Whisper's article, like sv_maxupdaterate, and the console spits back "unknown command." I was wondering if anyone else has had that problem when running a Left 4 Dead server.
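For reference, here's roughly what I had been putting in my server.cfg. The cvar names come from Whisper's article, which was written for older Source games, so I'm not sure all of them still exist in L4D (sv_maxupdaterate clearly doesn't seem to):

```
// server.cfg -- rate settings copied from Whisper's tickrate article.
// These are from pre-L4D Source games; some may have been removed.
sv_maxrate 30000        // max bytes/sec sent to each client
sv_minrate 10000        // min bytes/sec sent to each client
sv_maxupdaterate 66     // this is the one that returns "unknown command" in L4D
sv_minupdaterate 33
fps_max 300             // srcds frame cap; actual fps also depends on the OS timer resolution
```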
Thanks,
Paul