10-02-2007, 03:21 PM
I'm currently running Windows Server 2003 and have noticed that the server sustains its tickrate best right after a system restart; after about an hour it degrades to around 60-80. Server fps holds at 512 right after a reboot, but an hour later it becomes highly variable, anywhere from 60 to 500. Can anyone explain why this is happening, and is there a way to fix it? Is there a memory leak in srcds, or is there some system setting I should be using?
I have the srcds Windows kernel timer resolution program running. I have also tried setting processor scheduling and memory usage to favor programs instead of background services; either way the same problem happens.
System specs: dual-core Opteron 1216, 2 GB RAM, running Microsoft Windows Server 2003 Standard Edition.
Any suggestions will help. Thank you.