FPS booster
#1
So I'm trying to boost my CS:S server FPS higher.

It currently runs at ~590 FPS, and I have 100 tick working.

I remember this hardware running the CS:S server at over 2000 FPS, but I just can't get the right settings to do it again.

I have fps_max 4000 in the startup command...
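For reference, a startup line with that cvar set typically looks something like this (a hypothetical example, not the actual command line from this server; game, map, and player count are placeholders):

```shell
# Hypothetical Windows srcds startup line; adjust -game, map, and maxplayers to your install.
srcds.exe -console -game cstrike -tickrate 100 +maxplayers 32 +map de_dust2 +fps_max 4000
```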


Thing is, the system does not have Windows Media Player installed; I remember that helping. Am I at a loss here since I removed that?
#2
http://www.gamebanana.com/tools/1529

The comments say it's good, but I've never tried it.
#3
Complete nonsense, sorry. Run with tick 66; tick 100 is worse. And run with fps equal to the tickrate, i.e. fps_max 66.66. That will give the best in-game experience.
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#4
(08-10-2011, 06:03 PM)BehaartesEtwas Wrote:  Complete nonsense, sorry. Run with tick 66; tick 100 is worse. And run with fps equal to the tickrate, i.e. fps_max 66.66. That will give the best in-game experience.

In your opinion, is there any difference between 66.66 and 66.67? I've been running at 66.67 and everything seems good.
#5
(08-11-2011, 07:58 PM)joe.powell Wrote:  In your opinion, is there any difference between 66.66 and 66.67? I've been running at 66.67 and everything seems good.

My god, people, you're talking hundredths of a millisecond. Whoever the **** thought of 33 and 66 for tick rates was a complete idiot, because then you have all these extra compensations for rounding, and extra floating-point calculations when it's not a whole number. Just use a 30 or 60 tick rate and be done with it. As long as the frame rate is at or above the tick rate it makes no difference in-game. 60.0 would be optimal, but hell, even 32 would be a better number than 33.
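For scale, the actual interval difference between the two fps_max values being argued about can be computed directly (a quick back-of-the-envelope sketch, nothing here is server code):

```python
# Frame interval, in milliseconds, at each fps_max value from the thread.
ms_6666 = 1000 / 66.66   # ~15.00150 ms
ms_6667 = 1000 / 66.67   # ~14.99925 ms

diff_ms = ms_6666 - ms_6667
print(f"interval difference: {diff_ms:.5f} ms")  # ~0.00225 ms
```

On that scale the 66.66-versus-66.67 debate really is academic.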


And I've officially concluded that on a maxed-out 32-player server, tick rate and FPS make absolutely ZERO difference in CPU usage. I even tried a tick rate of 10, ffs, and it still sat at ~20% of a core.
#6
(08-11-2011, 09:55 PM)click4dylan Wrote:  And I've officially concluded that on a maxed-out 32-player server, tick rate and FPS make absolutely ZERO difference in CPU usage. I even tried a tick rate of 10, ffs, and it still sat at ~20% of a core.

20%?

With my server full, it uses 5-10% of a core.
#7
This thread is getting derailed. Simply put: don't FPS boost. End of story.
Game Servers -- CentralFrag.com -- Use promocode "frag" for 15% off every month!
#8
(08-11-2011, 07:58 PM)joe.powell Wrote:  In your opinion, is there any difference between 66.66 and 66.67? I've been running at 66.67 and everything seems good.

I don't think anyone can feel or even measure any difference. But in theory, 66.66 is better, as 66.67 is already bigger than the tickrate (66 2/3), so it will lead (due to the aliasing effect) to a gap in the tickrate every ~90 seconds (if I did the math right). That gap is twice as large as the normal distance between two ticks, so you will not notice it unless you are actually firing at that exact moment *and* have a very tight lerp / cl_interp setting...

So: no, there is no difference, but I would run with 66.66.
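The "~90 seconds" figure can be sanity-checked by treating the gap as the beat between the frame grid and the tick grid. Under the assumptions in this sketch (tickrate exactly 66 2/3, frames at exactly fps_max, one tick processed per frame), the interval comes out nearer ~300 s than ~90 s, so treat the exact number as a rough estimate either way:

```python
# Assumed rates from the thread: tickrate 66 2/3, fps_max 66.67.
tick_rate = 200 / 3            # ticks per second
frame_rate = 66.67             # server frames per second

tick_period = 1 / tick_rate    # 0.015 s between ticks
frame_period = 1 / frame_rate  # slightly shorter than a tick

# Each tick, the frame grid gains this much on the tick grid.
drift_per_tick = tick_period - frame_period

# When the accumulated drift wraps a full tick period, one tick lands
# a whole extra frame late -- the "gap" described above.
ticks_per_gap = tick_period / drift_per_tick
gap_seconds = ticks_per_gap / tick_rate
print(f"gap roughly every {gap_seconds:.0f} s")
```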

