11-05-2010, 07:17 PM
I know there is no exact figure, but I would like to know the average CPU usage of SourceTV and HLTV. I am talking about a relay server that connects to a game server, not the integrated TV.
Bandwidth usage is easy to limit and estimate with the tv_rate cvar.
But CPU usage is different:
For SourceTV there is tv_snapshotrate, which I guess works something like the tickrate?
But how many slots can a modern core handle? Most leagues force a snapshotrate of 24 (the default is 16). A rough estimate for player slots is 30-40 per modern CPU core. With SourceTV it must be higher?!
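To make the estimate concrete, here is a small back-of-envelope sketch. It assumes Source engine rate cvars are in bytes per second per client, and it treats CPU headroom as scaling roughly linearly with snapshotrate; the function names, the tv_rate value of 20000, and the 35 slots/core baseline are just illustrative assumptions taken from the rough 30-40 figure above, not measured values.

```python
# Back-of-envelope estimates for a SourceTV relay.
# Assumptions (not measured): tv_rate caps bytes/sec per spectator,
# and CPU cost per spectator scales roughly linearly with snapshotrate.

def relay_bandwidth_mbit(spectators: int, tv_rate: int = 20000) -> float:
    """Worst-case upstream bandwidth in Mbit/s if every spectator
    saturates the tv_rate cap (bytes/sec -> Mbit/s)."""
    return spectators * tv_rate * 8 / 1_000_000

def rough_slots_per_core(snapshotrate: int,
                         base_rate: int = 24,
                         base_slots: int = 35) -> float:
    """Hypothetical linear scaling: if ~35 slots per core hold at
    snapshotrate 24, a lower snapshotrate gives proportionally more headroom."""
    return base_slots * base_rate / snapshotrate

# 100 spectators capped at tv_rate 20000 -> 16.0 Mbit/s upstream
print(relay_bandwidth_mbit(100))
# At the default snapshotrate 16 the same core might handle ~52 slots
print(rough_slots_per_core(16))
```

The linear-scaling assumption is the weak point: snapshot building has fixed per-tick costs, so real numbers could easily differ.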
I found these cvars/cmds for HLTV:
- fullupdateinterval
- maxframes
- cachesize
I guess they have an impact on CPU usage?
It would be nice if somebody could clear things up a bit.
Interactive web based config creator for CS, CSS, TF2 and DODS
Creates server and client configs in an explained dialog.
You'll also find precompiled Debian gameserver kernels for download