Theory Based Hosting [Windows Server 2008 64-bit]
#1
 
Hello there. I'd like everyone to read my theory and base their opinions on it. Please share your opinion if you have something constructive to add.

My Theory
My theory is based on the following server setup:

Dual quad-core Xeon, 2.66 GHz
1333 MHz bus
24 GB RAM
3000 GB of bandwidth

Here is my theory as a worked example:

Gameserver
20 slots - maximum
250 FPS - 33 to 66 tick

CPU cores
2.66 GHz x 8 = 21.28 GHz

Now, my current readings from a gameserver at 20 slots, 250 FPS, 33 to 66 tick put it between 5.53% and 8.83% CPU usage per core. My theory is basically this: if we have 8 cores and the CPU usage on one core is 5.53% (0.15 GHz), then the load should be distributed across the cores, making it only 0.01875 GHz per core per gameserver.

Now, if this theory is incorrect, please explain to me why. My impression is that more cores increase how many applications you can run by distributing the load evenly across them; a quick sketch of this arithmetic follows below.
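Here is a minimal Python sketch of my arithmetic; the percentages and clock speed are my own measurements and assumptions, and it simply assumes the even distribution I describe above.

cores = 8
core_ghz = 2.66
usage = 0.0553                            # 5.53% CPU usage measured on one core

ghz_per_server = usage * core_ghz         # ~0.147 GHz (rounded to 0.15 above)
per_core_share = ghz_per_server / cores   # ~0.0184 GHz per core if spread evenly
print(round(ghz_per_server, 3), round(per_core_share, 5))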

If you have any questions or need any more information, please let me know.
#2
SRCDS is not multithreaded, so you will only be able to use it on one core.
Beat that with your fiber connections...

#3
So I will have to rethink this. If SRCDS is not multi-threaded, I can still make use of all my cores by specifically assigning the SRCDS servers to each core myself, is that correct?

By the way, if you are saying SRCDS is not multi-threaded: according to my monitoring, SRCDS is using multiple threads. It also shows SRCDS running on all 4 CPUs on my test system...

If I am incorrect about the multi-core support, then I can adjust my gameservers per core and assign the SRCDS servers to those cores depending on their load; a sketch of how that could be scripted is below.
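For reference, here is a minimal sketch of that pinning in Python using the third-party psutil library (assuming psutil is installed; the PIDs are hypothetical placeholders for running srcds instances):

import psutil

srcds_pids = [1234, 1235, 1236]       # hypothetical PIDs of srcds.exe processes
for core, pid in enumerate(srcds_pids):
    p = psutil.Process(pid)
    p.cpu_affinity([core])            # pin this instance to a single core
    print(pid, "->", p.cpu_affinity())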

Tell me if I am incorrect.

- Travis
#4
I've used the same processor setup before, with 8GB RAM.

You don't need to worry about assigning cores; Windows will sort that out for you. You should easily be able to run 3 x 20-slot SRCDS instances per core, which on 8 cores would be 24 game servers in total. With all of them full you shouldn't be looking at more than 80-85% CPU usage.
Clan of Doom: www.clanofdoom.co.uk
#5
Thank you, I will be testing this system to figure out exactly what the per-game CPU usage is on the actual live server. 3-4 game servers per core should not be a problem if I keep them at standard settings: 250-500 FPS, 33-66 tick.

Thank you again.
Just an FYI: I tested on my own CPU, which is an AMD (the Intel CPU should be able to handle more), but my point is that I am not exceeding 8% CPU usage per gameserver at 20 slots full capacity, 250 FPS, 33-66 tick.

So at this rate, 4 gameservers per core at these numbers would only come to 0.8512 GHz out of 2.66 GHz, which is 32% core usage.

So as I keep crunching these numbers, I should be able to hold that many gameservers with 68% of each core's processing power to spare.

My biggest worry was that I would not be able to pay for the server and make a profit at the same time, but I now know I can.

How I found my answers
8.03% = 0.0803 x 2.66 GHz = 0.2136 GHz (~0.214 GHz) per gameserver
0.214 GHz x 4 = 0.856 GHz per core
0.856 / 2.66 = 32% core load

So with all the game servers under load I will still have roughly 14.4 GHz (68% of 21.28 GHz) left for processing. A short Python sketch of this arithmetic is below.
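The same arithmetic as a small Python sketch (these are my measured numbers, nothing SRCDS-specific):

core_ghz = 2.66
cores = 8
total_ghz = core_ghz * cores               # 21.28 GHz
per_server = 0.0803 * core_ghz             # ~0.214 GHz per gameserver
per_core_load = per_server * 4             # 4 servers per core -> ~0.856 GHz
load = per_core_load / core_ghz            # ~0.32 -> 32% core load
headroom = total_ghz * (1 - load)          # ~14.4 GHz left across the machine
print(round(per_core_load, 3), round(load * 100, 1), round(headroom, 1))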
#6
Also keep in mind that the load produced by game servers is not equally distributed in time: they use 100% CPU during frame processing but 0% in between. If you want to do medium-quality mass hosting you can ignore this effect, but for high-end servers you will want to prevent two servers from needing to process a frame at the same time on the same CPU. The probability of this increases with the number of servers per core, starting at 0 for 1 server per core. So regardless of how fast your CPU is, the optimal case will always be 1 server per core, and if you want to host many small servers it is probably better to have more cores at a lower speed. On the other hand, very fast cores are essential for hosting large servers (as SRCDS more or less doesn't use more than 1 core). A toy model of this collision probability is sketched below.
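As a toy illustration (my own simplification: assume each server is independently "mid-frame" a fixed fraction of the time), the chance that two or more servers on one core want the CPU at the same instant grows quickly with the server count:

# Toy model: each server is busy a fraction f of the time, independently.
# P(collision) = P(at least two busy) = 1 - P(none busy) - P(exactly one busy)
f = 0.10                                   # assumed busy fraction per server
for n in range(1, 6):                      # n servers sharing one core
    p_none = (1 - f) ** n
    p_one = n * f * (1 - f) ** (n - 1)
    print(n, round(1 - p_none - p_one, 4)) # 0.0 for n=1, then rising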

Also, you will want to stay away from 100% load by a certain (not too small) margin, or the servers will start lagging (again due to the unequal distribution of load in time).

And I am not sure how reliable measuring CPU load is on Windows. On Linux it doesn't work for game servers, as they switch between running and sleeping states as often as, or even more often than, the kernel samples the CPU load, so the reported load is more or less a random number. Maybe it's the same on Windows...

As you see, there are many possible problems with your theory. As we physicists always say: the experiment is the final judge. So simply try it out :-)
http://www.fpsmeter.org
http://wiki.fragaholics.de/index.php/EN:Linux_Optimization_Guide (Linux Kernel HOWTO!)
Do not ask technical questions via PM!
#7
Thank you BehaartesEtwas. The CPU usage does also change as the game servers switch between idle and running states. I have now tested 3 game servers per core on my system, at 20 slots max, 66 tick, 250 FPS, which is the maximum performance we will be selling starting out, and we have not exceeded 10% CPU usage per game server, and that is at 15/20, 20/20, and 10/20 players.

So my common-sense theory has been correct this entire time, except for the assumption that the load is distributed across all 8 cores.

Under a system with:

Intel Xeon quad-core 1.6 GHz, 1333 MHz bus
6 GB RAM

I could run 4 game servers at 20 maximum slots, 250 FPS, 33-66 tick, without peaking the core load. Even at 4 game servers I should still have 50% of the core's GHz left, and in this case it was almost 75%.

The idle-to-running transition when players join and server scripts activate shows an increase of 2.5% CPU usage for 1-2 seconds. Map changes show a 5.35% increase for 3-5 seconds. Even if all 4 game servers were changing maps at the same time, I would not feel the difference on a game server that was not. I have not yet taken mods such as Zombie Mod, GunGame, or even surf into consideration to figure out their CPU usage, but until I test I can only assume it will be a 3.5-7.5% increase in load.

But that will all play out as we continue.

- Travis
#8
What processor model of Xeon are you using? Also, are your servers completely "clean" of plugins? I see that you are going to sell servers, so you need to ensure that your gameservers will run just as well with plugins on; otherwise you will end up with unhappy customers. I would recommend not assigning the servers to specific cores if you will be running more servers than you have cores. If you leave it to the OS scheduler, it might save you from big lag spikes on your servers.

Deathmatch, for example, is one hell of a plugin; it uses quite a lot of my own 2.66 GHz Xeon, and that is with 32 players (perhaps OB just doesn't like big player counts?). Just keep that in mind.
Beat that with your fiber connections...

#9
We have recently tested the Zombie, Deathmatch, and GunGame modifications for CS:S. On our platform each of them adds 7.5% core usage per game server.

We have actually upgraded our system because we are getting it at a discount; these are our new specifications:

Intel Xeon E7540
24 x 2.00 GHz cores
32 GB DDR3 1066 RAM
4 x 300 GB SAS 15k RPM HDs
128 IPs

This is the new system we have tested the plugins on, and with GunGame/Deathmatch we can run 2 of those per core. With the increased core count we are making more profit.

Our Profit
This profit is based on standard servers without Deathmatch or GunGame modifications:

Min Profit
4 x 12 x 24 = 1152 slots; 1152 x $1.99 = $2292.48 - $335 (server cost) = $1957.48 profit

Max Profit
4 x 20 x 24 = 1920 slots; 1920 x $1.99 = $3820.80 - $335 (server cost) = $3485.80 profit

Below is the same setup for GunGame & Deathmatch servers at 2 game servers per core:

Min Profit
2 x 12 x 24 = 576 slots; 576 x $1.99 = $1146.24 - $335 (server cost) = $881.24 profit

Max Profit
2 x 20 x 24 = 960 slots; 960 x $1.99 = $1910.40 - $335 (server cost) = $1575.40 profit

So as you can see, even if we sold all our game servers as GunGame & Deathmatch servers (unlikely), the machine will be able to handle the load. A small sketch of this profit arithmetic is below.
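A minimal Python sketch of the same profit arithmetic (the price, slot counts, and $335 server cost are the figures above):

def profit(servers_per_core, slots, cores=24, price=1.99, server_cost=335.0):
    total_slots = servers_per_core * slots * cores
    return total_slots * price - server_cost

print(profit(4, 12))   # standard min    -> 1957.48
print(profit(4, 20))   # standard max    -> 3485.80
print(profit(2, 12))   # gungame/dm min  -> 881.24
print(profit(2, 20))   # gungame/dm max  -> 1575.40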

When customers order, we may actually require them to tell us if they plan to use GunGame and/or Deathmatch modifications, so that they order high-CPU priority or pay an extra slot cost, given how many resources those mods use; that could be an increase of anywhere from $20-30 per gameserver.

- Travis
#10
Here is a simple equation for those wishing to find out their maximum CPU load per core per game server.

First Step
Find out how many servers you can run on your maximum settings per core without exceeding 40% Core Usage.

Take the results from those servers and find the average. Example:

4 game servers
250 FPS, 33 tick
7% (0.07) core usage per game server x 4 = 28% core usage

Now that you have the number of servers and know the core usage per game server, do the following to figure out the total load.

Table
GP = per-game core usage (as a fraction of one core)
CS = Core Speed (GHz)
GPC = Games per Core
C = Cores
TS = Total CPU Speed (GHz)
TL = Total Load Percent

(GP x CS) x GPC x C / TS = TL

My Example

(0.07 x 2.00) x 4 x 24 / 48 = 0.28 = 28%

To find the GHz load:

(0.07 x 2.00) x 4 x 24 = 13.44 GHz

This is only as accurate as your inputs. Remember, you will have to measure your own games per core, core speed, and processing power per game server to figure out your total load, and even then this is only a theoretical estimate: it does not capture the differences between each game server's idle and running states, nor does it account for the modifications on each game server.

This test was done using Counter-Strike: Source at 250 FPS, 20 slots, 33 tick, with no modifications. We strongly suggest no one exceed 40% core usage from unmodified game servers; that leaves 60% capacity for any modifications the user decides to install. The test was also done with each game server at 20/20 players. A Python version of the equation is sketched below.
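Here is the same equation as a small Python helper, using my example numbers:

def total_load(gp, cs, gpc, c, ts):
    # TL = (GP x CS) x GPC x C / TS; gp is a fraction, speeds in GHz
    used_ghz = (gp * cs) * gpc * c
    return used_ghz, used_ghz / ts

used, tl = total_load(gp=0.07, cs=2.00, gpc=4, c=24, ts=48.0)
print(used, tl)        # 13.44 GHz used, 0.28 -> 28% total load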

- Travis
#11
Do people even still bother with 33 tick servers?

Would have thought 66 as a minimum.
#12
66 is the new 100 tick, I heard, but if you don't have a big budget, 33 tick is always the way to go. The difference between us and them is that at 33 tick we offer 500 FPS standard, and at 66-100 tick, 250 FPS standard. We can go up to 1000 FPS if needed, but again, only if you have the budget.

The testing of our system is going great; this is now day 2 of our testing and we are still under 10% total CPU load. We have begun testing 20-slot, 1000 FPS, 66 tick servers. Next will be modified games running at maximum slots for 24 hours.
#13
(01-11-2011, 07:19 AM)Zero1028 Wrote: 66 is the new 100 tick, I heard, but if you don't have a big budget, 33 tick is always the way to go. [...]

Just to inform you, 250/500/1000 FPS isn't important anymore; 67-70 FPS will be as good as 1000 FPS.
Beat that with your fiber connections...

#14
Oh really? If that is true, that is great news. We have spiked over 30% per game server at 750 FPS; I don't think I want too many of those running at one time lol.

- Travis
#15
(01-11-2011, 07:24 PM)Zero1028 Wrote: Oh really? If that is true, that is great news. [...]

There has been a heavy discussion about this on the forums, and we concluded that FPS doesn't matter anymore (but it should of course be above or equal to the desired tickrate).
Beat that with your fiber connections...


