Servers -- A Clusterfuck of Sadness

I have always wondered why we choose to have multiple servers for our games. This thought mostly applies to MMOs, since they have huge populations and usually spread them across multiple “named” servers for players to choose from.


Why not a single server, now that we have the ability to create such things?

What is the benefit of having multiple servers?

Multiple servers do one thing: sever. They sever a population at a single fork in the road. No longer does the game retain “I play WoW”; it becomes “I play WoW on X server”.

Look at the trend and you see most major studios moving away from multiple servers and toward a single entry point for all of their players.

GUARDIAN, think of the PvP! With a single server you would be forced into instancing!

That is the MOST common complaint I hear. Why? EVE manages 500,000 players (20-50k concurrently active) on a single server without instancing. Why can a game that is more than 10 years old accomplish this, when a game that comes out today, with the understanding of ALL the tech that came before it, cannot?

My theory: money. It is a risky bet to say that people will want to play all on one server, and few companies are willing to take that plunge. Which is odd, because Blizzard, NCsoft, Bethesda and others are ALL moving toward this single-server structure.

EVE is sharded. Get more than 1,000 players in one area and it's a shit show.

When thinking about this issue, it's important to frame it properly. “Server” is a misnomer, since every named realm is actually many servers running the operations for that pool of the population.

The primary benefit is that it takes the load-balancing onus off the technology. Instead, each realm (or group of servers) has a soft and a hard cap, which usually gets reflected in the UI (e.g. Medium, Heavy, Full). At some point creating a new character is simply blocked.
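As a toy illustration of that soft/hard cap logic (the thresholds and function names here are invented for the example; real games tune these per realm):

```python
# Hypothetical realm-capacity check. SOFT_CAP/HARD_CAP values are made up.
SOFT_CAP = 8000   # above this, the realm list shows "Full"
HARD_CAP = 10000  # above this, new character creation is blocked outright

def realm_status(population: int) -> str:
    """Map current population to the label a realm-list UI might show."""
    if population < SOFT_CAP * 0.5:
        return "Medium"
    if population < SOFT_CAP:
        return "Heavy"
    return "Full"

def can_create_character(population: int) -> bool:
    """Past the hard cap, creating a new character is simply blocked."""
    return population < HARD_CAP
```

The point is that the "balancing" happens in the players' heads: they see "Full" and pick a different realm, so the tech never has to redistribute anyone.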

The other option is much more complex than you might imagine. Having one entry point and balancing the load of areas within it technologically introduces the obvious load issues, but also quite a lot of sync issues. You now need extremely quick asynchronous database systems.

The most resource-intensive issue affecting performance is not the number of people running around the world or signed in; it's accessing a database when they do something in the world. This is even true for forums. Going back to your comment about “How can it cost so much to run a forum?!” Well, it's because everyone is accessing the database and loading things in real time, asynchronously. When I update a post, you see it immediately.

Similarly, even when you are balancing the load of people running around a particular world (say 10 people per Planet X), the entire collective still needs to communicate with a database, and everyone across all shards must stay in sync (for example, they may mail each other, chat with each other, earn an achievement, etc.).
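Here's a minimal sketch of why that sync matters. The class and method names (`SharedDatabase`, `send_mail`) are invented for illustration; the idea is just that every shard funnels writes through one shared store, so concurrent writes have to be coordinated:

```python
import threading

class SharedDatabase:
    """Toy model of the one database every shard must stay in sync with."""
    def __init__(self):
        self._lock = threading.Lock()
        self.mailboxes = {}  # player name -> list of messages

    def send_mail(self, recipient, message):
        # All shards write through the same lock, so a player on any shard
        # sees a consistent mailbox no matter where the mail came from.
        with self._lock:
            self.mailboxes.setdefault(recipient, []).append(message)

db = SharedDatabase()

# Two "shards" mail the same player at the same time.
t1 = threading.Thread(target=db.send_mail, args=("alice", "hi from Planet X"))
t2 = threading.Thread(target=db.send_mail, args=("alice", "hi from Planet Y"))
t1.start(); t2.start()
t1.join(); t2.join()
```

A single lock is obviously too slow at MMO scale, which is exactly why the post says you need "extremely quick asynchronous database systems" rather than this naive approach.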

Breaking that up into many smaller pools of population makes it easier, just like it would be easier to split a busy forum into forum-1, forum-2, etc.


So loading data costs money, got it.

How does a game like ESO pull in enough money to operate servers on a system like theirs? Is that why the game is in decline, because there are so many requests hitting their database at a time? How much does a database request cost?

It's not just loading data; it's about concurrent reading and writing.

Wow, so you are saying there's a fee even if I just read data? Is this normal for all servers, or only gaming servers?

Stop thinking in terms of fees. Monetary cost is not the issue; scale is. Things that are “expensive” are things that are energy intensive, whatever the metric happens to be.

@therubymug could probably explain this really well in terms of why databases are “expensive” (in terms of scale and resource). Maybe he’ll chime in if I ping him like I just did :stuck_out_tongue:

Okay, so you are saying that a scalable architecture is expensive not in monetary cost but in system resources?

So, essentially, it is cheaper to have 100 actual servers at varying capacity than fewer servers running at full capacity?


Servers have three main costs: processing power (CPU & memory mainly), storage, and network bandwidth.

Most web-app hosting companies like DigitalOcean or Amazon's AWS charge you for processing power first. So if your app is consuming a lot of memory and CPU time, you either get a more powerful server or set up a cluster, both of which cost more.

The key here is that you're paying for the processing power on these servers whether you use them to their full potential or not. So it's not necessarily cheaper to add more servers. What most companies do is adjust the number of servers based on the traffic they're getting.
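That "adjust to traffic" idea can be sketched in a few lines. The capacity number and headroom factor below are invented for illustration; real autoscalers use measured metrics like CPU utilization:

```python
import math

REQUESTS_PER_SERVER = 1000  # assumed capacity of one server

def servers_needed(current_traffic: int, headroom: float = 0.2) -> int:
    """Servers required to handle current traffic plus a safety margin.

    You pay per server whether it's busy or idle, so the goal is to run
    just enough servers to cover demand, then scale up or down as it moves.
    """
    target = current_traffic * (1 + headroom)
    return max(1, math.ceil(target / REQUESTS_PER_SERVER))
```

So at 1,000 requests you'd run 2 servers (to keep headroom), and at 5,000 you'd scale out to 6, then shrink back down overnight when traffic drops.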

Netflix is a perfect example of this. Check out these two articles:

Pretty cool stuff!