Hi all,
As most of you may know already, I've been working on a space-based MMO game for some time now, with a custom networking solution built on top of Photon.
At the moment, the architecture of the networking solution is fairly simple. There are two Photon applications, one called MasterServer, the other called SubServer.
The master server is very simple. It stores information about the sub servers, such as each one's load (based on CPU, memory and bandwidth) and which solar systems it is assigned to handle. It essentially acts as a router, directing peers to the correct sub server.
Sub servers are designed to handle one or more solar systems. The basic idea is that you connect to the master server and tell it which solar system you want to go to. The master server checks whether a sub server is already assigned to that solar system and, if so, sends you there. Otherwise it assigns the solar system to the sub server with the least load before sending you there. A rough sketch of that routing decision is below.
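To make the idea concrete, here is a minimal sketch of how I picture the master server's routing decision. This is plain Python pseudocode rather than actual Photon code, and the names (SubServerInfo, MasterServer.route, the load field) are made up for illustration, not anything from the Photon SDK:

    class SubServerInfo:
        def __init__(self, address):
            self.address = address
            self.load = 0.0               # combined CPU/memory/bandwidth score reported by the sub server
            self.solar_systems = set()    # solar systems this sub server currently handles

    class MasterServer:
        def __init__(self):
            self.sub_servers = []         # populated as sub servers register themselves

        def route(self, solar_system):
            """Return the address a peer should connect to for the given solar system."""
            # If a sub server already handles this system, send the peer there.
            for sub in self.sub_servers:
                if solar_system in sub.solar_systems:
                    return sub.address
            # Otherwise assign the system to the least-loaded sub server first.
            least_loaded = min(self.sub_servers, key=lambda s: s.load)
            least_loaded.solar_systems.add(solar_system)
            return least_loaded.address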
Although my testing so far has shown that this works quite well, and I expect a reasonably well-specced server running the SubServer application to handle somewhere around 200-300 peers, I worry about what happens if I ever need more than that... Wishful thinking, I know, but I would rather tackle this now than later.
So, my question is: instead of building the networking solution around the assumption that each physical server runs the SubServer application and acts as a sub server, would I be better off building it to run on a cloud-based network? Cloud computing is fairly unknown territory for me.
If I were to build a cloud-based solution, would I ditch the master/sub server model and instead build a single Photon application that handles all solar systems and peers, relying on the cloud to scale with the number of solar systems and peers using the application?
Many thanks in advance for any advice.