According to analyst firm Newzoo, the games market grew by 9.6% in 2019 to more than $152 billion, which is more than the music and film industries combined. Today, more than 2.5 billion people worldwide play video games on their various devices, and nearly half a billion watch eSports streams.
To stay competitive in a rapidly growing market and keep ahead of larger players, game studios need quality development tools as well as reliable, low-cost infrastructure.
In this article, we'll look at the main scenarios for using cloud technology to meet the demands of the game industry.
Traditional IT architecture no longer meets the needs of advanced game companies. During the production and testing phase, large amounts of resources are constantly required.
Another issue is the pace of change: every day, dozens of virtual machines are deployed and removed, and environment settings are reconfigured. For example, back in 2010, Wargaming (one of the world's largest developers of multiplayer games and the publisher of World of Tanks, World of Warships, and Master of Orion) noticed that by the time a full server-procurement cycle was complete, competitors often had already released new features and products. Work also became more difficult during periods of peak infrastructure load, such as holidays or promotions.
Cloud technology became a way out of that situation. Today, Wargaming uses a G-Core Labs solution that combines a private and a public cloud. As a result, according to Wargaming, the time to prepare new releases has shortened severalfold.
Currently, it takes mere minutes for Wargaming to deploy server architecture. What’s more, there is no need to go through a complex approval process as all actions are performed via the self-service portal. In just a few clicks, one can create virtual machines with the required operating systems and settings. After the process is finished, they can be dismantled just as quickly. The customer can estimate the development and testing costs for each function, as well as the efficiency of business processes, using built-in reports.
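To give a sense of what such self-service provisioning looks like under the hood, here is a minimal sketch of building a VM-creation request. The endpoint-style payload, field names, and the `auto_delete_after_h` option are all hypothetical illustrations, not G-Core Labs' actual API; a real call would also carry authentication.

```python
# Sketch of a self-service VM provisioning request. All field names
# here are hypothetical, not any provider's real API schema.
import json

def make_vm_request(name, os_image, cpu_cores, ram_gb, lifetime_hours=None):
    """Build the JSON body for a VM-creation call."""
    body = {
        "name": name,
        "image": os_image,
        "flavor": {"vcpus": cpu_cores, "ram_gb": ram_gb},
    }
    if lifetime_hours is not None:
        # Short-lived test machines can be scheduled for automatic teardown,
        # matching the "dismantled just as quickly" workflow described above.
        body["auto_delete_after_h"] = lifetime_hours
    return json.dumps(body)

payload = make_vm_request("wot-test-01", "ubuntu-22.04",
                          cpu_cores=4, ram_gb=16, lifetime_hours=8)
print(payload)
```

Tagging every machine with its purpose and lifetime is also what makes per-function cost reports, like the ones mentioned above, possible.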
Hundreds of designers working in game companies draw characters, create textures and locations, model light and smoke, and calculate gravity and complex movements. To allocate a high-performance graphics station for each of them is a rather expensive approach. Besides, the workload is usually distributed unevenly among employees, which reduces the efficiency of these resources.
The cloud provides an infrastructure with unlimited capacity for game design tasks.
Any Wargaming employee can request computing resources from the G-Core Labs cloud, including for graphics rendering. Meanwhile, designers working on a game project in the cloud do not face issues such as performance drops and long response times.
The publisher’s job is to provide near-instantaneous computation and prevent delays in content delivery and player actions. A set of services runs on the servers, responsible for supporting user accounts, managing game element databases, and, most importantly, computing the virtual world itself. Situations where hundreds of thousands of users connect to the game simultaneously are not uncommon. In such cases, the number of servers needed just for game-mechanics calculations runs into the hundreds or even thousands.
Cloud services used by leading gaming companies take this load on themselves. For example, Caliber, a game created by Wargaming in 2019, has recently started migrating to the G-Core Labs cloud where several project virtual machines have already been deployed.
The load on a game company's servers is distributed unevenly. User activity can be spurred by seasonal factors, holidays, or new releases. Marketing is a factor as well, since promotions and sales dramatically increase the influx of new players. As a result, the load on the game company can grow severalfold in a matter of hours or minutes.
Thus, in addition to good performance, the cloud infrastructure of game developers must provide a high level of resilience. Above all, the availability of cloud resources must be ensured: large providers let their customers deploy any amount of capacity in just a few minutes. The reverse is no less important: once the rush subsides, idle resources should be just as quick to switch off.
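The scale-out/scale-in rule described above can be sketched as a small decision function. The thresholds, doubling/halving steps, and instance bounds below are illustrative assumptions, not any provider's defaults.

```python
# Sketch of a reactive scaling rule: add capacity when average load is
# high, release idle instances when the rush subsides. All thresholds
# here are illustrative.

def desired_instances(current, avg_cpu_pct, scale_up_at=75, scale_down_at=30,
                      min_instances=2, max_instances=100):
    if avg_cpu_pct >= scale_up_at:
        target = current * 2           # double capacity during a spike
    elif avg_cpu_pct <= scale_down_at:
        target = max(current // 2, 1)  # shed idle machines after the peak
    else:
        target = current               # load is normal; hold steady
    # Stay within the fleet's configured bounds.
    return max(min_instances, min(target, max_instances))

print(desired_instances(10, 90))  # spike: 10 -> 20 instances
print(desired_instances(10, 10))  # quiet: 10 -> 5 instances
```

In practice such logic runs inside the provider's autoscaler against rolling load averages, so that a holiday spike and the lull after it are both handled without manual intervention.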
For example, when The Walking Dead: No Man’s Land was released, more than a million users downloaded it in the first weekend alone, and playtime during those days totaled about 31 million minutes. But the developer’s cloud infrastructure turned out to be unprepared for such a load.
In addition to the “good stress” of a user influx, emergencies such as equipment failure are inevitable, and classic IT infrastructure requires serious investment to ensure reliability.
With the cloud, the provider takes on a significant portion of these issues. The game company doesn’t need to manage hardware; it simply selects the required service level (SLA) parameters.
“The G-Core Labs public cloud supports taking snapshots of virtual machines. Customers can connect the DRaaS service with the required system-recovery or data-recovery time parameters.”
It’s not only the idea and the initial implementation that impact the success of any video game. Game companies also look at user behavior, their reactions, and their feedback. Major brands invest millions of dollars in platforms that allow for the analysis of data and prediction of player behavior.
However, even small development companies can compete with industry giants by using clouds for these purposes. Provider services let customers apply big data analytics and machine learning algorithms. For example, in Titanfall, the cloud platform was responsible for training and improving Titan behavior by analyzing data collected from around the world.
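As a toy illustration of predicting player behavior from telemetry, the sketch below classifies a player as likely to churn by comparing their session statistics to the average profiles of retained and departed players. The feature choice (sessions per week, average session minutes) and the nearest-centroid method are illustrative stand-ins for the far richer ML pipelines the text describes.

```python
# Toy nearest-centroid churn classifier over per-player session stats.
# Features: (sessions per week, avg session minutes) -- illustrative only.
import math

def centroid(rows):
    """Component-wise mean of a list of equal-length feature tuples."""
    n = len(rows)
    return tuple(sum(r[i] for r in rows) / n for i in range(len(rows[0])))

def predict_churn(player, retained, churned):
    """True if the player's stats sit closer to the churned-player profile."""
    c_ret, c_chn = centroid(retained), centroid(churned)
    return math.dist(player, c_chn) < math.dist(player, c_ret)

retained = [(12, 45), (9, 60), (14, 30)]  # active players' stats
churned  = [(1, 5), (0, 0), (2, 10)]      # stats of players who left
print(predict_churn((11, 50), retained, churned))  # engaged player: False
print(predict_churn((1, 3), retained, churned))    # inactive player: True
```

A real platform would add many more behavioral features and a properly trained model, but the principle is the same: historical player data drives the prediction.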
G-Core Labs also provides its customers with the ability to use an AI platform that supports quickly creating, deploying, and improving machine learning models.
The use of public clouds is changing the traditional approach to information security. On the one hand, such services require specific protection measures. On the other hand, clouds themselves can become an additional means to mitigate some risks.
One of the main threats to distributed networks is attacks that render their components unavailable. DDoS attacks, including those at the application layer, can paralyze an external service for days.
The G-Core Labs cloud has its own protection service that blocks malicious sessions based on in-depth analysis of statistical, signature, and behavioral traffic parameters. As a result, almost all L3, L4, and L7 attack types are detected and stopped, including attacks carried out with single requests.
At the same time, a public cloud can become an additional security segment for the networks of game companies. For example, when organizing the work of external remote developers, it can be difficult to approve their access to the company’s internal resources. In this case, the problem is solved by creating a prototype with test data in the cloud. This environment is completely isolated and not connected to the company’s other IT resources.
“With the G-Core Labs cloud, creating an isolated environment for the company’s needs has become much easier and faster than obtaining accounts for traditional infrastructure”