The mobile gaming market is one of the fastest-growing segments of the industry, and for the first time, market watchers forecast that it will generate more than half of global gaming revenue in 2018. In the first half of 2018 alone, 19.5 billion mobile games were downloaded across Google Play and the Apple App Store. As the industry expands, it is becoming an even greater target for hackers, particularly when a company is careless about how vulnerabilities are disclosed.
Most recently, Epic released the Android version of Fortnite independently rather than through the Google Play Store, thereby avoiding giving Google a cut of in-app revenue. That advantage, however, came at the cost of putting Android users at greater security risk.
Google subsequently discovered a vulnerability in the Fortnite installer app that could allow hackers to replace the legitimate software with a fake, malicious version mid-download. Epic released a patch and asked Google to wait 90 days before notifying users, to give players enough time to install the update.
Google, however, followed its own disclosure guidelines and informed users right away. Because the vulnerability was exposed before the Fortnite developers had rolled the fix out to everyone, hackers were handed a blueprint they could exploit to breach personal data or install further malicious apps.
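Developers who distribute installers outside an official store can reduce the blast radius of this class of flaw by verifying what they download before handing it to the operating system. The following is a minimal Kotlin sketch, not Epic's actual code: it assumes a hypothetical downloader that already knows the expected SHA-256 of the APK it fetched (for example, delivered over a pinned TLS connection) and refuses to install anything that no longer matches.

```kotlin
import java.io.File
import java.security.MessageDigest

// Hypothetical sketch: before asking the system to install a downloaded APK,
// recompute its SHA-256 and compare it to the hash the distribution server
// advertised (assumed to arrive over a pinned TLS connection). If another
// app with storage access swapped the file mid-download, the hashes differ
// and the install is refused. Function names are illustrative only.

fun sha256Of(file: File): String {
    val digest = MessageDigest.getInstance("SHA-256")
    file.inputStream().use { input ->
        val buffer = ByteArray(8192)
        while (true) {
            val read = input.read(buffer)
            if (read == -1) break
            digest.update(buffer, 0, read)
        }
    }
    return digest.digest().joinToString("") { "%02x".format(it) }
}

fun safeToInstall(downloadedApk: File, expectedSha256: String): Boolean =
    sha256Of(downloadedApk).equals(expectedSha256, ignoreCase = true)
```

The important property is that the check happens immediately before installation, shrinking the window in which a tampered file can slip through; verifying the APK's signing certificate instead of a raw hash would be an equally reasonable variant.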
According to statistics from NowSecure, the games industry is one of the worst offenders in mobile app security rankings. To compound the issue, gaming cultures tend to be fertile ground for hackers: oftentimes players see no hard boundary between hacking a game for an advantage and intended gameplay. And as the industry has shifted from hardware-based video games to mobile apps with micropayments, every game now carries personal account information, transactions and marketplaces, giving malicious hackers far more financial incentive to prey on users.
Another issue is that game companies can be more concerned with digital rights management than with fixing the core vulnerabilities in individual games. Often this is because a game was created by programmers in their living room and later bought by a larger gaming company; in those cases, distributed DevSecOps isn't always an option. Game companies need to figure out how to rein in the cowboy programmers and check for a clean bill of health before pushing out their apps.
Video game companies should concentrate on making the best risk decisions possible, which means:
Base decisions on the best available information. This means conducting thorough threat modeling on game design and features; making continuous efforts to discover what risks exist in applications and infrastructure; and using engagement and performance analytics to gain visibility into how and when they are being attacked.
Assign knowledgeable people to take in the signals collected, analyze what is working securely and what is not, and partner with developers to create a process that reduces risk over time, not just for one release or in the short term.
Pay attention to security from the beginning of the game development process. If your organization does not have a specific methodology for developing secure applications, then you are developing insecure applications, full stop. A DevSecOps approach integrates security throughout the entire development process: designers, architects and developers should be required to build security into the overall design of a game and into its individual functional components, and to build specific abuse cases into their testing program. This saves companies from panicking as they near a release date and try to bolt on security late in the development process, when it is less effective and more time-consuming, or worse, after the game is already in users' hands, when it is too late.
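To make "abuse cases" concrete: one lightweight way to build them into a testing program is to write them as ordinary unit tests against server-side validation logic. The Kotlin/JUnit sketch below is purely illustrative (the catalog, prices and function names are hypothetical, not any real game's API); it treats client-side price tampering and negative quantities as first-class test cases alongside the happy path.

```kotlin
import org.junit.jupiter.api.Assertions.assertFalse
import org.junit.jupiter.api.Assertions.assertTrue
import org.junit.jupiter.api.Test

// Hypothetical server-side check: the client reports a purchase, the server
// recomputes the total from its own catalog and rejects anything that
// doesn't match. Item names and prices are invented for the example.
data class PurchaseRequest(val itemId: String, val quantity: Int, val claimedTotalCents: Long)

object Catalog {
    private val prices = mapOf("skin_rare" to 999L, "coins_1000" to 899L)
    fun priceOf(itemId: String): Long? = prices[itemId]
}

fun isValidPurchase(req: PurchaseRequest): Boolean {
    if (req.quantity <= 0) return false
    val unitPrice = Catalog.priceOf(req.itemId) ?: return false
    return unitPrice * req.quantity == req.claimedTotalCents
}

class PurchaseAbuseCaseTest {
    @Test
    fun `rejects client-side price tampering`() {
        // Abuse case: a modified client claims a rare skin costs one cent.
        assertFalse(isValidPurchase(PurchaseRequest("skin_rare", 1, 1L)))
    }

    @Test
    fun `rejects negative quantities used to manufacture refunds`() {
        assertFalse(isValidPurchase(PurchaseRequest("coins_1000", -5, -4495L)))
    }

    @Test
    fun `accepts a legitimate purchase`() {
        assertTrue(isValidPurchase(PurchaseRequest("skin_rare", 2, 1998L)))
    }
}
```

Written this way, the abuse cases run in the same CI pipeline as every other test, so a regression that reintroduces trust in client-supplied values fails the build instead of shipping.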
First, exercise responsible disclosure of vulnerabilities.
The accepted wait time for responsible disclosure is debated, but generally falls between 90 and 180 days. If a vendor fails to respond within even the more generous timeframe, there is some public opinion that the finder has a social responsibility to alert the public to the dangers of using the product, application, software or service. Finders often report to CERT groups or to the vendor directly, especially when there is a bug bounty program. To minimize risk to users, vendors should immediately begin work on the necessary patches and notify the public once a fix is ready.
Second, pushing live updates is preferable.
Epic Games' failure to have a process for pushing updates and informing users within the app is not necessarily Google's fault. Many mobile apps distributed outside of official app stores have no mechanism for pushing live updates and instead rely on a post-release process for dealing with bugs. This is, of course, subpar, as it leaves users exposed for longer. Where possible, push live updates to make it easier on the user and to ensure patches are actually installed. When updates cannot be pushed, be prepared to make hard choices about which versions of the client application you will no longer support or allow to connect to your backend, in order to apply pressure on users who have not updated.
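A common way to apply that pressure is a version gate at the backend: the server compares the build number the client reports against the minimum patched version and refuses anything older. The Kotlin sketch below is a hypothetical illustration of such a gate; the version scheme and cutoff are invented for the example.

```kotlin
// Hypothetical backend-side gate: refuse connections from client builds older
// than the minimum patched version, so users who have not updated cannot keep
// playing on a vulnerable build. Version numbers here are illustrative.
data class ClientVersion(val major: Int, val minor: Int, val patch: Int) : Comparable<ClientVersion> {
    override fun compareTo(other: ClientVersion): Int =
        compareValuesBy(this, other, { it.major }, { it.minor }, { it.patch })

    companion object {
        fun parse(raw: String): ClientVersion? {
            val parts = raw.split(".").mapNotNull { it.toIntOrNull() }
            return if (parts.size == 3) ClientVersion(parts[0], parts[1], parts[2]) else null
        }
    }
}

val MINIMUM_SUPPORTED = ClientVersion(4, 2, 1) // first build containing the patch

fun shouldAcceptConnection(reportedVersion: String): Boolean {
    // Unknown or malformed version strings are refused outright.
    val version = ClientVersion.parse(reportedVersion) ?: return false
    return version >= MINIMUM_SUPPORTED
}

fun main() {
    println(shouldAcceptConnection("4.2.0")) // false: still on the vulnerable build
    println(shouldAcceptConnection("4.3.0")) // true: patched
}
```

In practice the reported version should be tied to something the server can corroborate, since a tampered client can lie about its build number; the gate is pressure on legitimate users to update, not a defense against a determined attacker.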
Game industry leaders and game developers have an opportunity and responsibility to create more secure games that protect users and continue to advance the industry. Fortunately, new approaches, techniques, and tools can help build security into games and prevent hackers from doing harm.
Zach Jones is a former golf professional who is now part of WhiteHat Security's Threat Research Center.