On non-technical video-games cheat mitigations

Fri 12 January 2024

Cheats are as old as video games, and will be there as long. There are a couple of high-profile players in the anti-cheat market today: BattlEye, Valve's VAC, PunkBuster, Epic's EAC, Blizzard's Warden, Riot's Vanguard, Activision's Ricochet, ... as well as in-house ones. To try to keep up in the race, both sides are resorting to more and more invasive and privacy-hostile technical measures: streaming virtualised shellcodes, hardware fingerprinting and locking, stack-walking, bootkit-like kernel drivers, TPM/secure boot/HVCI/IOMMU/VBS/... shenanigans, hypervisor detection/usage, exfiltration of suspicious materials, external DMA hardware, or other more exotic things. Yet anti-cheats are still routinely bypassed; less in a public manner, granted, but private and closed-community cheats are still flourishing, since it's a losing game by nature. And since games and anti-cheats are software, they're of course riddled with hilarious bugs leading to stupid bypasses. But this isn't what this blogpost is about.

Nowadays, cheats are considered part of a larger problem: abuses and toxicity. Cheats aren't (only) hunted down because they're morally questionable, but because they disturb the way the game is meant to be enjoyed. Toxic and abusive behaviours lead to the very same result: a game that isn't fun to play because of cheating/abuse/toxicity issues will see its player numbers decrease, have poor reviews, ... and won't make money. I'm sure there is a parallel to be made about the current state of our society, but I digress. For this article, we'll consider cheating and abuse/toxicity as a single issue, under the term abuse.

Now, because abuse isn't a purely technical issue, but also a social one, it can't be solved by technical solutions only, so let's have a look at what non-technical mitigations game developers are coming up with to curb this issue.

The most obvious mitigation is to make cheating expensive, money-wise. Having to pay 60EUR for a game is a steep investment, especially if one has to buy it again every time they get banned. This of course doesn't apply to free-to-play games, but can be emulated by having a cosmetics ecosystem, either to pay for, or to grind. The other expensive thing when playing video games is the hardware, and bans can be tied to it.

Global measures

The big mitigation at this level is reputation systems. They're based on the people who know best how a fun and fair game should go: players. After a match, they're encouraged to cast votes on how fair it was, at the match level, but also directly at the player level: "Bob was really looking out for others", "Bob was a team player", and so on. For negative behaviour, reports don't have to wait until the end of the match: players can report cheating, being offensive in the text/voice chat, griefing, queue dodging, smurfing, ... Of course, slanderous reports are penalised. Peer pressure is a good lever too, by taking action not only against cheaters, but also against people benefiting from the cheat, like regular teammates.
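To make this a bit more concrete, here is a minimal sketch of what folding such votes and reports into a per-player reputation score could look like. The weights, decay factor and names are entirely made up for illustration; real games presumably tune these against telemetry:

```python
from dataclasses import dataclass

# Hypothetical weights, invented for this sketch.
COMMEND_WEIGHT = 1.0        # "Bob was a team player"
REPORT_WEIGHT = -2.0        # reports upheld after review
FALSE_REPORT_PENALTY = 5.0  # slanderous reports cost the reporter
DECAY = 0.95                # old behaviour slowly stops counting

@dataclass
class Player:
    name: str
    reputation: float = 0.0

def end_of_match_update(player: Player, commends: int, upheld_reports: int) -> None:
    """Fold post-match votes into a player's reputation score."""
    player.reputation = (
        player.reputation * DECAY
        + commends * COMMEND_WEIGHT
        + upheld_reports * REPORT_WEIGHT
    )

def penalise_false_report(reporter: Player) -> None:
    """Reports judged slanderous are penalised too."""
    reporter.reputation -= FALSE_REPORT_PENALTY
```

The decay means old behaviour slowly stops counting, and the penalty for slanderous reports makes weaponising the report button costly.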
Bug bounty programs are now commonplace, so it's only logical that there are now some rewarding anti-cheat bypasses/exploits. The rewards are a bit cheap for now, but will likely rise as the programs mature. The positive effects are multiple:

1. It increases the incentives to report issues to get them fixed: a player finding a glitch/exploit can now get some cash for the discovery.
2. As more abuse vectors are killed, the reward prices will rise, and it might become more profitable to report bugs than to sell them to cheat providers. This isn't unheard of, with Google's kernelCTF paying two times more than Zerodium.
3. If the bug bounty program is correctly managed, the probability of getting a given amount of money for reporting an issue will be higher than using it in a cheat for an unknown period of time until it gets fixed.
4. It will likely increase the amount of people looking for issues and willing to report them.

Community managers can also regularly ~~spread FUD~~ post updates about ban waves, anti-cheat measures, reports, ... to make it clear that abusive behaviours are something being taken care of, and a dangerous gamble for players to take part in. I think I have seen some people spending time proving that some cheaters streaming live were in fact replaying pre-recorded footage from an earlier version of the game, because some of the game's details had been updated in the meantime.

Accounts-level measures

Some game stores, like Steam, have an account-level "cheater" mark, meaning that if someone gets banned from a game for cheating, other games can know about it. But more importantly, achievements and cosmetics are also tied to an account, and as mentioned previously, those are non-zero time and/or money investments. Getting banned means losing them. This of course only deters opportunistic cheaters, as people can simply create other accounts to cheat, but this can be made harder via purely technical means.

Most competitive online games have ranked and casual game modes, with the former only accessible after having spent a certain amount of time in the latter. Meaning that one has to do it again every time they get banned, or pay someone to do it. Some studios are even making players go through more hoops to be able to play, like requiring MFA, or playing a couple of matches against bots branded as a tutorial, before being able to play with other people. There is of course a fine balance to keep, to annoy abusers but not legitimate players.

Player-level measures

The goal of non-technical measures isn't to make it impossible to be abusive, but to make it not worth it. Moreover, issuing instahwpermabans to edgelords seems a tad heavy-handed, so having a large panel of measures against abusers makes sense: one might want to allow people to rectify their behaviour, to isolate them to cool down, and so on. It might include textual warnings, temporary bans, kicks from the current game, chat/voice mutes, losing access to ranked play, reducing the amount of earned experience points, ...

Players are abusive for various reasons, but I'd argue that most do it because it's fun. Ruining the fun for them is thus a good way to curb such behaviours. A simple way to do this is to make them play together, by grouping players by reputation, or by having servers with technical anti-cheat measures explicitly disabled. But there are even more creative measures, like disabling their parachute, reducing their damage output to ridiculous levels, taking away their weapons, making other legitimate players invisible to them, randomly dropping some of their inputs (see the sketch below), inducing hallucinations, ... and while this costs a bit more engineering time than simply grouping them together, it has a couple of high-value returns on investment:

- it allows game developers to spend more time collecting data on how cheats are working on a technical level,
- reducing the impact cheaters have on a game makes it possible to significantly defer banning them without impacting other players too much, making it harder for cheat makers to pinpoint how and why a cheat was detected,
- it's absolutely hilarious.
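As a toy illustration of the input-dropping idea, here is a minimal server-side sketch; the flag, the drop rate, and the callback are all hypothetical, not taken from any actual anti-cheat:

```python
import random
from typing import Callable

# Hypothetical knob: fraction of inputs silently ignored for flagged players.
INPUT_DROP_RATE = 0.15

def handle_input(flagged_cheater: bool,
                 player_input: dict,
                 apply_input: Callable[[dict], None]) -> None:
    """Apply a player's input server-side.

    Legitimate players are untouched; players flagged as cheaters randomly
    lose some of their inputs, degrading the cheat's value while keeping
    the detection itself hidden from cheat makers.
    """
    if flagged_cheater and random.random() < INPUT_DROP_RATE:
        return  # silently swallow the input
    apply_input(player_input)
```

Keeping the sabotage probabilistic is the point: a cheat that merely feels flaky is much harder to debug than one that stops working outright.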
Examples

Rainbow Six Siege

* It uses BattlEye, and in late 2022/early 2023 banned around 5000 accounts per month, which is a lot, but also shows that it doesn't deter cheaters.
* The game costs $8, but if you want to have access to all the operators, it's $70. One can also unlock operators by playing, which takes several hundreds of hours.
* To play ranked, one needs to reach level 50, which takes around 50h, give or take.
* The game has a rich ecosystem of cosmetics that can be purchased for steep prices, or painstakingly earned by playing, and that would be lost in case of an account ban.
* Friendly fire will result in the damage being applied to the shooter, should it be reported as voluntary by the player at the receiving end.
* It's developing a pretty involved reputation system, where people with a "positive" behaviour get rewarded (more experience points, cosmetics, ...), while those with a "negative" one might be prevented from playing ranked, get less experience points, ...

Call of Duty: Modern Warfare 2

* The game costs $20, but was released in 2009.
* "Players must be at least Level 16 to access Ranked Play", but this can be done in a couple of hours.
* Cheating results in an account-wide permaban across all Call of Duty titles.
* Banned accounts have their records purged from leaderboards.
* Players engaging in "negative" behaviours might get muted on chat/voice, ... and interestingly, cheaters are going to get paired with other cheaters in matchmaking. Players who are often playing with the same cheaters (boosting) will also get their reputation tanked.

Valorant

Its developer even published a great series of blogposts on what it calls "game health".

* The game is free-to-play, but comes with a lot of cosmetics.
* Cheaters get a permaban, but people benefiting from them might get a 6-month one as well.
* Players joining games and idling to reap experience points, doing nothing but kneecapping their team, will get penalised.
* Players are encouraged to report toxic behaviours, and to not engage, since engagement might be penalised as well.
* Players using certain words, whether in chat or as a username, will be flagged as toxic.
* Penalties come in various sizes, shapes and durations, allowing fine-tuning according to behaviour: warnings, voice/chat restrictions, reduction in experience points gain, reduction in ranked rating, increased queue waiting time, ranked game bans, global bans.
* Valorant published their approach to mitigating smurfing, acknowledging that while having multiple accounts to smurf/trade/evade bans/... is not desirable, some people are using them to play with friends at a better/worse ranked level. So while they took measures to detect and mitigate multi-accounts, they also relaxed the maximum rank difference for players to play together (see the sketch after this list), which significantly reduced alt-account usage, but also didn't alter match fairness in a measurable way.
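For illustration, here is a minimal sketch of what such a relaxed rank-spread check for premade parties could look like; the tiers, thresholds and names are invented for this post, not Riot's actual values:

```python
# Hypothetical rank tiers (0 = lowest) and spread limits.
MAX_SPREAD_SOLO = 1   # strangers must be closely matched
MAX_SPREAD_PARTY = 3  # premade groups get a relaxed limit

def may_queue_together(ranks: list[int], is_premade: bool) -> bool:
    """Allow a wider rank spread for friends queueing as a party,
    removing one incentive to create low-ranked alt accounts."""
    spread = max(ranks) - min(ranks)
    limit = MAX_SPREAD_PARTY if is_premade else MAX_SPREAD_SOLO
    return spread <= limit
```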
Conclusion

This is all nice and dandy, but is it working? According to data from Rainbow Six Siege, Valorant, Call of Duty: Modern Warfare 2, ... those measures are indeed working pretty well, and are likely providing better results than technical-only measures. They are also cheaper, since steering people away from toxic behaviours doesn't reduce the number of players as much as banning them outright.

It's nice to see that the video game industry realised that cheating and abuse/toxicity could be addressed in similar non-technical ways, and that both approaches are complementary. This is a stark contrast to other industries, where techno-solutionism is seen as the only possible remedy, even more so in our machine-learning-all-the-things era.

Sources and resources

* Anti-Cheat for Multiplayer Games
* Secret Club
* UnKnoWnCheaTs