The Unnerving Rise of Video Games that Spy on You

Players generate a wealth of revealing psychological data—and some companies are soaking it up.

Tech conglomerate Tencent caused a stir last year with the announcement that it would comply with China’s directive to incorporate facial recognition technology into its games in the country. The move was in line with China’s strict gaming regulations, which limit how much time minors can spend playing video games—an effort to curb addiction to what state media has labeled “spiritual opium.”

The state’s use of biometric data to police its population is, of course, invasive, and especially undermines the privacy of underage users—but Tencent is not the only video game company to track its players, nor is this recent case an altogether new phenomenon. All over the world, video games, one of the most widely adopted digital media forms, are installing networks of surveillance and control.

In basic terms, video games are systems that translate physical inputs—such as hand movements or gestures—into machine-readable electronic outputs. The user, by acting in ways that comply with the rules of the game and the specifications of the hardware, is parsed as data by the video game. Writing almost a decade ago, the sociologists Jennifer R. Whitson and Bart Simon argued that games are increasingly understood as systems that easily allow the reduction of human action into knowable and predictable formats.
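
To make that translation concrete, here is a minimal sketch, in Python with invented names, of how a single button press becomes a structured, loggable record:

```python
import json
import time

# Hypothetical sketch: a physical input arrives as a hardware signal
# and leaves as a structured, loggable event. The names here are
# illustrative, not drawn from any real engine or SDK.
def handle_input(player_id: str, button: str) -> dict:
    event = {
        "player": player_id,
        "action": button,          # the physical input, now a symbol
        "timestamp": time.time(),  # when it happened
    }
    # The same record that drives the game's logic can be appended
    # to a log and shipped to an analytics backend later.
    print(json.dumps(event))
    return event

handle_input("player_42", "jump")
```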

Video games, then, are a natural medium for tracking, and researchers have long argued that large data sets about players’ in-game activities are a rich resource for understanding player psychology and cognition. In one study from 2012, Nick Yee, Nicolas Ducheneaut, and Les Nelson scraped player activity data logged on the World of Warcraft Armory website—essentially a database that records everything a player’s character has done in the game (how many of a certain monster they have killed, how many times they have died, how many fish they have caught, and so on).

The researchers used this data to infer personality characteristics (in combination with data yielded through a survey). The paper suggests, for example, that there is a correlation between the survey respondents classified as more conscientious in their game-playing approach and the tendency to spend more time doing repetitive and dull in-game tasks, such as fishing. Conversely, those whose characters more often fell to death from high places were less conscientious, according to their survey responses.
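
The method is, at its core, simple correlation. A toy sketch using invented numbers, not the study’s actual data, shows the shape of the analysis:

```python
from math import sqrt

# Illustrative sketch of the correlational method: pair each player's
# logged hours of a repetitive task (say, fishing) with their survey
# conscientiousness score, then compute Pearson's r. All data invented.
fishing_hours     = [2.0, 15.5, 8.0, 30.0, 1.0, 22.0]
conscientiousness = [2.1, 4.0, 3.2, 4.6, 1.8, 4.2]  # 1-5 survey scale

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"r = {pearson_r(fishing_hours, conscientiousness):.2f}")
```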

Correlating personality with quantitative gameplay data is certainly not unproblematic. The relationship between personality, identity, and video game activity is complex and idiosyncratic; research suggests, for instance, that gamer identity intersects with gender, racial, and sexual identity. There has also been broader pushback against claims that Big Data produces new knowledge rooted in correlation. Despite this, game companies increasingly recognize the value of big data sets for gaining insight into what a player likes, how they play, what they play, what they’ll likely spend money on (in freemium games), how and when to offer the right content, and how to solicit the right kinds of player feelings.

While there are no firm numbers on how many video game companies surveil their players in-game (although, as a recent article suggests, large publishers and developers like Epic, EA, and Activision explicitly state in their license agreements that they capture user data), a new industry of firms selling middleware “data analytics” tools, often used by game developers, has sprung up. These data analytics tools promise to make users more amenable to continued consumption through data analysis at scale. Such analytics, once available only to the largest video game studios—which could hire data scientists to capture, clean, and analyze the data, and software engineers to develop in-house analytics tools—are now commonplace across the entire industry, pitched by companies like Unity, GameAnalytics, and Amazon Web Services as “accessible” tools that provide a competitive edge in a crowded marketplace. (Although, as a recent study shows, the extent to which these tools are truly “accessible” is questionable, since they require technical expertise and time to implement.)

As demand for data-driven insight has grown, so has the range of services on offer—dozens of tools have appeared in the past several years alone, providing game developers with different forms of insight. One tool—essentially Uber for playtesting—lets companies outsource quality assurance testing and provides data-driven insight into the results. Another supposedly uses AI to understand player value and maximize retention (and spending, with a focus on high spenders).
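
Every vendor’s API differs, but the basic shape of these middleware tools is broadly similar: the game emits named events, an SDK batches them, and the batches are shipped to the vendor’s servers. The sketch below is hypothetical; it mimics the general pattern rather than any real vendor’s API.

```python
import json
import time

# A minimal sketch of a telemetry SDK. Endpoint, field names, and
# batching policy are all invented for illustration.
class AnalyticsClient:
    def __init__(self, endpoint: str, batch_size: int = 3):
        self.endpoint = endpoint
        self.batch_size = batch_size
        self.buffer: list[dict] = []

    def track(self, event_name: str, **properties):
        self.buffer.append({
            "event": event_name,
            "props": properties,
            "ts": time.time(),
        })
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.buffer:
            return
        payload = json.dumps(self.buffer)
        # A real SDK would POST this batch to self.endpoint.
        print(f"POST {self.endpoint} -> {payload}")
        self.buffer.clear()

client = AnalyticsClient("https://analytics.example.com/collect")
client.track("level_start", level=3)
client.track("player_death", level=3, cause="fall")
client.track("iap_offer_shown", sku="extra_lives")
```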

Developers might use data from these middleware companies to further refine their game (players might be getting overly frustrated and dying at a particular point, indicating the game is too difficult there) or their monetization strategies (prompting in-app purchases—such as extra lives—at just such a point of difficulty). But our data is not valuable to video game companies only for fine-tuning design. Increasingly, video game companies exploit this data to capitalize on user attention through targeted advertisements. As a 2019 eMarketer report suggests, the value of video games as an advertising medium lies not just in access to large-scale audience data (such as the Unity ad network’s claim to billions of users) but in ad formats such as playable and rewarded advertisements—that is, in access to audiences more likely to pay attention to an ad.
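
The first of those uses is simple enough to sketch in hypothetical form: aggregate where players die, flag the outliers, and decide whether to soften the difficulty or surface an offer there.

```python
from collections import Counter

# Invented death logs: each entry records the checkpoint where a
# player died. In practice this would come from telemetry like the
# events sketched above.
deaths = ["level_3", "level_3", "level_1", "level_3", "level_2",
          "level_3", "level_3", "level_2", "level_3"]

death_counts = Counter(deaths)
total = sum(death_counts.values())

SPIKE_THRESHOLD = 0.4  # tunable: share of all deaths at one checkpoint

for checkpoint, count in death_counts.items():
    if count / total > SPIKE_THRESHOLD:
        # A designer might ease the difficulty here; a monetization
        # team might surface an "extra lives" offer instead.
        print(f"{checkpoint}: {count}/{total} deaths -> flag for tuning or IAP prompt")
```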

These advertisements serve numerous ends, such as facilitating user acquisition (ads for other games or apps), and increasingly, brand advertising. Similar to the approach of digital advertising giants Google and Facebook, where the data generated by platform users (clicks, swipes, likes, dislikes, purchases, movements, behaviors, interests, and so on) supposedly facilitates the placement of advertisements in front of the “right” audiences (as Unity’s executives note in a transcript of a recent quarterly earnings call), video game companies are attempting to harness the billions of interactions that take place within their games to create new revenue streams. These companies sell the eyeballs (and perhaps fingers, with playable ads) of their users to advertisers and mobilize data to best match users with advertisers based on the specifications of the advertiser or the software working on the advertiser’s behalf.
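
Stripped to its essentials, that matching logic works something like the following sketch, in which every field and threshold is invented for illustration:

```python
# Hypothetical audience matching: an advertiser specifies a target
# segment, and the ad network scores users against it using the
# behavioral data it has accumulated.
users = [
    {"id": "u1", "interests": {"strategy", "esports"}, "spender": True},
    {"id": "u2", "interests": {"puzzle"},              "spender": False},
    {"id": "u3", "interests": {"strategy", "puzzle"},  "spender": True},
]

campaign = {"wants": {"strategy"}, "high_spenders_only": True}

def matches(user: dict, spec: dict) -> bool:
    if spec["high_spenders_only"] and not user["spender"]:
        return False
    return bool(user["interests"] & spec["wants"])

audience = [u["id"] for u in users if matches(u, campaign)]
print(audience)  # ['u1', 'u3']
```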

The data-richness of video games has also had an impact beyond the video game industry’s attempts to shape player attention. The logic of games is used to gamify functions and derive information that might not have been otherwise volunteered. Indeed, Yee and colleagues’ study of World of Warcraft player motivation frames the value of correlating sentiment or personality with user activity around the growth of gamification in society. To better understand how and why people play games in certain ways is, as the authors suggest, to better understand how to make gamelike interfaces beyond the context of gaming more compelling.

For instance, the Go365 health insurance app solicited information from users—such as blood glucose levels, sleep cycles, diet, whether they drink or smoke, and wider family medical histories—using gamification logics of points and rewards to develop (more profitable) personalized insurance profiles. Those profiles identified categories of risk that could preclude some people from certain kinds of insurance or drive up their premiums.
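
The underlying logic is easy to sketch: the same submission that earns the user visible points can quietly update a risk profile on the insurer’s side. Everything below (field names, point values, weights) is invented for illustration.

```python
# What the user sees: points for sharing data. What they may not see:
# the same data adjusting their risk profile. All values are invented.
POINT_VALUES = {"sleep_log": 10, "glucose_reading": 25, "smoker_status": 50}
RISK_WEIGHTS = {"smoker_status": 1.4, "glucose_reading": 1.1}

def submit(profile: dict, kind: str, value: str) -> dict:
    profile["points"] += POINT_VALUES.get(kind, 0)  # the reward side
    if kind in RISK_WEIGHTS and value == "high_risk":
        profile["premium_multiplier"] *= RISK_WEIGHTS[kind]  # the profiling side
    return profile

profile = {"points": 0, "premium_multiplier": 1.0}
submit(profile, "smoker_status", "high_risk")
submit(profile, "sleep_log", "ok")
print(profile)  # {'points': 60, 'premium_multiplier': 1.4}
```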

A 2017 article in The New York Times revealed that Uber’s driver interface used gamification techniques like rewards and points to create a “perfectly efficient system” in which driver supply can meet rider demand. Crucially, the article revealed that Uber—employing both social and data scientists—optimized these systems to compel the continued labor that sustains the platform.

Beyond gamification techniques optimized using data, we are beginning to see gamification used to generate data about worker performance. Amazon’s warehouses are reportedly beginning to gamify labor to keep workers moving at “Amazon pace” (somewhere between walking and jogging)—a move that closely resembles the plot of an episode of the dystopian television show Black Mirror. As The Washington Post has reported, high performance in these (currently optional) games—with titles like MissionRacer, PicksInSpace, Dragon Duel, and CastleCrafter—can be exchanged for “Swag Bucks, a proprietary currency that can be used to buy Amazon logo stickers, apparel or other goods.” It is not a stretch to imagine workers being further disciplined, under the guise of gamification, through ever more invasive dataveillance that intensifies their productivity at the expense of their welfare.

Because video games are systems that translate human inputs into machine-readable data, they have been afforded important status in driving so-called Silicon Valley innovation. One area has been the application of games in the development of AI. In a kind of one-upmanship of chess-playing algorithms, Alphabet’s AlphaStar and OpenAI’s OpenAI Five were trained to play the strategy games StarCraft II and Dota 2, respectively—famously besting some of the world’s top players. To do so, these systems were trained using techniques like reinforcement learning, in which the AI essentially plays matches against itself—churning through thousands of years’ worth of gameplay (and learning from that data) within months.
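
Self-play at that scale is far beyond a few lines of code, but the core idea fits in a toy example. The sketch below uses tabular Q-learning, a much simpler technique than anything AlphaStar or OpenAI Five employed, to let an agent teach itself a one-pile version of Nim (take one to three stones; whoever takes the last stone wins) purely by playing against itself:

```python
import random
from collections import defaultdict

# One Q-table plays both sides. Updates are negamax-style: a state's
# value is the negation of the opponent's best reply.
Q = defaultdict(float)  # Q[(stones_left, stones_taken)] -> value
ACTIONS = [1, 2, 3]
ALPHA, EPSILON = 0.1, 0.1  # learning rate, exploration rate

def legal(stones: int) -> list:
    return [a for a in ACTIONS if a <= stones]

def choose(stones: int) -> int:
    if random.random() < EPSILON:  # explore occasionally
        return random.choice(legal(stones))
    return max(legal(stones), key=lambda a: Q[(stones, a)])  # else exploit

for episode in range(50_000):
    stones = 10
    while stones > 0:
        action = choose(stones)
        nxt = stones - action
        if nxt == 0:
            target = 1.0  # taking the last stone wins
        else:
            # The opponent moves next; its best outcome is our loss.
            target = -max(Q[(nxt, a)] for a in legal(nxt))
        Q[(stones, action)] += ALPHA * (target - Q[(stones, action)])
        stones = nxt

# The agent rediscovers the classic strategy: leave the opponent a
# multiple of 4 stones. (From 4 or 8 stones every move loses against
# perfect play, so the choice there is arbitrary.)
for s in range(1, 11):
    print(f"{s} stones -> take {max(legal(s), key=lambda a: Q[(s, a)])}")
```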

For these companies, learning to play video games at a high level isn’t the end goal. For a company like OpenAI, training on Dota 2 has applications to physical robotics. Darpa—the Department of Defense’s research and development arm—has sponsored efforts to use games to develop AI for military application. Gamebreaker—a project that engages both academia and industry (including defense and arms contractors like Lockheed Martin and Northrop Grumman)—aims to use video-game-playing “AI to exploit engagement models … to enable intelligent systems that could in turn enhance military strategy.”

Training AI on complex games—games that take humans thousands of hours to master—also serves to drum up support for AI, selling it to investors, policymakers, and publics as something credible amid growing criticism of exaggerated (or outright fraudulent) claims about its efficacy and veracity. If we are told that AI can master StarCraft, then we might feel a bit better about the prospect of AI driving a car, assessing debt, and so on.

More speculatively, video games are aligning with the development of new forms of embodied computing interfaces, serving as a training ground for technologies such as brain-computer interfaces (BCIs), which augment brain capabilities with computation. Valve Corporation founder Gabe Newell, whose company is an early adopter of BCI in the video game industry, suggests that BCI-enabled games, built into things like future VR headsets, could well track data points telling us whether people are happy, sad, surprised, or bored. Recently, The Financial Times reported on a series of patents granted to Meta suggesting that future augmented and virtual reality headsets (for which the company sees gaming as one major application) may leverage biometric data (such as gaze, in one patent) for purposes such as advertising. In this sense, not only can games be used to make inferences about us from our choices; the value proposition of embodied computing interfaces, from VR to AR to BCIs, is to provide access to the physiological processes underpinning those choices.

Consternation about digital technologies harvesting our data is increasingly common. Video games are by no means free of critique (see, for instance, concerns about inciting violent behavior or exposing children to gambling-like practices), but they have largely escaped such scrutiny where data and privacy are concerned. Critique of the CCP’s use of games to collect biometric data reflects an awareness of how video games might enact surveillance, but it is only one example. We need to think critically and lucidly about video games as mechanisms for extraction and accumulation.

This is not to say that we should resign ourselves to the fact that many games vacuum up our data. Alternative game-development platforms such as Twine and Bitsy show that it’s possible to resist the data-capture imperative of engines like Unity. As the video game researcher Aleena Chia and her colleagues have noted, where Unity’s large user base serves to power the company’s ad network, software like Twine uses its network of users to create a collaborative community developing games that “directly contraven[e] accepted videogame conventions.” As these platform alternatives show, what we need is to renegotiate the terms on which we play.

