ALA: Facebook’s Data Mining
Facebook’s Virtual Reality Gamble Is (Again) About Data Mining
Oculus Rift acquisition will allow Facebook to track and monetize user behavior in yet another domain.
In Los Angeles, a journalist has built a virtual reality simulation that re-creates a night in 2010 when a Mexican immigrant was beaten and killed by the U.S. Border Patrol.
At a hacker convention in Las Vegas, an interactive theater troupe is running a game in which players wearing special headsets must navigate a maze while looking down at their own physical bodies from a bird’s eye view.
In Madrid, artists are using virtual reality technology for a project that allows participants to “switch bodies” with members of the opposite gender.
If you get the opportunity to use a virtual reality headset such as the Oculus Rift — and chances are you will, very soon — these are just a few of the intense, disorienting and profound experiences that await you. The Rift is shipping out to early adopters this month after a Kickstarter campaign raised money for the original prototype, and virtual reality enthusiasts see the technology as poised to finally enter the mainstream.
But for all its promise, the virtual reality revolution is haunted by the specter of its newest and most intimidating stakeholder: Facebook.
The social media giant’s $2 billion buyout of Oculus VR in March came as a shock to the gaming world. More than 9,000 fans had contributed $2.4 million to crowdfund the prototype device on Kickstarter, and around 12,000 people had already preordered the $350 development model. Facebook CEO Mark Zuckerberg said the Rift would enable “many other experiences” besides games and seemed to describe the move as part of the next evolution of communications platforms. “Virtual reality was once the dream of science fiction,” he wrote in a tantalizing blog post. “But the Internet was also once a dream, and so were computers and smartphones.”
The backlash was immediate. Most of Oculus’ crowdfunders assumed they were supporting a new gaming platform — not pumping its value for acquisition by the same company that lets “friends” spam you with invites to play “Farmville.” Markus “Notch” Persson, an early Oculus backer and creator of the hugely popular “Minecraft,” announced on Twitter that he was canceling the upcoming version of his hit game for the Oculus Rift. “Facebook is not a company of grass-roots tech enthusiasts … Facebook has a history of caring about building user numbers, and nothing but building user numbers,” he wrote.
The acquisition took on a more ominous tinge recently, when it was revealed that Facebook had secretly conducted a social experiment on hundreds of thousands of unwitting users. As part of a study in “massive-scale emotional contagion,” the company altered nearly 700,000 news feeds to see if doing so changed the mood of the content those users decided to post.
That Facebook performed the experiment without users’ consent and in conflict with its own terms of service was at once incendiary and unsurprising. Academics and commentators argued that the study was unethical, if not illegal. Kate Crawford, a prominent researcher at Microsoft, described it aptly as “a symptom of a much wider failure to think about ethics, power and consent on [online] platforms.”
To anyone familiar with its business model, the reason Facebook would want data on manipulating users’ emotions should be obvious. It’s the same reason it wants a hand in virtual reality platforms such as the Oculus Rift, and it’s also the reason the company has been moving in on social location apps, health tracking and the “quantified self” trend of turning one’s lifestyle into a report card of easily digestible numbers.
To Facebook, any and all domains of human experience should be accessible for capture and monetization.
This is particularly the case as Internet-connected devices such as Nest (a smart thermostat recently acquired by Google), Fitbit (a wristband that tracks your exercise regimen) and Google Glass make their way into our homes and onto our bodies, recording everything from our pulses to the temperature of our living rooms. The Zuckerbergian view (and, more broadly, that of the Silicon Valley church of Big Data) is one of data totalitarianism: There is no dimension of human activity that shouldn’t be exploited, because where there are sensors, there’s data to be mined and money to be made.
By establishing itself in VR, Facebook is likely setting itself up to harvest and experiment with intimate data from that domain as well. Unlike a website, where user activity is measured in relatively primitive ways, such as clicks and scrolls, a simulated environment allows for much more sophisticated monitoring of a user’s every action. The head- and iris-tracking features that make gaming technologies like the Oculus Rift so immersive are treasure troves of data for a company like Facebook, giving it the ability to see exactly what you’re looking at, and for how long.
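To make the concern concrete: even a crude aggregation of gaze data reveals attention patterns. The sketch below, with an invented sampling format and hypothetical object names, shows how raw per-sample gaze targets could be rolled up into per-object "dwell time" — exactly the kind of metric an advertiser would want.

```python
# Hypothetical sketch of dwell-time aggregation from gaze tracking.
# The sampling format and object names are invented for illustration;
# real VR eye-tracking pipelines are more complex.
from collections import defaultdict

def dwell_times(gaze_samples, sample_interval_ms=10):
    """Total time (ms) spent looking at each object.

    gaze_samples: sequence of object IDs, one per tracker sample
    (e.g., a 100 Hz tracker yields one sample every 10 ms).
    """
    totals = defaultdict(int)
    for obj in gaze_samples:
        totals[obj] += sample_interval_ms
    return dict(totals)

# Half a second of simulated play on a 100 Hz tracker:
samples = ["ad_billboard"] * 30 + ["doorway"] * 15 + ["ad_billboard"] * 5
print(dwell_times(samples))
# {'ad_billboard': 350, 'doorway': 150}
```

A few dozen lines suffice to turn raw sensor readings into a per-user attention profile — which is the point: the barrier is not technical but a matter of corporate intent.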
From there, it takes little imagination to envision how a corporation could manipulate users in ways that make Facebook’s timeline scandal look benevolent. Seeing someone’s Web search history is one thing; having access to her visual senses, even in a simulated space, suggests something closer to reading her mind. And unlike the things one looks at through wearable technology such as Google Glass, the code-based objects and places one interacts with in a virtual space exist — by default — as trackable data.
To be fair, as my former colleague Adi Robertson points out, we’ve been through much of this debate before. In the 1980s and ’90s, the first iteration of virtual reality triggered cultural anxieties about mind control and addiction, from the idea that VR would become “electronic LSD” to the fear that kids would be brainwashed or driven to violence by sex and murder simulations. But amid all this dubious conjecture, almost no one predicted the more tangible threat growing out of the Web’s business model: Big Data surveillance of user habits, patterns and preferences.
As is standard in the privacy debate, many will shrug their shoulders at such intrusions. So what if Facebook is watching as I load up a cliff-diving simulation, measuring my pulse as I fall off the edge, or if Google records my eye movements in “Grand Theft Auto” to help create targeted ads?
But there is an argument to be made that video games — like the Internet — should give us the opportunity to experiment with our concepts of self on our own terms.
In “The Machine to Be Another,” a group of artists uses the Oculus headset to allow participants to enter the bodies of strangers, especially those of the opposite gender. The creators describe it as an “art investigation” of “gender identity, queer theory, intimacy and mutual respect.” Another effort called “The Pretender Project” goes even further. It combines VR headsets with motion cameras and a form of low-voltage shock therapy to imagine a world where users can enter and control another’s body — a high-tech approximation of Spike Jonze’s 1999 film “Being John Malkovich.”
By enabling anonymous interactions and letting us try on different characters or personae, games empower us to explore who we are and what we’d like to become. They provide a safe space where we can run our own experiments, interrogating our identity and reality away from prying eyes. But how safe can these spaces be if they are under the influence of data-driven corporations? Will the social media companies shepherding us toward named online identities that carry our permanent records link those names to every voluntary and involuntary action we take in simulated spaces as well, potentially adding to the troves of personal data that we now know governments regularly access with the companies’ assistance?
There’s no way to know for certain; Facebook hasn’t revealed much about its plans beyond the hazy vignettes offered by Zuckerberg. Oculus CEO Palmer Luckey has tried to reassure fans that “you won’t need to log into your Facebook account every time you wanna use the Oculus Rift.” But this seems like vouching for the fox that guards the henhouse. While Facebook has so far been relatively hands-off with its other recent acquisitions, such as the messaging service WhatsApp, a cursory look at its history of bait-and-switch privacy debacles strongly suggests the Oculus Rift could become another way for the company to mine and monetize personal data. Besides, for Facebook or another data-driven company not to use such a platform to track user behavior wouldn’t just be out of character; it would be a bad investment.
Once again, the result for consumers is that they may be asked to pay twice for new technology: first with their money, then with their data.
The lesson of Facebook’s social experiment isn’t that Facebook is uniquely evil; it’s that any company whose modus operandi involves collecting vast amounts of data will use that data however it sees fit — often without users’ knowledge or consent. The excitement surrounding the Oculus Rift promises a revolution in how we interact, empathize and connect with one another. But to make those connections truly revolutionary, we need them to run on open platforms, free from the influence of corporations that have a strong incentive to surveil our experiences and twist them to their own ends — behind closed doors and inside black boxes.