
The Other Cambridge Personality Test Has Its Own Database With Millions of Facebook Profiles

Unless you’ve been hiding under a rock on Mars, you’ve likely heard about a little scandal involving Cambridge Analytica and Facebook. Cambridge Analytica got its hands on millions of people’s Facebook likes in 2014 by getting an academic, Aleksander Kogan, to design an app with a personality test that hoovered up data from the 250,000 or so Facebook users that took it, as well as from their millions of friends. Cambridge Analytica supposedly used all those likes combined with the magic of big data to put Donald Trump in the White House.
Now that Facebook CEO Mark Zuckerberg has announced, in response, a plan to “investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014,” it’s a good time to look more closely at the project that inspired Kogan and Cambridge Analytica. This whole thing wasn’t their idea, after all; they copied it from the University of Cambridge, where Kogan had been a lecturer. The U.K. university’s psychometrics department had its own personality test that had been hoovering up Facebook users’ data since 2007, but, as the New York Times reported and Kogan recently confirmed in an email, it refused to sell the dataset to the entity that became Cambridge Analytica (inspiring them to replicate the experiment).
The University of Cambridge’s personality test is still around, though, along with a database of profile information for over 6 million Facebook users. It holds those users’ psychological profiles, their likes, their music listening, their religious and political views, and their locations, among other information. The project says it can predict users’ leadership potential, personality, and “satisfaction with life.” You are supposed to be an academic to get access to all this, but the project’s page says the database has also been used to personalize Hilton’s apps; to recommend jobs to people; to target ads; and to create an interactive promo for the video game Watch Dogs 2. (Ironically, that video game is about being surveilled by companies and Big Brother.)
The app and its database got attention a few years ago for being a very accurate way to predict Facebook users’ personalities, but it didn’t get much scrutiny beyond that; the Washington Post summed it up as “a good party trick.” But this project laid the groundwork for the Facebook-Cambridge Analytica scandal to come.
The most recent paper from the two primary academics behind the project, David Stillwell and Michal Kosinski (who is now at Stanford), is titled “Psychological targeting as an effective approach to digital mass persuasion.” In it, they detail how they used the data about likes from their personality tests to predict which women on Facebook were more extroverted or more introverted, so that they could more effectively sell them beauty products. “Together, the campaign reached 3,129,993 users, attracted 10,346 clicks, and resulted in 390 purchases on the beauty retailer’s website,” write the researchers. “Users were more likely to purchase after viewing an ad that matched their personality.”
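For a sense of scale, the figures the researchers quote work out to quite small rates. A quick back-of-the-envelope calculation (the three input numbers come from the paper as quoted above; the variable names and the rate definitions are my own):

```python
# Figures quoted from the Stillwell/Kosinski persuasion paper
reach = 3_129_993      # users the campaign reached
clicks = 10_346        # clicks the ads attracted
purchases = 390        # purchases on the retailer's website

# Standard ad-campaign rates (my framing, not the paper's)
click_through_rate = clicks / reach    # share of reached users who clicked
conversion_rate = purchases / clicks   # share of clickers who then bought

print(f"Click-through rate: {click_through_rate:.2%}")
print(f"Conversion rate:    {conversion_rate:.2%}")
```

So roughly one in three hundred users clicked, and under four percent of clickers bought anything; the paper’s claim is about the relative lift from personality-matched ads, not about large absolute numbers.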
Beyond the ethics of capturing the Facebook profiles of millions of users by offering them a personality test, the commercial use of that data contradicts an impression that the database was intended for academic purposes only. Michal Kosinski did not respond to an inquiry about this, but David Stillwell did. Stillwell first launched the app as a side project called “myPersonality” before he started graduate school. He told me by email that this latest paper was necessary “once the revelations of the Trump campaign’s methods came out,” and that it meant “clear academic questions [were] being answered, as well as answering an important societal question.”
“In running these studies, no profit was made by the Psychometrics Centre and none of the researchers were paid for their involvement,” he wrote by email. “The Psychometrics Centre had full control of the ads run to carry out the research and of the experimental design.”
So they did help a beauty company sell products but they didn’t profit from it.
According to an email Aleksander Kogan wrote about the inception of Cambridge Analytica, Stillwell declined to give his data to SCL Group—which formed Cambridge Analytica—because when Stillwell “collected the data, he told his users it was going to be for academic research.” Stillwell and Kosinski had still planned to help SCL Group with personality predictions but were “removed from the project because of a disagreement on monetary compensation,” according to Kogan’s email, which was published in The Outline .
Stillwell and Kosinski’s persuasion paper, published in November 2017, acknowledged the political scandal embroiling its bastard child Cambridge Analytica, without mentioning that it had inspired that firm’s techniques. “The capacity to implement psychological mass persuasion in the real world carries both opportunities and ethical challenges,” the academics wrote. “In fact, recent media reports suggest that one of the 2016 US presidential campaigns used psychological profiles of millions of US citizens to suppress their votes and keep them away from the ballots on election day. The veracity of this news story is uncertain. However, it illustrates clearly how psychological mass persuasion could be abused to manipulate people to behave in ways that are neither in their best interest nor in the best interest of society.”
As Facebook reevaluates apps that got their hands on tremendous amounts of Facebook data, this project deserves special attention. While Facebook is the greatest ongoing social science experiment of all time, and that data is certainly valuable to academics in many ways, did the users involved truly and knowingly consent to participation in studies into whether they get more jealous after they change their Facebook status to “In a Relationship”? Facebook had nothing specific to say about the University of Cambridge database; a spokesperson simply pointed me to the company’s “Hard Questions” post from Wednesday.
A wiki for the project has the consent forms that users would have seen at the time. A disclaimer says that a user can remove their data from the application by stopping its use, but because the original app is no longer available, and the users are anonymized, that is no longer the case.