Cambridge Analytica and Facebook Scandal

Short Title: Cambridge Analytica Compiled Facebook User Data to Target American Voters
Location: Global
Date: 2016
Solove Harm: Aggregation, Secondary Use, Surveillance, Decisional Interference, Insecurity
Threat Actors: Cambridge Analytica Ltd., Trump campaign team, Facebook
Affected Individuals: Users of Facebook, American voters
Tangible Harms: Change of Behavior

Cambridge Analytica combined the personal information of Facebook users during the 2016 United States elections in order to build a program that could predict and influence voters' choices at the ballot box.

Description

Cambridge Analytica Ltd. was a British political consulting firm that developed a model to influence people's political choices through targeted advertising on Facebook. The model was built on the personal information of Facebook users. Cambridge Analytica was hired by the Trump campaign team in 2016.

The model Cambridge Analytica used was built on data obtained through the myPersonality app, a 100-question quiz presented to users as a personality test. Many respondents who took the quiz authorized Cambridge Analytica to access their own profile data as well as that of their friends' networks. This allowed the company to assess a person's openness, conscientiousness, extroversion, agreeableness, and neuroticism. Combining these different categories of personal information is an example of Aggregation.
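How such aggregation works can be illustrated with a minimal sketch. Everything below is hypothetical (the data structures, field names, and values are invented); it only shows the idea of attaching quiz-derived "OCEAN" scores to separately collected profile data in a single record:

# Minimal sketch of Aggregation: attaching quiz-derived Big Five ("OCEAN")
# scores to separately collected Facebook profile data.
# All field names and values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class QuizResult:
    """Personality scores inferred from the 100-question quiz."""
    openness: float
    conscientiousness: float
    extroversion: float
    agreeableness: float
    neuroticism: float

@dataclass
class FacebookProfile:
    """Profile data the app was authorized to read."""
    user_id: str
    location: str
    likes: list[str] = field(default_factory=list)

def aggregate(profile: FacebookProfile, quiz: QuizResult) -> dict:
    """Combine two independently collected data sets about the same person."""
    return {
        "user_id": profile.user_id,
        "location": profile.location,
        "likes": profile.likes,
        "ocean": vars(quiz),  # personality scores now linked to the profile
    }

print(aggregate(FacebookProfile("u123", "Ohio", ["hiking", "local news"]),
                QuizResult(0.7, 0.4, 0.6, 0.5, 0.3)))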

Cambridge Analytica then turned to Aleksandr Kogan, a psychology researcher at Cambridge University, who developed a similar app, thisisyourdigitallife, which was used to harvest data from more than 50 million Facebook profiles. The users who authorized the app to access their data were told that their information was being used for academic research. Using this data instead to target users with political ads and influence their choices is an example of Secondary Use.
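The scale of the harvest followed from the friends' data permission: each consenting quiz-taker exposed not just their own profile but those of their friends, none of whom had consented. A small sketch of that expansion, using an invented friend graph:

# Sketch of how a small number of consenting users exposes a much larger
# set of profiles once an app is also granted friends' data.
# The friend graph is purely illustrative.
friends = {
    "alice": ["bob", "carol", "dave"],
    "bob": ["alice", "erin"],
    "frank": ["grace", "heidi", "erin"],
}

consenting_users = ["alice", "frank"]  # only these two installed the quiz app

harvested = set(consenting_users)
for user in consenting_users:
    harvested.update(friends.get(user, []))  # friends never consented

print(f"{len(consenting_users)} consented, {len(harvested)} profiles harvested")
# prints: 2 consented, 8 profiles harvested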

The data the app was able to access included users' entire Facebook profile information as well as information from their friends' networks. This included a person's "likes", which could be used to observe and analyze their preferences. Facebook watching user behavior through "likes" can be seen as Surveillance.
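How "likes" translate into an analysis of someone's preferences can be sketched as a simple scoring step. The weights below are invented for illustration and stand in for whatever statistical model was actually fitted on the quiz responses:

# Illustrative only: a toy mapping from page "likes" to an estimated trait.
# The real models were fitted statistically; these weights are invented.
LIKE_WEIGHTS = {            # like -> contribution to an "openness" estimate
    "modern art": 0.8,
    "philosophy": 0.7,
    "nascar": -0.2,
    "country music": -0.1,
}

def estimate_openness(likes: list[str]) -> float:
    """Average the weights of known likes; unknown likes are ignored."""
    known = [LIKE_WEIGHTS[like] for like in likes if like in LIKE_WEIGHTS]
    return sum(known) / len(known) if known else 0.0

print(estimate_openness(["modern art", "philosophy", "nascar"]))  # ~0.43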

What allowed Cambridge Analytica to harvest the personal information of users on an unprecedented scale was described by the media and by Facebook as a data breach (Insecurity). This was reported as the largest known leak in Facebook's history. However, Facebook failed to notify users even after it learned of the breach and did not take the measures necessary to recover and secure the private information of more than 50 million individuals.

Cambridge Analytica exploited this insecurity to harvest the personal information of users in order to build a model that would predict individuals' choices. This allowed the company to better target advertising to users and influence their political preferences during the presidential election. This is an example of Decisional Interference.
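The interference itself can be sketched as choosing an ad framing from the predicted trait profile. The thresholds and ad copy below are hypothetical; they only illustrate how a personality score could steer which message a voter sees:

# Hypothetical sketch of trait-based ad targeting: the same political message
# is framed differently depending on the predicted personality profile.
# Thresholds and copy are invented for illustration.
def pick_ad_variant(ocean: dict) -> str:
    if ocean["neuroticism"] > 0.6:
        return "fear-framed ad: emphasize threats and security"
    if ocean["openness"] > 0.6:
        return "change-framed ad: emphasize reform and new ideas"
    return "tradition-framed ad: emphasize stability and community"

voter = {"openness": 0.7, "conscientiousness": 0.4, "extroversion": 0.6,
         "agreeableness": 0.5, "neuroticism": 0.3}
print(pick_ad_variant(voter))  # change-framed ad: emphasize reform and new ideas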

Cambridge Analytica said it had destroyed the user information it had collected from Facebook. But raw data reviewed by the media suggests that the information, or copies of it, may still exist.

Cambridge Analytica's actions and its connection with the Trump campaign raised urgent questions about Facebook's role in targeting voters in the US presidential election.

Both companies faced a backlash after the scandal. In 2018, Facebook announced a forensic audit of Cambridge Analytica, banned myPersonality for improper data controls, and suspended hundreds of other apps.

Laws and Regulations

Sources

https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html
https://www.nytimes.com/2018/03/20/technology/facebook-cambridge-behavior-model.html
https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html
https://www.theguardian.com/technology/2019/mar/17/the-cambridge-analytica-scandal-changed-the-world-but-it-didnt-change-facebook