TestEventQuery
Query Information and Consequences for "Instagram Sliders"
Event | Information | Consequences
---|---|---
Instagram Sliders | Preference Social Network | Changed Behavior Ostracism Embarrassment
Query Information and Consequences for everything
All fields in the table
Event | Short Title | Location | Date | Taxonomy | Personal Information | Threat Actors | Affected Individuals | High Risk Groups | Secondary Consequences | Summary
---|---|---|---|---|---|---|---|---|---|---
"Cerebral's FTC Settlement: A Landmark Case on Consumer Privacy and Data Misuse in Telehealth" | Cerebral FTC Settlement: Privacy Violations in Telehealth | United States of America | 2024 | Secondary Use Interrogation Disclosure Exposure | Demographic Names Addresses Email addresses PHI Date of Birth | LinkedIn TikTok Snapchat Former employees Cerebral | Patients seeking mental health services from Cerebral. | Individuals with sensitive medical conditions Individuals Whose Prescription Information Was Shared Individuals Concerned with Anonymity | Anxiety Loss of Trust Loss of Privacy Health Risks Privacy Violations | Cerebral, a telehealth provider for mental health, faced major litigation for violating privacy protections. The FTC ordered Cerebral to pay more than $7 million over charges that it shared sensitive data belonging to approximately 3.2 million users with third-party advertisers, including TikTok and Snapchat, without authorization. This sensitive information included names, birth dates, addresses, medical information, and insurance details. Cerebral was also accused of letting former employees access medical records and of failing to protect user data. Beyond the monetary penalties, the settlement required operational changes: deletion of unnecessary data and institution of a new data retention schedule. The FTC further ordered Cerebral to provide easier ways for customers to cancel services and permanently barred it from using health information for marketing. The case underscores the need for compliance with privacy laws when handling sensitive health information. |
1Health.io and its Inadequate Protection of Genetic Data and Unfair Privacy Policy | FTC charges 1Health.io for Consumer Protection and Privacy | 16 June 2023 | Insecurity Disclosure Increased Accessibility | 1Health.io | Consumers of 1Health.io Inc. | Consumers | Employment Loss of Trust Change of feelings or perception | The FTC charged 1Health.io with misleading consumers about how it handled and protected sensitive personal genetic information, bringing charges of unfair practices under the FTC Act. | ||
2018 Marriott Breach | Marriott Data Breach 2018 | Global | 30 November 2018 | Information Processing Information Dissemination Information Collection Invasion | Names Addresses Passport numbers Email addresses Payment card information | Unidentified hackers | Approximately 500 million Marriott customers who had stayed at Starwood hotels and resorts | Individuals who had sensitive personal information exposed | Potential for identity theft Financial fraud | A security breach impacting Marriott International's Starwood guest reservation database was disclosed in November 2018. About 500 million customers' personal information, including names, addresses, passport numbers, email addresses, and some payment card information, was exposed as a result of the breach, which had been ongoing since 2014. Unknown hackers were responsible for the intrusion. It triggered investigations under the GDPR and various US state data breach notification laws. Along with the reputational damage to Marriott, the incident could have led to financial fraud and identity theft. |
45 Million Medical Files Leaked | Leaky Databases Expose over 45 Million Medical Images and Patient Data | Global | December 2020 | Insecurity | Medical and Health | Healthcare providers | Patients | Medical Patient | More than 45 million unique medical images were found exposed on over 2,140 unprotected servers across 67 countries, including the US, UK, and Germany. | |
A Lifeguard is Forced to Wear A Speedo Instead of a Thigh-Covering Swimsuit | A 61-Year-Old Lifeguard is Forced to Wear A Speedo Instead of a Preferred Thigh-Covering Swimsuit | New York, New York | 2007 | Exposure | Physical Characteristics | New York State Office of Parks Recreation and Historic Preservation | A lifeguard | A 61-year-old lifeguard was forced to wear a Speedo instead of his preferred thigh-covering swimsuit for a swimming test. | ||
A Man Gets Arrested After Facebook Translated His Post Wrongly | Facebook Translates 'Good Morning' Into 'Attack Them' Leading to Arrest | Palestine | 2017 | Intrusion Distortion | Identifying Communication Social Network | Facebook Law Enforcement | A man | Ethnic Minority | A Palestinian man was questioned by Israeli police after Facebook mistranslated the caption under a photo of him leaning against a bulldozer. | |
A Photo of a Woman Crying After Florida School Shooting | The Woman from Famous Photo After School Shooting Says She Hates It | Florida | February 2018 | Secondary Use Surveillance Exposure Appropriation | Identifying Public Life Family Physical Characteristics | Photographer Media | Two mothers | Crime Victims Females | Anxiety Inconvenience | A photographer took a photo of a grieving mother after the Florida school shooting. The woman in the picture says she hates it. |
A sweeping drug addiction risk algorithm | A sweeping drug addiction risk algorithm | United States | July 2020 | Decisional Interference | Doctors Pharmacists Hospital | Patients | Females Medical Patient | This article centers on medical patients and vulnerable people, such as victims of sexual abuse, who were denied intravenous opioid medication because of drug addiction risk scores from the NarxCare database, which has played an increasingly central role in the United States' response to its overdose crisis. It also presents the views of victims, scholars, and data specialists on the effects of NarxCare database usage. | ||
AI Accuses Wrong Person of Police Brutality in Belarus | Belarusian Software Developed for Deanonymisation of Police Officers Beating Up Protesters Identified the Wrong Person | Belarus | September 2020 | Distortion | Identifying Professional Physical Characteristics | Software developers | An ex-police officer | Law Enforcement Officer | Anxiety Loss of Trust | A wrong person was identified by an AI as the one beating up a protester on the streets of Minsk, Belarus. |
AI Being Used to Spread Misinformation and How It Can Affect the Presidential Election | AI: The Spread of Misinformation and the Election | United States of America | July 2023 | Decisional Interference Distortion | Public Life Preference | Deepfake creators OpenAI Artificial Intelligence Internet Users | Politicians Government Position Candidates Public Figures Influencers Voters | Voters Candidate | Loss of Trust | Artificial intelligence is being used to generate and spread misinformation, which can affect the election. |
AI Deepfake Technology Creates Fake Video of Naked Celebrities | AI Deepfake Show Naked Celebrities | Twitter/Other places on the internet | 29 January 2024 | Intrusion Exposure Appropriation Distortion | Identifying Public Life Sexual Physical Characteristics | Very Low | Taylor Swift | Celebrities Women | Cyberbullying and Online Harassment Unwanted Explicit Material (Cyberflashing) Psychological Stress | In 2023, new technology emerged that allows anyone to create "deepfakes": fabricated images or audio of people, mostly famous individuals, depicting them however the creator pleases. In this instance, users on the platform known as Twitter or X created deepfakes of the singer Taylor Swift and posted them to the platform. These sexually explicit images quickly gained prominence, portraying Swift in ways most of her fans would not expect to see her. The photos gained over 27 million views and spread to various other social media platforms. |
AI Impersonation Through Deepfakes Troubles Bollywood | Deepfake struck Bollywood | India | November 2023 | Aggregation Disclosure Appropriation Distortion | Public Life Social Network Physical Characteristics | Fan Journalists Activists | Bollywood celebrities | Celebrities | Embarrassment Anxiety Inconvenience | The spread of deepfake technology has emerged as a disturbing threat for Bollywood celebrities, as seen in recent cases of fake videos surfacing online without consent. This has prompted legal action from stars and greater interest in contractual protections against AI misuse, though India still lacks regulations to specifically address deepfakes. |
AI Pandemic | AI Pandemic | High schools in New Jersey and suburban Seattle, Washington. | Insecurity Increased Accessibility Blackmail | Sexual Physical Characteristics Interactions with AI | Individual Deepfake creators | The affected individuals are the teenage girls whose images were used to create explicit AI-generated content without their consent. Additionally, the mother and her 14-year-old daughter who are advocating for better protections are indirectly affected. | Women Children and Vulnerable Individuals | Embarrassment Reputational Damage Emotional distress | In high schools in New Jersey and the Seattle suburbs, artificial intelligence (AI) was used to create and distribute pictures of teenage girls in their underwear. According to independent researcher Genevieve Oh, over 143,000 new deepfake videos have been uploaded online this year. These cases highlight the growing problem of explicit AI-generated content. | |
AI now has the ability to identify passwords based on the sounds of the keys being pressed | AI has the ability to identify passwords based on the sounds of the keys being pressed | United States of America | 8 August 2023 | Intrusion Secondary Use | Authenticating | AI People using the AI | People who use platforms such as Zoom | People | A study published in the IEEE European Symposium on Security and Privacy Workshops showed how a system could identify which keys were pressed based on sound alone. The researchers pressed 35 keys of a MacBook Pro, both letters and numbers, 25 times each using different pressures and fingers, recording the sound both over Zoom and with a smartphone. After training, the system assigned the correct key to the sound with 95% accuracy for the phone recordings and 93% for Zoom. | |
AIRprint Fingerprint Recognition From a Distance | AIRprint Allows Fingerprint Recognition From a Distance | United States | 2011 | Identification | Identifying Physical Characteristics | Advanced Optical Systems Inc. The United States Marine Corps | Individuals entering military installations | Advanced Optical Systems Inc. has developed a technology that allows identifying individuals by scanning their fingerprints from a distance of two meters. | ||
Abortion Rights in Indonesia | Abortion in Indonesia is Punished With Jail and Fines | Indonesia | October 2020 | Decisional Interference | Medical and Health Family | Indonesian government | Women in Indonesia | Females Pregnant Women | Financial Cost Incarceration | Abortion in Indonesia is illegal, with two exceptions: a medical emergency, where the woman also has permission from her husband or family, or a pregnancy resulting from rape. |
Abusers Using Smartphones to Track Victims | Smartphones Can Be Used By Abusers to Track and Control Their Victims | United States | September 2014 | Surveillance | Identifying Computer Device Location | Abusers | Domestic abuse victims | Crime Victims Females Victim | Changed Behavior Anxiety | Using GPS and spyware, abusers often track their victims' smartphones to eavesdrop on conversations and monitor the victim's location. |
Access to Face Recognition Search of Moscow Cameras System Is For Sale | In Moscow, Access to Face Recognition Search of the City Camera System Can Be Bought For $200 | Moscow, Russia | November 2020 | Secondary Use Surveillance Disclosure | Identifying Contact Behavioral Location | Moscow's Department of Technology Moscow authorities | People in Moscow | Females | In Russia, $200 buys access to a face surveillance search across the government's 105,000-camera system using a photo of anyone you want, no questions asked. | |
AccuWeather App Stealing Location Data | AccuWeather App Collecting and Sharing User Data without Consent | Global | August 2017 | Secondary Use Surveillance Exclusion Breach of Confidentiality Increased Accessibility | Identifying Computer Device Location | AccuWeather App | iOS Users of AccuWeather App | In August 2017 AccuWeather was caught collecting user location data, even when location sharing was off. | ||
Advertising Agency Uses Geofencing to Target Women Near Abortion Centers | Advertising Agency Copley Uses Geofencing to Target Women Near Reproductive Health Facilities With Unwanted Ads | Massachusetts | 2017 | Intrusion Surveillance Aggregation | Medical and Health Computer Device Preference Behavioral Location | Copley Advertising LLC | Women entering reproductive health facilities in Massachusetts | Pregnant Women | In an advertising campaign, Copley Advertising LLC used geofencing in order to tag the device IDs of women entering the geofenced area and then serve them anti-abortion ads. | |
Airport Facial Recognition Systems | Facial Scans at the Airports Use Biometric Data to Identify Individuals | Global | 2018 | Decisional Interference Identification | Identifying Physical Characteristics | Customs and Border Protection Airlines using face recognition | Travellers from or to the airports with facial recognition systems | Starting from 2018 at airports around the world, there has been significant growth in the use of biometric facial scans to verify travelers’ identities. | ||
Al Jazeera Journalists' Phones Hacked by NSO's Spyware | Al Jazeera Journalists' iPhones Hacked Using Spyware From Israeli Security Company NSO Group | Global | December 2020 | Interrogation | Communication Computer Device | NSO Group | Al Jazeera Journalists | Journalist | Sophisticated spyware was used to hack the phones of 36 Al Jazeera journalists. The hack could be traced back to software made by Israeli security company NSO Group. | |
Alexa Installed In Apartments Before People Move In | Property Management Installs Amazon’s Alexa Into Apartments Before Tenants Move In | Global | September 2020 | Surveillance Exclusion | Communication | Amazon | Landlords and property management | Amazon announced a new program making it cheaper for property managers to install Alexa into rental units before people even move in. | ||
Allstate Data Privacy Lawsuit: Texas AG Enforcement Action | Allstate Data Privacy Lawsuit: Texas AG Enforcement Action | Texas, United States of America | January 2025 | Decisional Interference Intrusion Secondary Use Surveillance Aggregation | Geolocation Driving Habits Movement | Allstate Arity | 45 million Americans | Users of connected vehicles | Infringement on privacy rights Loss of control over personal data Increased insurance premiums | The Texas Attorney General sued Allstate and its subsidiaries for secretly collecting and selling the driving data of millions of Americans, which violated the Texas Data Privacy and Security Act (TDPSA), the Texas Data Broker Law, and the Texas Insurance Code. This is the first lawsuit to enforce a state comprehensive data privacy law. |
Allstate’s Insurance Pricing Algorithm Based on Non-Risk Factors | Allstate’s Car Insurance Charged the Big Spenders Even Higher Rates | United States | 2013 | Aggregation Exclusion | Identifying Transactional | The Allstate Corporation | Middle-aged customers of Allstate in at least 10 States who were paying the highest premiums | Financial Cost | The analysis of Allstate's pricing algorithm revealed that it had built a “suckers list” charging the big spenders even higher rates: one of the most significant factors correlated with a policyholder's proposed price shift was not a risk factor but their transactional information. | |
Amazon Agrees to Pay $25 Million Fine to Settle Allegations Alexa Voice Assistant Violated Children’s Privacy Law | Amazon Agrees to Pay $25 Million Fine to Settle Allegations Alexa Voice Assistant Violated Children’s Privacy Law | United States | 20 July 2023 | Intrusion Surveillance Collection | Behavioral Some personal info | Amazon | Children | Children | Amazon agreed to pay a $25 million civil penalty as part of a settlement with the Justice Department and the FTC to resolve allegations the ecommerce giant’s Alexa voice assistant violated a U.S. children’s privacy law, the DOJ announced. Under the terms of the settlement, Amazon also is required to change its practices relating to the alleged violations and inform consumers of its practices. | |
Amazon Extending Prime into Whole Foods | Amazon.com Inc. Acquires Whole Foods Market And All the Customer Data | United States | 2017 | Surveillance Aggregation | Identifying Contact Computer Device Preference Behavioral Location Transactional | Whole Foods Market Inc. Amazon | Amazon Prime users Whole Foods Market Customers | In 2017 Amazon extended Prime to Whole Foods Market, gaining access to the grocery purchase information of Whole Foods customers. | ||
Amazon Gives Access to Alexa Recordings to a Wrong User | Amazon Gives Access to Alexa Recordings by Mistake | Germany | 2018 | Insecurity Identification | Identifying Communication | Amazon | One user of Alexa | A user of Amazon's Alexa voice assistant was mistakenly sent another user's audio recordings by Amazon. | ||
Amazon Monitors Employees' Private Facebook Groups | Amazon Has a Sophisticated Program to Monitor and Analyze Employees' Private Facebook Groups | Global | September 2020 | Surveillance | Identifying Computer Device Behavioral | Amazon | Amazon employees | Employees | Amazon was found to have a complex program to track its employees' behavior, specifically their closed Facebook groups. | |
Amazon Sidewalk: The Privacy Risks of Automatic Enrollment in Shared Networks | Amazon Sidewalk Privacy Concerns | Global (Primarily in the U.S.) | 2023 | Decisional Interference Intrusion Insecurity | Location Information Device Connection Status Internet Usage Data | Amazon Inc | Amazon device users (Echo, Ring, Tile) who were automatically enrolled in Sidewalk without their explicit consent | Privacy-Conscious Users Individuals concerned about online privacy Individuals unaware of the opt-out option | Anxiety Inconvenience Potential Misuse of Data Infringement on privacy rights Loss of control over personal data Potential network security risks | Amazon’s expansion of its Sidewalk network in 2023 raised privacy concerns because it automatically enrolled users without their consent. The network allows devices like Echo and Ring to share internet bandwidth, creating a mesh network. This triggered worries about data security, potential tracking, and the loss of user control over personal data. While Amazon claims it uses strong security measures, the automatic enrollment and shared nature of the network raised significant privacy and security concerns. |
Amazon Violates Child Privacy Law | Amazon Violates Child Privacy Law and Pays $31 million | Washington, US | 1 June 2023 | Surveillance Aggregation | Location Children's voice recordings | Amazon Inc | Children under 13 years old | Children | Amazon pays a $25 million civil penalty to settle allegations by the Federal Trade Commission (FTC) that it violated the Children's Online Privacy Protection Act (COPPA). Amazon was accused of deceiving parents by retaining children's voice recordings and location data collected by its Alexa voice assistant for an extended period. Furthermore, Amazon is also required to pay $5.8 million in customer refunds related to privacy violations involving its Ring doorbell camera. As part of the agreement, Amazon is mandated to overhaul its data deletion practices, implement stricter and more transparent privacy measures, and delete specific data collected by its Alexa digital assistant. The FTC emphasized that Amazon's actions violated COPPA and compromised privacy for financial gain. | |
Amazon heavily fined for invasion of privacy and insecurity | Amazon fined for privacy violation | United States | 1 June 2023 | Intrusion Secondary Use Surveillance Insecurity Exclusion Breach of Confidentiality | Family Some personal info | Amazon Hackers | Consumers of Amazon products (Alexa and Ring) | Children | Embarrassment Loss of Trust Threats Exploitation | Amazon violated users' privacy by not disclosing its data collection and by breaching confidentiality. |
Amazon sued for not telling New York store customers about facial recognition | Amazon sued for nonconsensual facial recognition | New York City, USA | 16 March 2023 | Surveillance Exclusion | Identifying Physical Characteristics Interactions with AI Biometric Information | Amazon | Customers | Lack of Consent Loss of Privacy | Amazon did not alert its New York City customers that they were being monitored by facial recognition technology in its Amazon Go convenience stores. | |
Amazon to Pay $25 Million to Settle Children’s Privacy Charges | Amazon Settles for $25 Million Over Children's Privacy Violations | Washington | May 2023 | Surveillance Aggregation Breach of Confidentiality Disclosure Exposure | Identifying Computer Device Preference | Amazon Inc | Children | Minors | Financial Cost Change of Behavior | Amazon has agreed to pay a $25 million civil penalty to settle charges brought by the Federal Trade Commission (FTC) and the Justice Department. The case revolves around Amazon's handling of children's personal information collected through interactions with Alexa, violating COPPA. The tech giant is accused of keeping children's Alexa voice recordings indefinitely, using the data for business purposes, and failing to delete transcripts even after parents' deletion requests. The settlement requires Amazon to delete children's voice recordings and precise location data, as well as inactive Alexa accounts belonging to children, and prohibits misrepresentation of data handling. |
Amazon to pay over $30 million in FTC settlements over Ring, Alexa privacy violations | Ring and Alexa by Amazon in violation of FTC | United States of America | 31 May 2023 | Secondary Use Surveillance | Location Physical Characteristics Children's rights | Amazon Ring Alexa | Ring users and Children | Children and adolescents | Loss of Trust Change of feelings or perception Loss of Privacy | Amazon allowed Ring videos to be downloaded by Ukrainian employees without any security measures, and Amazon's Alexa retained voice data on children. |
Amazon's Alexa Asking Questions | Amazon's Smart Voice Assistant Alexa Able to Ask Questions | Global | July 2020 | Surveillance Interrogation Aggregation | Communication Preference Behavioral | Amazon | Alexa users | Alexa is now able to ask follow-up questions. | ||
Amazon’s Ring Security Camera | Amazon’s Ring Security Camera Was Found to Have Security Vulnerabilities | Global | 2020 | Secondary Use Insecurity | Identifying Contact Computer Device Physical Characteristics | Amazon | Users of Ring security cameras | The data of users of Amazon’s Ring security cameras was found to be insufficiently protected. Employees had access to customers’ videos, and hackers were able to hijack the cameras of multiple families. | ||
Amazon’s ‘Just Walk Out’ Tech Is a Privacy Nightmare | Amazon’s ‘Just Walk Out’ Tech Is a Privacy Nightmare | New York | 2023 | Secondary Use Interrogation Identification Exclusion Appropriation | Identifying Account Preference Behavioral Location Transactional History Credit Some personal info Physical Characteristics | Amazon Starbucks | Amazon and Starbucks Customers | Some consequences Loss of Trust Change of Behavior Change of feelings or perception | Amazon and Starbucks are teaming up for contactless purchases in their stores in exchange for customer data and personal information. | |
An App Meant to Stalk Instagram Profiles | An App Meant to Stalk Instagram Profiles | United States | 7 November 2023 | Intrusion Exposure | Social Network Some personal info Profile Information Blocking Status | Wrapped for Instagram | Instagram Users | Children and Vulnerable Individuals Influencers | Potential for identity theft Potential Misuse of Personal Data Reputational Damage | The article discusses the controversy surrounding the third-party app "Wrapped for Instagram," which claims to reveal user statistics but has raised concerns about potential privacy violations and data accuracy. |
Ancestry.com Genealogy Company Giving Access to Its Database to the Police | Ancestry.com Genealogy Company Has Given Access to Its Database to the Idaho Police | United States | 2015 | Disclosure | Identifying Medical and Health Family Physical Characteristics | Ancestry.com LLC Law Enforcement | People whose genetic data was on Ancestry.com | Suspect | Anxiety | Ancestry.com genealogy company gave access to its database to the police of Idaho, who were investigating a 1996 murder case. The person, whom the police suspected after a DNA search, turned out to be innocent. |
Android Listens to Everyday House Sounds | Google Updates Android Smartphones to Listen to Houses for Suspect Sounds | Global | October 2020 | Surveillance | Behavioral | Android users | Google rolled out an update for its Android system to listen to everyday household sounds in order to identify suspicious ones. | ||
Android Sharing User Physical Activity With Third Party Apps | Google Tracks User's Physical Activity | Global | October 2017 | Surveillance | Computer Device Behavioral Physical Characteristics | Android users | Children Elderly | Inconvenience | In the 2017 Android OS update Google introduced a feature that tracks users' physical activity with phone sensors. | |
Animal Agriculture Industry Tracks and Punishes Critics | Animal Agriculture Industry Groups Monitor Animal Rights Activists and Run Intimidation Campaigns Against Them | United States | October 2020 | Surveillance Exclusion | Identifying Professional Preference Behavioral Social Network | US Department of Agriculture Animal agriculture industry groups | Animal rights activists | Political Activists | Animal agriculture industry groups engage in campaigns of surveillance, reputation-destruction, and other forms of retaliation against industry critics and animal rights activists. | |
Anthem Data Breach | Health Insurance Company Anthem Was Hacked by Two Hackers from China | United States | 2014 | Interrogation Aggregation Insecurity | Identifying Contact Medical and Health Transactional Credit Demographic | Anthem Inc. Hackers from China | Customers of Anthem Employees of Anthem | Anthem, a large health insurance company in the United States, was hacked by two hackers from China. Nearly 80 million records of Anthem's customers and employees were stolen. | ||
App Companies and Radio Broadcaster SiriusXM Accused of Sharing Sensitive Data Without Consent | App Companies Share Sensitive Data without Notice and Consent | Texas | 5 December 2024 | Secondary Use Insecurity Disclosure | Location Vehicle Data Consumer Data | SiriusXM | App users | N/A | Lack of Consent Loss of Privacy Security Concerns | The Texas attorney general is ramping up enforcement of the recently enacted Texas Data Privacy and Security Act (TDPSA), which took effect on 1 July 2024. The alleged violators include radio broadcaster SiriusXM, weather app MyRadar, and travel rewards app Miles
App Spied on Belarusian Protesters | An App Mimicked a Popular Anti-Government News Site and Collected Data From Belarusian Protesters | Belarus | August 2020 | Interrogation Identification | Identifying Computer Device Location | Google Unidentified app creators | Belarusian protesters | Political Activists Freedom Fighter | In Belarus in 2020 an Android app mimicked a popular anti-government news site and collected location and device owner details to identify protest goers. | |
Apple Faces Lawsuit Over AirTag Stalking Dangers | Stalking with the AirTag | United States | 6 December 2022 | Intrusion Secondary Use Surveillance Aggregation Insecurity Disclosure Blackmail | Identifying Public Life Ownership Preference Location Knowledge and Belief Social Network | Airtag Stalkers | The two women who filed the lawsuit Owners of iOS or Android devices | Owners of iOS or Android devices | Embarrassment Anxiety Suicide Loss of Trust Change of Behavior Change of feelings or perception | Apple is facing a proposed class action lawsuit over its AirTag tracking devices, with two women alleging that the company failed to implement adequate safeguards against stalkers. The lawsuit claims that AirTags, priced at $29, have become a tool of choice for stalkers due to their affordability. Apple is accused of neglecting warnings from advocacy groups and news reports. The plaintiffs argue that Apple's safeguards are inadequate, especially for Android users, and seek unspecified damages for those tracked or at risk of being stalked. |
Apple Takes Legal Action Against UK Government to Protect Privacy | Apple Takes Legal Action Against UK's Privacy Demand | The United Kingdom | 4 March 2025 | Decisional Interference Surveillance | All Apple Users | Loss of Trust Reputational Damage Privacy Violations Potential Misuse of Data | ||||
Apple consumer privacy concerns | An aggressive approach towards data collection | United States | Disclosure Increased Accessibility | Behavioral Viewed ads App interaction Searches | Apple | iPhone users: specifically those who declined analytics data collection | Consumers using Apple devices and services | Loss of Privacy Potential Misuse of Personal Data | Apple is presently facing multiple class-action lawsuits because of privacy issues. There are claims that Apple has been using iPhone customers' data for analytics purposes without their permission. This issue came to light following Tommy Mysk's research, revealing that Apple's apps, like the App Store and Apple Music, transmit user data regardless of user preferences. The legal actions accuse Apple of breaching privacy and consumer fraud laws, highlighting a significant contradiction between Apple's privacy assurances in its marketing and its actual data practices. As the situation unfolds, Apple has not yet formally addressed these claims. | |
Apple's Sexist Credit Card | Apple's Credit Card Offers Different Credit Limits for Men and Women | Global | November 2019 | Exclusion | Identifying Account Transactional Credit Physical Characteristics | Apple | Apple card owners | Females | Financial Cost | The algorithms Apple's credit card uses to set credit limits might be inherently biased against women. |