Privacy Quotes
“Privacy…is too complex a concept to be reduced to a singular essence. It is a plurality of different things that do not share any one element but nevertheless bear a resemblance to one another. For example, privacy can be invaded by the disclosure of your deepest secrets. It might also be invaded if you're watched by a peeping Tom, even if no secrets are ever revealed. With the disclosure of secrets, the harm is that your concealed information is spread to others. With the peeping Tom, the harm is that you're being watched. You'd probably find that creepy regardless of whether the peeper finds out anything sensitive or discloses any information to others. There are many other forms of invasion of privacy, such as blackmail and the improper use of your personal data. Your privacy can also be invaded if the government compiles an extensive dossier about you.”
“The right to be let alone” (Louis Brandeis, 1890: Harvard Law Review)
“The desire of people to choose freely under what circumstances and to what extent they will expose themselves, their attitude, and their behaviour to others” (Alan Westin, 1967: Privacy and Freedom)
“The degree to which human information is neither known nor used” (Neil Richards, 2021: Why Privacy Matters)
Ethical and Philosophical Dimensions of Privacy
deontological perspective
utilitarian perspective
virtue ethics
feminist perspectives
Deontological Perspective
privacy as a fundamental right
not to be infringed upon, regardless of potential outcomes
Utilitarian Perspective
balancing individual privacy vs social benefits
e.g. public health initiatives using aggregated data to prevent disease outbreaks
Virtue Ethics
privacy as part of human flourishing and dignity
protecting privacy shows respect for individual dignity
Feminist Perspectives
power imbalances in data collection and usage
privacy as a tool to protect vulnerable communities
Privacy Paradox
Definition:
“When people disclose personal info in ways that are inconsistent with the high value they claim to place on privacy”
Potential explanations
Rational ignorance: “Too long; didn’t read” (TL;DR) approach to privacy policies
Transparency paradox: Overload of complicated details → People tune out
Control paradox: We like having control, but we rarely exercise it
Disincentivized to protect privacy: “Trade convenience for data”
Question: “Are we actually making a choice, or is it an illusion of choice?”
General Data Protection Regulation (GDPR)
Introduced in 2016
Enforced in 2018
Applies to organizations handling data of EU citizens, regardless of location
PERFORMANCE:
Over €1.5 billion in fines since inception
Criticisms: High compliance costs, unclear guidelines for SMEs
Why GDPR
Address inconsistencies in data protection laws across the EU
Strengthen individuals’ control over their personal data
Respond to high-profile data breaches and growing public concern
What GDPR Does
Establishes principles like data minimization and purpose limitation
Grants rights: access, erasure, portability
Requires consent and transparency
Arguments for Success/Failure of GDPR
GDPR’s effectiveness is subjective, depending on whether we focus on awareness or outcomes. It has been called a 'successful failure': effective in raising awareness but challenging in execution
GDPR Real Cases
Major fines on Big Tech (e.g., Google, Meta) for privacy breaches
Meta fined €265m by Irish Data Protection Commission*
Non-compliance: failure to obtain proper user consent or meet GDPR obligations
Google hit with £44m GDPR fine over ads**
SMEs (small & medium enterprises) also face challenges (unclear guidelines)
Austrian website’s use of Google Analytics found to breach GDPR†
Growing cross-border investigations by EU regulators
British Airways fined £20m over data breach††
The EU Artificial Intelligence Act (EU AI Act)
Proposed in April 2021
Expected enforcement by 2025
World’s first comprehensive AI regulation
Similar technologies might fall into different categories, depending on their use
Some feel it’s still too broad or vague, especially as AI evolves rapidly
PERFORMANCE:
Still under refinement, expected to set global standards
Criticisms: Potential to stifle innovation, unclear scope for SMEs
Why EU AI Act
Address risks of unregulated AI applications
Promote trustworthy AI aligned with ethical principles
What EU AI Act Does
Classifies AI systems by risk (unacceptable, high, limited, minimal)
High-risk AI: strict compliance (e.g., medical devices, biometric ID)
Encourages transparency and accountability
Regulations in Other Countries
US:
No unified federal privacy law
Sectoral approach: HIPAA (healthcare), COPPA (children), etc
State-level regulations:
California Consumer Privacy Act (CCPA): Closest to GDPR
Virginia Consumer Data Protection Act (VCDPA)
UK:
GDPR adopted as UK GDPR post-Brexit
Data Protection Act 2018 complements GDPR principles
Focus on balancing data-driven innovation with privacy
China:
Personal Information Protection Law (PIPL): Comparable to GDPR
Cybersecurity Law and Data Security Law
Privacy in AI and Robotics: Why it Matters
Unique challenges posed by autonomous systems:
Continuous data collection via sensors
Need for real-time decision-making
Examples of data types collected:
Visual (cameras in robots)
Behavioural (interaction data)
Biometric (facial recognition, voice)
Purpose of Data Collection:
Navigation and environment mapping (e.g., autonomous vehicles)
Human-robot interaction (e.g., social robots understanding emotions)
Customization of user experience
How Robots Collect and Use Data
sensors and data sources
cameras, LiDAR, microphones, wearables etc
Primary Risks of Robots
Unauthorized access or data breaches
Lack of transparency in AI algorithms (black-box problem)
Bias in AI leading to unfair outcomes
Ethical concerns in surveillance applications
Privacy-Preserving Technologies (Against Robots)
Federated Learning: Training AI models locally to avoid raw data transfer
Differential Privacy: Adding noise to datasets to anonymize individual data
Encryption: Ensuring secure data transmission and storage
Federated Learning
A decentralized machine learning approach that trains models directly on users’ devices without transferring raw data to a central server (a minimal code sketch follows the example below)
How it works:
Devices process data locally to improve the model
Only aggregated updates (e.g., parameter changes) are sent to a central server for model improvement, ensuring raw data stays on the device
Why it’s important:
Protects sensitive data by eliminating the need for centralized data storage
Reduces the risk of data breaches
Example:
Google’s Federated Learning on Android Devices
Used to improve predictive text suggestions without uploading user-specific typing data
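A minimal sketch of the local-train / aggregate-updates flow described above, written in Python with NumPy around a toy linear model. The simulated clients, data sizes, and learning rate are illustrative assumptions, not part of the lecture material; real deployments such as Google’s add secure aggregation and many other safeguards.

import numpy as np

def local_update(global_weights, local_X, local_y, lr=0.1, epochs=5):
    """One device's step: fit a linear model on its own data and return
    only the updated weights; the raw (local_X, local_y) never leave."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = local_X.T @ (local_X @ w - local_y) / len(local_y)
        w = w - lr * grad
    return w

def federated_average(global_weights, clients):
    """Server step: average the clients' weight updates, weighted by how
    much data each client holds (the federated averaging idea)."""
    updates = [local_update(global_weights, X, y) for X, y in clients]
    sizes = [len(y) for _, y in clients]
    return np.average(updates, axis=0, weights=sizes)

# Toy run: two simulated devices, each holding private data the server never sees.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(2)]
weights = np.zeros(3)
for _ in range(10):
    weights = federated_average(weights, clients)
print("Global weights after 10 rounds:", weights)

Only the averaged parameter vector crosses the network; the per-device arrays stay local, which is exactly the property the bullets above emphasize.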
Differential Privacy
Protecting individual data points by introducing statistical noise, so that outputs cannot reliably be traced back to any specific individual (a minimal code sketch follows the example below)
How it works:
Adds randomness to queries or computations on the dataset while preserving aggregate trends
Ensures that outputs provide useful insights without compromising individual privacy
Why it’s important:
Balances the need for data utility (e.g., for AI training) with robust privacy safeguards
Makes data breaches less damaging, as individual identities remain hidden
Example:
Apple’s Use of Differential Privacy
Apple employs this method to collect usage statistics (e.g., emoji usage, search patterns) without compromising user privacy
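A rough illustration of the noisy-query idea described above, using the Laplace mechanism in Python with NumPy. The dataset, the epsilon value, and the share_location field are made-up assumptions for illustration; Apple’s actual deployment uses different, locally applied mechanisms.

import numpy as np

def laplace_count(records, predicate, epsilon=1.0, rng=None):
    """Release a count with (epsilon)-differential privacy.

    A counting query has sensitivity 1 (one person changes the count by at
    most 1), so Laplace noise with scale 1/epsilon masks any individual's
    contribution while preserving the aggregate trend."""
    rng = rng or np.random.default_rng()
    true_count = sum(1 for r in records if predicate(r))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Toy query: how many users enabled a (hypothetical) share_location setting.
users = [{"share_location": True}, {"share_location": False}, {"share_location": True}]
print(laplace_count(users, lambda u: u["share_location"], epsilon=0.5))

Smaller epsilon means more noise and stronger privacy; larger epsilon gives more accurate but less private answers.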
Encryption
A process of converting data into an unreadable format (ciphertext) that can only be accessed with a decryption key
How it works:
During transmission: Data is encrypted before being sent and decrypted upon arrival
During storage: Data is kept encrypted to prevent unauthorized access
Why it’s important:
Prevents eavesdropping, tampering, and unauthorized access during data transfer and at rest
Essential for protecting sensitive information in robotics and AI systems, such as personal data collected by robots
Example:
End-to-end encrypted messaging apps (e.g., Signal, WhatsApp)
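A minimal sketch of symmetric encryption at rest and in transit, assuming the third-party Python cryptography package (its Fernet recipe). Key handling is deliberately simplified here; real systems keep keys in a dedicated key store, and end-to-end messaging apps like Signal use more elaborate protocols.

from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()   # in practice, held in a key store, never hard-coded
cipher = Fernet(key)

# "During transmission / during storage": only ciphertext is sent or written to disk.
plaintext = b"robot sensor log: user entered the kitchen at 09:14"  # hypothetical payload
ciphertext = cipher.encrypt(plaintext)

# Only a holder of the key can recover the original data.
assert cipher.decrypt(ciphertext) == plaintext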
New Frontiers of Privacy Concerns
Generative AI & Deepfakes: Identity theft and spread of misinformation
Biometric Data: Privacy challenges from face, voice, and gait recognition technologies
Neurotechnology: Brain-computer interfaces (BCI) raise questions about mental privacy
Consent Fatigue: Endless pop-ups result in users ignoring privacy agreements
Rapid Tech Evolution: Laws struggle to keep up with advancements
‘Nothing to Hide’ Argument
Common Claim: “If you've got nothing to hide, you've got nothing to fear”
Counterpoints (Solove’s Approach):
Aggregation: Harmless data points + harmless data points = revealing conclusion
Distortion: Data taken out of context may mislead or cause harm
Exploitation: Collected data can be weaponized for manipulation
Is Privacy Dead?
Solove on the complexity of privacy: not a single essence but many overlapping concerns; it’s about more than secrecy
Privacy as control: Over personal data
Privacy as autonomy: Freedom from constant scrutiny
Privacy as dignity: The ability to choose how you present yourself to the world
Hopeful Trust
Key Idea: People trust systems even when privacy is violated
Information disclosure can be a pragmatic response to limited privacy protections
Justifications:
“Surely if it were really bad, someone would step in!”
“The company has so many users—it must be safe”
Insight: This “hopeful trust” reveals people want a world where these services are worthy of trust—hence privacy isn’t “dead in our hearts”
User-Centric Privacy Tips
Use strong, unique passwords; enable multi-factor authentication
Regularly review app permissions and privacy settings
Limit oversharing: think before posting personal details online
Be cautious with public Wi-Fi or unencrypted websites (HTTP) → use HTTPS
Consider privacy-focused tools (VPNs, secure messaging apps)
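As a small practical aid for the first tip above, a sketch using Python’s standard secrets module to generate a strong, unique password; the length and character set are arbitrary choices, and a password manager can do the same job for you.

import secrets
import string

def generate_password(length=20):
    """Cryptographically random password; use a different one per account."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())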
Key Takeaways
Privacy is multifaceted and context-dependent
New tech (IoT, robotics, AI) magnifies old privacy dilemmas while creating new ones
Existing regulations (like GDPR) are necessary but not always sufficient
Everyone—users, policymakers, developers—plays a role in shaping the future of privacy