Privacy in the Digital Age Midterm

Description and Tags

Dr. Bacsik Fall 2025 UF


60 Terms

1
New cards

Define — DDoS

Stands for “distributed denial of service”. A DDoS attack occurs when many compromised machines, often part of a botnet, simultaneously flood a target system with bogus traffic or requests, overwhelming a server, network link, or service endpoint so that legitimate users cannot access the service. Reveals how fragile networked services are and how attacks can be scaled cheaply. Can be used for censorship, extortion, or political disruption.
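The crowding-out effect can be illustrated with a toy simulation (a sketch, not a model of any real attack): a server that can answer a fixed number of requests per second serves almost no legitimate traffic once bogus requests dominate the queue.

```python
import random

def simulate_flood(capacity, legit_rps, bogus_rps, seconds=5):
    """Toy model: the server answers `capacity` requests per second,
    picked at random from everything that arrives. Returns the fraction
    of legitimate requests that get served."""
    served_legit = 0
    for _ in range(seconds):
        arrivals = ["legit"] * legit_rps + ["bogus"] * bogus_rps
        random.shuffle(arrivals)
        served_legit += arrivals[:capacity].count("legit")
    return served_legit / (legit_rps * seconds)

# Normal load: every legitimate request is served.
print(simulate_flood(capacity=100, legit_rps=50, bogus_rps=0))       # 1.0
# Under a flood, legitimate traffic is crowded out (fraction near 0.01).
print(simulate_flood(capacity=100, legit_rps=50, bogus_rps=10_000))
```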

2
New cards

Define — RFID

Radio Frequency Identification. Wireless technology that uses radio waves to read (and occasionally write) small identification tags attached to objects, people, and/or animals. Enables automatic identification and tracking without line-of-sight, unlike barcodes, which must be scanned directly. Useful for retail inventory and supply chains, but also shows why notice, consent, etc. need to be explicit to prevent covert tracking.

3
New cards

Define — DNS

Domain Name System. Hierarchical, distributed naming system that translates human-friendly domain names into IP addresses via resolvers and authoritative name servers. Foundational internet service, since it enables most user-facing applications like the web and email. Also a point of control and vulnerability, since caches, resolvers, and root servers are frequent targets of manipulation. It is an infrastructural choke-point: controlling name resolution amounts to controlling access and visibility on the internet.
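A minimal sketch of the translation step, using Python's standard library to ask the system's stub resolver (which in turn queries DNS for non-local names) for a name's addresses:

```python
import socket

def resolve(name):
    """Return the sorted set of IP addresses the resolver reports for `name`."""
    infos = socket.getaddrinfo(name, None)
    return sorted({info[4][0] for info in infos})

# "localhost" resolves locally to loopback addresses such as 127.0.0.1 and ::1.
print(resolve("localhost"))
```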

4
New cards

Define — SEME

SEME stands for the search engine manipulation effect: the hypothesis that systematically biased search rankings can shift users’ preferences and opinions without their conscious awareness. The effect is both technical and psychological. Technically, it concerns ranking algorithms and their opacity; psychologically, it concerns how users trust and internalize the implicit authority of search engines.

5
New cards

Define — Mirai Botnet

Notorious botnet that scanned the internet for poorly secured IoT devices, infected them with malware, and conscripted them into a distributed network used for DDoS attacks. Showed how low-cost IoT insecurity can enable outsized attacks, and how design choices and supply-chain practices have downstream harms for availability and civil infrastructure.

6
New cards

Define — Panopticon

Originally a prison architecture concept by Jeremy Bentham: a circular structure that allowed a single watchman to observe all inmates without their knowing whether they were being watched. Foucault used it as a metaphor for modern disciplinary societies, since the mere possibility of being observed internalizes control. Contemporary surveillance theory extends the metaphor to digital life via online platforms, sensors, algorithmic monitoring, etc.

7
New cards

Define — TCP/IP

Transmission Control Protocol/Internet Protocol. The fundamental communication protocol suite of the internet: IP is the set of rules governing how packets of data are addressed and routed across networks, while TCP provides reliable, ordered delivery of byte streams on top of it. Enables the end-to-end communication that supports the web, email, file transfer, and more. Embodies the end-to-end engineering philosophy, which places intelligence at the network edges while the core network remains simple. This design shapes where control can be exercised, where regulation can intercede, and how resilient or censorable a network is.
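The reliable byte-stream service can be demonstrated in a few lines of standard-library Python (a loopback sketch, not a network benchmark): a client sends bytes over a TCP connection and an echo server returns them intact and in order.

```python
import socket
import threading

def echo_server(sock):
    """Accept one connection and echo back whatever bytes arrive."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)  # TCP delivers these bytes reliably, in order

# Port 0 asks the OS for any free port on the loopback interface.
server = socket.create_server(("127.0.0.1", 0))
port = server.getsockname()[1]
t = threading.Thread(target=echo_server, args=(server,))
t.start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello over TCP")
    reply = client.recv(1024)
t.join()
server.close()
print(reply)  # b'hello over TCP'
```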

8
New cards

Define — Internet

Global system of interconnected computer networks that communicate using the Internet Protocol Suite (TCP/IP). It is both a technical stack and a socio-political system, including hardware, protocols, addressing, routing, commercial platforms, legal regimes, and cultural practices. Standards bodies, companies, and states all shape behaviour on it; it is an assemblage that mediates speech, commerce, surveillance, and civic life.

9
New cards

Define — Cyber-Physical Systems

Tightly integrated systems in which computational elements control and interact with physical processes. Includes industrial control systems, autonomous vehicles, smart grids, and IoT deployments. Combine sensors, networked communication, control algorithms, and actuators. Raise privacy and safety challenges, since data flows have real-world physical impacts; demand rigorous security engineering; and pose governance questions about liability and regulation.

10
New cards

Define — Constrained Network Node

A device in a network with limited resources (power, memory, compute, bandwidth), common in IoT and sensor deployments. These limits shape system design by constraining which security primitives, cryptographic schemes, and privacy-by-design strategies are feasible, complicating secure, privacy-preserving systems and forcing developers to adopt patterns that mitigate risk.

11
New cards

Define — NotPetya

Destructive cyberattack in 2017. It initially appeared to be ransomware but was actually designed to permanently destroy data on infected machines. Spread via lateral movement, exploitation of a software-update mechanism, and credential reuse. Caused massive damage to infrastructure, public services, and businesses.

12
New cards

Define — Predictive Policing

Uses algorithmic models trained on historical crime data to forecast where and when crimes are likely to occur, or which individuals are likely to be involved. Intended to optimize resource allocation for law enforcement. Issues include biased training data that produces feedback loops, resulting in over-policing of certain communities, and a lack of accountability, which raises ethical questions and doubts about efficacy.

13
New cards

Define — Stuxnet

Sophisticated computer worm discovered in 2010 that targeted industrial control systems to cause physical damage to Iranian nuclear centrifuges by subtly changing control commands even though it was reporting normal operation to monitoring systems.

14
New cards

Define — Sousveillance

Literally “undersight,” from the French sous (under); the inverse of the panopticon, related to Bentham’s constitutional panopticon. Posits that recording and monitoring those in power democratizes visibility and accountability, since subordinates observe the authorities.

15
New cards

Define — Net Neutrality

The principle that internet service providers should treat all lawful internet traffic equally, neither throttling nor preferentially prioritizing particular services. This preserves competition, free expression, and innovation: whoever controls traffic can act as a gatekeeper over access and speech.

16
New cards

Define — Surveillance Capitalism

Economic system in which companies collect, analyze, and commodify personal data to predict and modify human behavior for profit, via targeted advertising, personalization, and behavioural prediction. Explains why platforms gather behavioural traces and how privacy is gradually eroded by rationalizing narratives.

17
New cards

Define — Biometric Data

Measurable biological characteristics used to identify or authenticate individuals: facial features, fingerprints, voices, physiological markers, etc., which are intrinsic to bodies and often immutable. Raises acute privacy, consent, and fairness challenges; can be intrusive and error-prone for certain demographic groups.

18
New cards

Define — Haptic Technologies

Technologies that create or simulate the sense of touch through tactile feedback, used, for example, in surgical simulators and advanced prosthetics. They bridge virtual and physical sensation by producing touch signals the wearer perceives as realistic. Raise novel data and embodiment questions, such as inferring a user's emotional state or health conditions, and can feed predictive systems, prompting design and ethical considerations about consent, data minimization, and the interpretability of such sensed signals.

19
New cards

DeNardis — Number of Twitter accounts that are actually bots / effect of bots and fake accounts on social media metrics / their purposes (pg 14)

Between 9% and 15% of Twitter accounts are bots (automated systems catch 3.2 million suspicious accounts every week, though not all of them are bots). Bots can quickly repost users’ posts to help them gain traction, or generate content for people to consume. This distorts engagement metrics and amplifies certain messages in ways that create a false impression of reality. Some bots serve customer service or marketing; others spread misinformation/disinformation as part of influence operations.

20
New cards

DeNardis — TCP/IP

Transmission Control Protocol / Internet Protocol design decisions embody governance choices, such as where intelligence sits in a network, what kinds of censorship are possible, and how surveillance can be exerted. This helps explain how power is exercised via Internet architecture. Some connectivity features have become de facto standards in low-computational-resource environments.

21
New cards

DeNardis — Understand what she means by the "fourth industrial revolution" and what examples of it might be (pg. 34-40)

First described in Klaus Schwab’s 2016 book. It is characterized by “a fusion of technologies that is blurring the lines between the physical, digital, and biological spheres”: the rapid integration of digital computation, autonomous systems, and networking into the physical world. These information-processing capabilities change production, transportation, cities, healthcare, etc., and differ from previous industrial revolutions in their reliance on cyber-physical systems; examples include IoT deployments, autonomous vehicles, and smart grids. This matters because it moves regulatory attention from purely informational harms to safety, reliability, accountability, and rights in the physical world: the stakes are no longer just data privacy but bodily safety, physical infrastructure resilience, and national security.

22
New cards

DeNardis — Understand how transduction occurs with IoT devices--What does she mean by transduction?

Transduction is the conversion of physical states or events into digital signals, and vice versa: a sensor-to-data and data-to-action chain. For example, a thermostat converts physical temperature into an electronic signal that is processed in the cloud, which then sends a command to the HVAC system to change the physical heat.
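The thermostat example can be sketched as a tiny sensor-to-data-to-action loop (a hypothetical illustration; the function names, setpoint, and thresholds are invented for clarity):

```python
def read_sensor(physical_temp_c):
    """Transduce a physical temperature into a digital reading (a float)."""
    return round(physical_temp_c, 1)

def decide(reading_c, setpoint_c=21.0):
    """Cloud-side logic: turn the digital reading into an HVAC command
    that acts back on the physical world."""
    if reading_c < setpoint_c - 0.5:
        return "HEAT_ON"
    if reading_c > setpoint_c + 0.5:
        return "COOL_ON"
    return "IDLE"

print(decide(read_sensor(18.34)))  # HEAT_ON
print(decide(read_sensor(21.1)))   # IDLE
```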

23
New cards

DeNardis — What are the security and policy implications of IoT transduction?

IoT transduction creates a new attack surface: tampering with data or commands can cause physical harm (think of hacking a car). It is a point of vulnerability for manipulation, surveillance, and attack. Regulation therefore shouldn't focus just on content or speech but also on safety, update mechanisms, minimum standards, liability, and disclosure.

24
New cards

DeNardis — What is autonomous machine learning and what are its policy implications (pg 47-51) / What is a constrained network node?

Systems that automatically select models, tune hyperparameters, retrain, and deploy ML models with minimal human oversight, adapting in real time to sensor input; ML systems are gaining ever-greater autonomy, no longer fixed by what a human codes. Policy implications: opacity makes accountability difficult to assign; model drift and unexpected behaviour create safety-critical risk; machine-speed, machine-scale decision-making multiplies errors quickly; and governance must address autonomy designed into cyber-physical systems. A constrained network node is a device with limited resources (power, memory, compute, bandwidth), common in IoT deployments.

25
New cards

DeNardis — Tech surveillance issues including Samsung's "smart" TV (pg. 69)

Smart TVs include microphones and cameras whose networked components can capture conversations and habits. This extends surveillance into homes and blurs the line between a monitored environment and a private space. It requires regulation beyond notice/consent, such as privacy-protective default settings and limits so sensors aren't always on. Samsung responded with a software upgrade, which underscored the importance of upgradeability in consumer IoT devices.

26
New cards

DeNardis — Human implantable RFID chips history (pg. 71-72) Know what uses they were marketed by Three Square Market and the Danish company BiChiP

In 2017 the US company Three Square Market offered voluntary RFID implants to employees for door access, micro-market purchases, computer login, and other convenience functions; though voluntary, the project drew substantial privacy and security backlash. BiChiP, a Danish microchip-implant firm, marketed similar capabilities but stored small amounts of health or ID data, raising the potential for function creep and coercion. DeNardis frames implantable RFID as an example of how IoT expands surveillance into and onto bodies, creating governance problems where consent and data protection collide with bodily integrity and workplace power dynamics.

27
New cards

DeNardis — Know about the collection of biometric data and its potential for monitoring emotional states and health conditions, including Strava's heat map of exercise activity that can be used to measure what kind of other activity (pg. 73-74)

Biometric data can reveal physiological signals, be processed to infer emotional arousal or health conditions, and enable nonconsensual monitoring of mental and physical states. Combining biometric signals with identity data or contextual metadata allows intrusive monitoring. Strava's global heat map visualized billions of GPS tracks from fitness devices, revealing the locations and movement patterns of personnel at military bases. Biometric transduction can also be a useful cybersecurity and physical-access technique, though.

28
New cards

DeNardis — Understand China's social credit system and its parallels with and differences from Jeremy Bentham's Panopticon prison design (pg. 76-77).

The social credit system combines data about behaviour from multiple sources (payment history, speeding tickets, etc.) to enable rewards or punishments based on perceived trustworthiness. Like the panopticon, it relies on monitoring and incentives to shape behaviour; unlike the panopticon's single central watchman, it is distributed across many institutions and data sources, aggregates far more data, and differs in which legal/political institutions administer the consequences.

29
New cards

DeNardis — Also, understand how the effect of data aggregation in the private sector in the US can have many of the same effects as the Chinese social credit system. (pg. 80-81)

In the US, private-sector actors can aggregate location, financial, and behavioural data, use it to compute predictive scores, and automate decisions based on those scores, such as how much to charge you for car insurance. This permeates the material world to produce stratified access and social sorting that resembles social credit consequences, even though it is not state-run.

30
New cards

DeNardis — Know what the Fair Information Practice Principles are. (pg. 83)

The FIPPs are classic privacy guidelines, variously formulated in policy documents: notice/awareness, choice/consent, access/participation, integrity/security, and enforcement/oversight, along with transparency, individual participation, data minimization, use limitation, and accountability/auditing. They formed the basis for many privacy laws and best practices.

31
New cards

DeNardis — Also, know how and why the FIPPs represent a move from a focus on individual privacy & consent model to a multistakeholder data security model and what this means for regulatory control. (pg 83-92)

DeNardis thinks we should move to a multistakeholder security model that emphasizes forensic accountability, secure update regimes, minimum-security standards, and shared governance across vendors, standards bodies, civil society, and states. This rebalances privacy from an individual contract to an infrastructural responsibility. In terms of regulatory control, this means that regulators must move beyond only protecting “individual choice” and instead require systemic guarantees (e.g., secure design, logging, independent audits). This creates a governance field that includes technical standardization bodies, liability rules, certification regimes, and oversight.

32
New cards

DeNardis — DeNardis says it is essential for the Internet of Things to have both the technological possibility of forensic accountability and multistakeholder privacy--why? (pg. 86-92)

Forensic accountability is crucial to attribute harm, deter malicious behaviour, and respond to incidents. Multistakeholder privacy recognizes that privacy and safety can't be ensured by individual consent alone: shared governance is needed, with standards bodies setting technical requirements, civil society setting baseline norms for security and privacy, and more.

33
New cards

DeNardis — Understand what DeNardis says about the content-centric focus of regulations vs. what's needed in cyber-physical reality (pg. 180) Also understand how this represents a paradigm shift (pg. 214) and redefines what kinds of policy responses we need. (pg. 215)

The old, content-centric regulation model focuses on communications content. DeNardis argues policy should shift to the cyber-physical reality model, which accounts for how the internet is both a coordination and a control system. Rules need to address device safety, operational resilience, updateability, interoperability, and cross-sector dependencies. This is a paradigm shift because policy moves from protecting expression and privacy alone to ensuring physical safety, system accountability, and infrastructural governance.

34
New cards

DeNardis — Understand how the role of private companies (pg. 220-221) and Section 230 of the Communications Decency Act (pg. 222) give rise to some level of privatized governance (pg. 223)

Private companies are both a mechanism for restrictive information policies and a source of resistance against them. They design standards, control core infrastructure, and run the largest platforms, so they exercise de facto governance: their content-moderation choices have regulatory impacts. Section 230 of the Communications Decency Act gives platforms immunity for third-party content and for moderation/hosting decisions, strengthening private firms' discretion to set rules. DeNardis notes that this legal framework helps explain the rise of privatized governance: responsibilities and powers are concentrated in corporate platforms instead of public regulators, raising questions about democratic oversight, transparency, and accountability.

35
New cards

DeNardis — Know what happened and how in the "WannaCry" ransomware attack. (pg. 101-102) / Who is thought to have developed the malicious code used and who is thought to have used that code in an attack? (pg. 102) What does this event reveal about hoarding knowledge / "stockpiling" of software vulnerabilities? (pg. 97, 102) / stockpiling vulnerabilities (pg. 121-126)

WannaCry was ransomware that exploited an NSA-developed vulnerability in Microsoft Windows, encrypting files and demanding ransom payments in Bitcoin in exchange for a promised decryption key that would allegedly unlock the disabled systems, while spreading laterally across networks. A kill switch discovered by a researcher helped slow the spread, but hundreds of thousands of systems were still impacted worldwide. The underlying exploit is reported to have been developed by the NSA and later leaked; operational use of the code in the attack has been attributed to North Korea by multiple security firms and governments. The event reveals the stockpiling problem: governments sometimes withhold knowledge of software vulnerabilities to develop cyber weapons instead of reporting them to the product developer, and such weapons can then be turned against civilian infrastructure. DeNardis emphasizes that stockpiling vulnerabilities increases systemic risk.

36
New cards

DeNardis — Understand DeNardis's argument against encryption backdoors and the normalization of cyber risk. (pg 125-131)

Weakening encryption undermines global security: backdoors create systemic vulnerabilities that any adversary who discovers them could exploit. Strong cryptography must be the foundation of trustworthy cyber-physical systems. Normalizing cyber risk means accepting permanent, unresolvable vulnerabilities and weaknesses that increase collective exposure; if we treat breaches as "business as usual," the infrastructure grows more brittle over time.

37
New cards

DeNardis — Know what happens in spear-phishing attempts and know what happened in the "Grizzley Steppe" example (pg. 103, 106)

Spear-phishing is targeted phishing: messages crafted with personal details to trick specific individuals into revealing credentials or other information. It is a favored initial access vector for intrusions, since it exploits people's inherent trust. Grizzly Steppe is the name given to Russian malicious cyber activity, including spear-phishing campaigns and malware operations tied to Russian intelligence actors. The reports highlighted how targeted credential theft and phishing were used to penetrate political and infrastructure systems. DeNardis cites such operations to show how state-linked campaigns combine social engineering with infrastructure vulnerabilities.

38
New cards

DeNardis — Understand the security value in a lack of interoperability / IoT standardization and interoperability as both desirable and a source of increased risk / technology standardization protocols pro and con (pg. 22-24) / "security by design" vs. use of component parts, obsolescence, need for encrypted updates (pg. 111-115) and what factors hinder encrypted updates for IoT devices.

Lack of interoperability has a security benefit: fragmentation acts as a brake on the systemic propagation of attacks, since an exploit must be adapted to each incompatible ecosystem, limiting its reach. Interoperability and standards enable economies of scale, easier management, and broad innovation, but they also create common vulnerabilities: a flaw in a standardized protocol or widely used component can be weaponized across many devices. Standardization thus increases both utility and systemic risk. Security-by-design would build devices with secure defaults, strong authentication, encryption, and a plan for updates; but many manufacturers assemble off-the-shelf components, prioritize time-to-market, or omit long-term update obligations, and constrained nodes and business models (planned obsolescence) hinder encrypted updates. DeNardis stresses that the lack of secure update mechanisms, closed-source firmware, and supply-chain complexity impede post-sale security and accountability.

39
New cards

DeNardis — 3D printers = are they manufacturing or information sharing? What are the product liability, product safety, and legal limits on access challenged by this new capacity, what is the DoJ response and why? (pg. 185)

3D printing involves both information and manufacturing: files encode production instructions (information) that result in physical artifacts (manufacture). Legal questions arose over a student who wanted to distribute 3D-printable gun designs: does distributing a digital design count as "manufacturing," or is it protected speech? Authorities have been concerned about unregulated distribution of weapon designs and the safety risks; the DoJ and regulatory responses balance free expression against public safety. DeNardis flags this as part of a broader terrain where information governance intersects with product safety and enforcement.

40
New cards

DeNardis — GDPRs global effects and the institution of the WHOIS system (pg. 201-202)

Europe's GDPR changed norms for data collection and cross-border processing; its global impact came from services worldwide that process EU data having to comply, motivating regulatory harmonization and enforcement. WHOIS was a freely available, searchable global directory of domain registrant information; GDPR's privacy protections led to redaction and limited public access, upsetting longstanding Internet governance practices that relied on WHOIS for attribution and abuse mitigation, and prompting debates about balancing privacy with operational security and abuse handling. DeNardis discusses these tensions in the governance chapters.

41
New cards

DeNardis — The reasons for viewing the internet as primarily a communication system or as a cyber-physical system and the effect of one's view of the internet's purpose and uses of regulation. (pg 17)

The internet-as-communication-system model focuses regulation on content rules and on intermediaries for expression. The internet-as-cyber-physical-system model focuses on safety, resilience, and control, and on the physical consequences of networked systems in the real world. Whichever paradigm one adopts determines which policy instruments are prioritized.

42
New cards

DeNardis — Cyber vs. Internet language choices (pg. 190-192)

DeNardis notes that “cyber” often evokes military/security framing while “Internet” evokes communication and civil-society framing; word choice shapes policy priorities and public perception. Choosing “cyber” can push a defense/security posture; “Internet” can emphasize openness, rights, and communications governance. She argues precise language is politically consequential for governance design.

43
New cards
Know the four panopticon designs Bentham proposed as discussed in Bentham, Deleuze and Beyond.
He proposed four panopticon designs, all applying the principle of visible but unverifiable surveillance to different social institutions. The constitutional panopticon has citizens watch the governors; it is sometimes called the reversed or inverted panopticon, since the many watch the few. The chrestomathic panopticon covers schools, based on the scholar-teacher principle whereby more advanced students teach the less advanced. The prison panopticon is the original idea of one watchman observing many prisoners. The pauper panopticon is most similar in design to the prison version, but its inhabitants can come and go as they please. Many have noted how his architectural model became a metaphor for modern surveillance systems operating through constant observation and normalization of behaviour, though later thinkers like Deleuze saw surveillance as shifting from enclosed institutions to a networked control society.
44
New cards
Identify the problem with “essentialist thinking” as described in The right not to be subjected to AI profiling....
The authors argue that treating categories (race, gender, behaviour) as fixed, predictive, and natural is a core problem of algorithmic profiling. It reifies social categories as biological or social truths, legitimizing discrimination via AI systems even when they claim neutrality. Essentialism reduces an individual to a datafied identity that ignores social context and human variability, undermining equality and privacy. The authors therefore reject essentialist logic and push toward relational, contextual understandings of identity in algorithmic governance.
45
New cards
Know what kind of regulation Blake Murdoch calls for regarding health data in Privacy and artificial intelligence — challenges for protecting health information in a new era.
He calls for a strong, cross-sectoral regulatory framework for governing AI in health contexts. He emphasizes comprehensive, enforceable legal standards beyond the traditional consent-based privacy model, to move past current, insufficient laws that fail to address the scale and opacity of AI data processing. He wants a proactive, rights-based regime that anticipates harm instead of reacting to privacy breaches. Examples: independent regulatory oversight bodies, clear liability rules for misuse of data, and mandatory algorithmic transparency and auditing.
46
New cards
Identify the two models for the creation of artificial intelligence described in Situating methods in the magic of big data and AI
The symbolic/rule-based model builds AI through explicit programming and logical rules, assuming reasoning can be formalized in code by humans. The machine-learning/statistical model builds AI by training algorithms on large datasets so they learn patterns and make predictions without explicit human-given instructions. The two reflect different epistemological traditions: one is rooted in human-defined logic, the other in data-driven pattern discovery.
47
New cards
Know the study results for the SEME effect on undecided voters according to the Epstein and Robertson report.
SEME stands for the search engine manipulation effect. Epstein and Robertson's research studies how biased search rankings influence voter preferences, finding the effect particularly strong among undecided voters: exposure to biased search results shifted candidate preferences by 20% or more, and in some demographic groups shifts reached up to 80%. They reported that over three-quarters of participants weren't even aware of the bias, making its influence essentially invisible.
48
New cards
Understand the data used in predictive policing and its history according to the review paper by Meijer and Wessels.
They explain how these systems emerged from data-driven policing in the 1990s. Predictive policing systems draw on geospatial information, historical crime data, and sociodemographic indicators. These produce biases that embed systemic inequality, turning police discretion into algorithmic authority that unfairly impacts certain groups.
49
New cards
Exposing Pegasus — Know who Latifa Al Maktoum is and how Pegasus was used against her (part 2 around 5 to 16 minutes)
The daughter of Mohammed bin Rashid Al Maktoum (ruler of Dubai), who attempted to flee the UAE but was intercepted in the Indian Ocean and forcibly returned. According to the documentary, Pegasus spyware was used to target her phone and communications.
50
New cards
Exposing Pegasus — Know who Cecilio Pineda is and how/why Pegasus was used against him (part 2 around 25 minutes)
A freelance Mexican journalist who publicly denounced corruption and organized-crime links in Guerrero. He was assassinated in March 2017; before then, he appeared on the leaked list of potential Pegasus surveillance targets selected by NSO's Mexican client, apparently because he investigated corruption. This shows the tool being used beyond its declared counter-terrorism purpose.
51
New cards
Exposing Pegasus — Understand the reason Pegasus was called the "diplomatic currency" of Israel (part 2 around 22 to 23 minutes in)
Spyware served as a precious diplomatic tool: by supplying Pegasus to allied governments, Israel gained leverage and relationships, for example in its Gulf diplomacy and efforts to normalize relations, turning surveillance into a strategic export.
52
New cards
Exposing Pegasus — Know what effects the news reports synchronously published after one year of research by many journalists had on NSO (near the end)
The joint investigation involved activists, politicians, and journalists. For NSO, the effects were reputational damage, global scrutiny, and government action. The synchronous publication of the investigative reports amplified global awareness and pressure, forcing accountability and regulatory responses for NSO and the wider spyware industry.
53
New cards
In the Age of AI — Why does Kai-Fu Lee, the CEO of Sinovation Ventures, call China the Saudi Arabia of data? (early in the show)
He argued that data is the key resource for AI in the way that oil was for early industrial powers. China has vast amounts of data, which serve as the "fuel" for 21st-century AI algorithms; this data volume gives China a strategic advantage.
54
New cards

In the Age of AI — What does Kai-Fu Lee say are the three technological revolutions that fundamentally alter economic structures of society? (at approx. 45 minutes)

The Industrial Revolution, which fundamentally changed how we produce goods and how much we can produce. The information revolution: digital networks connecting people and machines globally, driven by the internet. The artificial intelligence / data-driven algorithmic revolution, in which "data + algorithms" become the new production engine, disrupting jobs, industries, and economic models.

55
New cards
In the Age of AI — Who said, "Karl Marx was right" (minute 1.02 through 1.05) and why does he say this? Why might it be surprising that he said it?

Jerry Kaplan, a libertarian, who argued that automation is the substitution of capital for labor, and that people with capital win. Marx described the struggle between capital and labor, and AI strengthens capital at labor's expense, which is why Kaplan says Marx was right. This is surprising because, as a libertarian, Kaplan would not be expected to endorse a Marxist analysis.

56
New cards
In the Age of AI — What rights are included in the California online privacy legislation that real estate developer Alastair Mactaggert proposed and promoted with $4 million of his own money? (1.08 through 1.26)
The right to control what personal data companies collect about you. The right to delete personal data held by companies. The right to opt out of the sale or sharing of personal data for profiling or advertising. The right to know what personal information is collected and how it’s used. California Privacy Rights Act (CPRA) and California Consumer Privacy Act (CCPA) include the ability to correct, delete, transfer personal information; limit use of “sensitive personal information”; opt out of sale/sharing; non-discrimination for exercising rights.
57
New cards
Facebook Dilemma — Understand the nature of the problem with the algorithm used to drive engagement metrics.
The algorithm was optimized for growth and engagement: keep users on the platform, clicking, reacting, and sharing. That meant it prioritized content provoking strong reactions (often emotional, sensational, or divisive) because that drove longer usage and more ad revenue. The problem is that the algorithm doesn't distribute content neutrally: it amplifies and selects whatever maximizes engagement, surfacing polarizing, misleading, or harmful content, because its objective is engagement, not truth or user well-being. More broadly, research shows that ad delivery and algorithmic optimization on platforms like Facebook can produce skewed outcomes.
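A toy sketch of the objective-mismatch problem. The posts, titles, and weights below are invented for illustration and are not Facebook's actual ranking formula; the point is only that a scoring function rewarding reaction volume will rank provocative content above accurate content:

```python
def engagement_score(post):
    """Toy objective: weight reaction types that predict continued use.
    The weights are illustrative assumptions, not a real platform's values."""
    return 1.0 * post["likes"] + 2.0 * post["shares"] + 1.5 * post["comments"]

posts = [
    {"title": "Local news recap",   "likes": 120, "shares": 5,  "comments": 10},
    {"title": "Outrage-bait rumor", "likes": 80,  "shares": 90, "comments": 150},
]

# Rank the feed by the engagement objective alone.
feed = sorted(posts, key=engagement_score, reverse=True)
# The divisive post ranks first: the objective measures reaction volume,
# with no term for accuracy or user well-being.
```

Nothing in the scoring function knows or cares whether a post is true; that omission, not any single ranking decision, is the structural problem the documentary describes.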
58
New cards
Facebook Dilemma — Understand the legal framework that protects internet platforms from liability.
The key legal framework is Section 230 of the Communications Decency Act (1996) in the U.S. Under Section 230(c)(1) a provider of an interactive computer service (such as a social media platform) is not treated as the “publisher or speaker” of content provided by another information content provider. That means platforms generally have immunity from liability for content posted by their users (third parties). In addition, Section 230(c)(2) provides “Good Samaritan” protection — platforms can remove or restrict access to content they deem objectionable, without being liable for doing so.
59
New cards

Define — GDPR

The General Data Protection Regulation is the European Union's (EU) data privacy and security law that sets strict rules for how organizations handle the personal data of EU residents. It gives individuals more control over their personal information by establishing rights like access, erasure ("right to be forgotten"), and data portability, and requires companies to obtain consent and be transparent about data collection. The regulation applies to any organization, regardless of location, that processes the personal data of individuals in the EU.

60
New cards

Define — WHOIS

An internet protocol and database that provides publicly accessible information about a domain name or IP address registration. It answers the question "Who is responsible for this domain?" by listing the registrant's contact information, registrar, registration and expiration dates, and name servers.
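A minimal sketch of how a WHOIS lookup works at the protocol level (per RFC 3912: open a TCP connection to port 43, send the query followed by CRLF, read back plain text). The parsing helper and the sample reply below are simplified assumptions; real WHOIS output varies widely by registry:

```python
import socket

def whois_query(domain, server="whois.iana.org", timeout=10):
    """Send a WHOIS query: TCP to port 43, query + CRLF, read the text reply."""
    with socket.create_connection((server, 43), timeout=timeout) as sock:
        sock.sendall(domain.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # server closes the connection when done
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

def parse_whois(text):
    """Collect 'key: value' lines from a WHOIS reply into a dict,
    skipping '%'-prefixed comment lines (a simplifying assumption)."""
    record = {}
    for line in text.splitlines():
        if ":" in line and not line.startswith("%"):
            key, _, value = line.partition(":")
            record[key.strip().lower()] = value.strip()
    return record

# Example (requires network access):
#   raw = whois_query("example.com")
#   info = parse_whois(raw)
```

The "refer:" field in an IANA reply points to the registry's own WHOIS server, so full lookups typically follow that referral with a second query.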