Legal Aspects

Last updated 5:28 PM on 4/3/26
86 Terms

1
New cards

What is private law?

Private law deals with relationships between parties at the same level, both pursuing private interests — neither holds authoritative power over the other.

Authoritative powers are only granted in cases of public interest. One party may still have contractual power over the other.

Public law, by contrast, involves relationships where one party holds authoritative power over the other, acting in the public interest.

2
New cards

What are legal persons?

They are constructs of the law; a company, for example, can be one.

Both natural persons (flesh-and-blood people) and legal persons (constructs like companies) can be on either side of a legal relation.

3
New cards

What are active and passive sides?

The active side is where a person holds a legal right — either

  • absolute (held against the whole world, e.g. bodily autonomy, property, ownership) or

  • relative (held against one specific party, e.g. a debt from a contract).

The passive side bears a corresponding

  • duty (in respect of absolute rights) or

  • obligation (in respect of relative rights).

Legal relations are always social relations between people. This means that the law doesn't regulate things in isolation — it regulates how people relate to each other through things.

4
New cards

What are the functions of the law?

  • Promoting reliance/trust creation: law creates a stable environment so parties can transact with confidence in a recognised legal framework. This is particularly important for the market.

  • Filling in the gaps: if parties don't specify every detail, default rules apply. For example, if you don't write a will, the law applies automatically.

  • Limiting private autonomy: mandatory and compulsory rules restrict what parties can legally do, often by stripping invalid agreements of any legal effect rather than applying criminal sanctions. Private and criminal law can go hand in hand, but you do not go to jail for a private law violation alone.

5
New cards

What is private autonomy?

It is the first main principle of private law.

It means parties can rule their legal relations as they wish. Within the market (economic transactions), these arrangements are binding. Outside the market — for example, an agreement to sell a kidney — they are not binding and have no legal effect.

Includes contractual freedom, marital freedom, and testamentary freedom.

6
New cards

What are the limits on private autonomy?

  • The law may not forbid an act but require it to be done in a specific legal form (e.g. selling a house must be done in writing, not orally);

  • The law may strip certain acts of all legal effect, so they simply cannot be enforced (e.g. you cannot sue a buyer who backs out of an illegal kidney sale).

7
New cards

What is liability?

A second main principle of private law.

Legal effects can be imposed on parties irrespective of their will — if you hit a pedestrian, you must pay damages. Ownership creates liability, and having children creates obligations toward them.

8
New cards

What are the main principles of private law?

  • Private autonomy is the starting point — parties are free to shape their own legal relations as they wish. The law supports this by promoting reliance (making agreements enforceable), filling in gaps (providing default rules where parties haven't decided), and setting limits (mandatory rules that cannot be contracted around).

  • Liability is what kicks in where autonomy ends. When legal effects arise not from a choice but from a situation — causing harm, owning something, having dependants — the law imposes obligations regardless of will.

9
New cards

Civil law vs. common law

  • Civil law (Italy, France, Germany, Spain): the law comes from the state; judges are only the 'mouth of the law' — they apply it neutrally, without democratic legitimacy to create it. Only parliament can say what the law is.

  • Common law (UK, USA, Canada, Australia): the law comes from the judge — judges find the law, and precedents are formally binding on future courts.

10
New cards

What is the hierarchy of sources of law?

  1. Constitution (top — requires broad consensus to amend)

  2. Primary sources enacted by Parliament, limited by the Constitution (government decrees require parliamentary approval)

  3. Secondary sources / regulations enacted by government and public administration

  4. Customs and usages (bottom-up, not compulsory, very limited space)

11
New cards

What is the judicial hierarchy in Italy?

  1. Tribunal (first instance)

  2. Court of Appeal (facts established here)

  3. Corte di Cassazione / Supreme Court (only the interpretation of law is discussed here, not the facts).

  4. The Constitutional Court checks whether laws respect the constitutional hierarchy.

  5. The EU Court of Justice checks compatibility with EU law.

12
New cards

What are the sources of law for the EU?

  1. Primary EU sources: international treaties (e.g. the Lisbon Treaty) give the EU its legitimacy and power to legislate.

  2. Secondary EU sources — enacted under that treaty authority — sit between the Constitution and national laws in the hierarchy.

    1. Directives: general principles that each member state must implement in its own way; not directly applicable.

    2. Regulations: directly applicable in all member states with no intermediate national step required.

The EU can only legislate in fields where member states have given up their autonomy — typically related to the creation and functioning of the common market.

13
New cards

What approaches can be taken to regulate new technologies?

  • Precautionary principle: if a new technology may cause irreversible harm before we fully understand it, we should hold off. This principle asserts that protective action should be taken against uncertain threats rather than waiting for conclusive proof. It allows early intervention to channel technology in the right direction.

  • Allow and assess: let markets develop the technology until we have enough data to regulate. However, ethical stances can override this (e.g. human cloning is possible but moral concerns lead us to say no).

Because technology changes rapidly, laws that refer to specific technologies quickly become outdated. The solution is technologically neutral laws: defined by the output or function of technology, not its technical details. These remain applicable even as the technology itself evolves.

14
New cards

What’s the difference between privacy and data protection?

  1. Privacy: an individual's concern about their exposure to the public — including decisional privacy (e.g. abortion, changing sex).

  2. Data protection: a broader legal framework governing how personal data is collected and processed by others.

15
New cards

How did data protection begin?

First data protection laws appeared in West Germany in the 1970s as a political message against public surveillance (implicitly directed at East Germany).

The first coordinated international effort was the 1981 Council of Europe Convention for the protection of individuals with regard to automatic processing of personal data (Convention 108) — soft regulation aimed at limiting non-consensual processing and giving data subjects rights of access.

The 1995 EU Data Protection Directive was replaced by the GDPR in 2016, which remains the cornerstone of personal data protection.

16
New cards

What actors does the GDPR apply to?

  1. anyone established in the EU who collects data (from anywhere);

  2. anyone offering services to EU consumers and collecting data in the process;

  3. anyone monitoring the behaviour of people in the EU.

17
New cards

What is the Brussels effect?

When the EU adopts strict rules — on data protection, food safety, product standards, AI — any company that wants to operate in the EU market must comply with them. For large multinationals, it is often simply more efficient to apply the EU standard everywhere rather than maintain separate systems for different regions. The EU standard effectively becomes the global default.

The GDPR is the clearest example. When it came into force in 2018, companies around the world updated their privacy policies, added consent mechanisms, and restructured how they handle personal data — not because their own governments required it, but because they were operating in the EU market and it was easier to apply one consistent standard globally than to run parallel systems.

18
New cards

What are the key actors in data protection?

  • The data subject is the person the information refers to.

  • The controller decides to process data and determines the purposes and means — they bear more responsibility.

  • The processor processes data on behalf of the controller.

The rules are general; it is up to the controller to decide how to implement them given the specificities of their situation.

Examples: a company using Google Drive is the controller; Google is the processor. If Alexa records ambient audio, Amazon is the controller (the device was purchased for another purpose, but Amazon determines how the data is used).

19
New cards

What are the core principles of data protection?

  1. Lawfulness, fairness, and transparency: controllers must be clear about what data is collected, why, what rights subjects have, and how to exercise them. Transparency is about both the content of information and the ease of accessing it — there must be a balance between the amount of information given and people's real attention spans (the 'privacy paradox': people say they care about privacy but don't read terms and conditions).

  2. Purpose limitation: data collected for one purpose cannot freely be repurposed. This can be restrictive — e.g. accessing health data for medical research can be challenging. It is meant to prevent 'function creep': the gradual widening of a system's use beyond its original purpose, leading to privacy invasion.

  3. Data minimisation: collect only what is necessary for the stated purpose; retain it only as long as needed.

  4. Accuracy: personal data must be kept up to date and relevant.

  5. Integrity and confidentiality: data must be processed securely; only authorised personnel should access it.

  6. Privacy by design: data protection measures must be built into processing from the very start, not added as an afterthought.

  7. Privacy by default: the most privacy-friendly settings must be the default (e.g. social media profiles should not be publicly accessible to an indefinite number of people from the start).

20
New cards

What are the legitimate bases for processing data?

1. Consent of the data subject.

2. Performance of a contract.

3. Compliance with a legal obligation.

4. Protection of vital interests.

5. Performance of a task in the public interest.

6. Legitimate interests of the controller (subject to balancing against the data subject's rights).

21
New cards

What are the requirements for valid consent? (FUCSAW)

  1. Freely given: no undue influence, nudging, or maze effects (being constantly redirected to new pages).

  2. Specific and granular: consent must relate to a defined purpose; it cannot be broad.

  3. Unambiguous: a clear affirmative action (e.g. clicking a box) is required.

  4. Age of consent: the GDPR default is 16; member states may lower it, but not below 13. Italy chose 14 as the minimum age.

  5. Conditionality issue: the service should work equally well without profiling. People must have a genuine alternative.

  6. The right to withdraw consent must be as easy as the act of giving it.

22
New cards

What are the rights of the data subject? (TARODEF)

  1. Transparency rights: right to be informed about what is happening with your data, including which rights you have and how to exercise them.

  2. Right of access: you can request from the controller a file containing all your personal data.

  3. Right to rectification: data must be corrected if inaccurate.

  4. Right to erasure (right to be forgotten): can be exercised if processing is unlawful, the purpose has been fulfilled, or consent is withdrawn.

  5. Right to object: where processing is based on public interest or legitimate interests, you can object; for direct marketing, you can object at any time without giving reasons.

  6. Right to data portability: receive your data in a structured, machine-readable, commonly used format, and have it transmitted directly to another controller of your choosing. This prevents lock-in and makes it easier to switch services.

  7. Right not to be subject to fully automated individual decision-making: no decisions with legal or significant effects may be based solely on automated processing, unless required or authorised by law, based on explicit consent, or necessary due to the scale of processing (with appropriate safeguards). If a human is involved, their role must be meaningful, not token. The data subject also has the right to contest such decisions and to receive an explanation of the algorithm — the 'explainability' issue.

23
New cards

What are remedies for data protection?

  • Public law remedies — supervisory authorities: independent administrative bodies (outside the normal hierarchy of public administration) with investigative powers, the ability to act without a complaint, advisory and authorisation roles, and the power to issue sanctions. Representatives from each national authority sit on the European Data Protection Board, which issues non-binding but influential guidelines.

  • Private law remedies — damages: any person who suffers material or non-material damage has the right to claim compensation. The controller is always liable; the processor may also be liable (joint liability). The controller must prove either that they did not violate the GDPR, or that the violation was not caused by them.

24
New cards

What is the scope of the AI Act?

Applies broadly: if an AI system is used in the EU (regardless of provider location), or if deployers are established in the EU, or if output reaches the EU.

Exceptions: household use, scientific R&D, pre-deployment internal testing, and exclusively military/defence applications.

It uses a functional definition of AI rather than a technical one, to remain durable: 'a machine-based system that can have different levels of autonomy and that may exhibit adaptiveness after deployment... infers how to generate outputs such as predictions, content, recommendations...' This covers diverse technical implementations.

25
New cards

What kind of AI systems are prohibited under the AI Act (2024)?

  1. Systems using subliminal or manipulative techniques to materially distort behaviour in ways people would not otherwise choose.

  2. Social scoring systems: classifying people based on social behaviour or personality characteristics — whether real or inferred.

  3. Emotion recognition: AI that infers emotional states from biometric data. Prohibited in workplaces and educational institutions — places of power imbalance where employees/students cannot freely opt out and where such monitoring could influence evaluations without medical justification. Permitted for safety/security reasons (e.g. detecting driver fatigue).

  4. Criminal risk assessment: AI used to predict the likelihood of a person committing a crime based on personality traits, behavioural patterns, etc. — not facts directly linked to criminal activity. Such AI cannot substitute judicial decisions, only support them (with full motivation of why the AI output was or was not followed).

  5. Real-time remote biometric identification by law enforcement in public spaces: cameras may identify suspects after a crime, but not in real time. Such surveillance is considered too invasive given privacy as a core value.

  6. Untargeted scraping of facial images from the internet or databases to create facial recognition databases.

  7. Biometric categorisation systems that use biometric data to infer special-category information (e.g. inferring political views from facial features).

The Act as a whole uses a three-tier structure:

  1. prohibited systems — banned outright;

  2. high-risk systems — allowed but subject to stringent obligations before market;

  3. all others — allowed with general transparency requirements.
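
The three tiers above can be sketched as a simple lookup. This is an illustrative toy only: the category names and the function are invented for this sketch and are not the Act's actual legal test.

```python
# Hypothetical sketch of the AI Act's three-tier structure (see list above).
# The use-case labels are invented examples, not legal categories.
PROHIBITED = {"subliminal_manipulation", "social_scoring",
              "workplace_emotion_recognition", "predictive_policing_profiling"}
HIGH_RISK = {"recruitment_screening", "credit_scoring",
             "exam_grading", "border_control_triage"}

def risk_tier(use_case: str) -> str:
    """Map a use case onto one of the three tiers."""
    if use_case in PROHIBITED:
        return "prohibited"
    if use_case in HIGH_RISK:
        return "high-risk (pre-market obligations)"
    return "minimal risk (general transparency requirements)"

print(risk_tier("social_scoring"))         # prohibited
print(risk_tier("recruitment_screening"))  # high-risk (pre-market obligations)
```

The point of the structure is that everything not expressly prohibited or listed as high-risk defaults to the lightest tier.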

26
New cards

What is the Digital Markets Act (2022)?

Designed to ensure contestable and fair markets when platforms become too dominant (gatekeepers). It addresses two core problems brought about by network and lock-in effects:

  1. contestability (ability of rivals to challenge incumbents)

  2. unfairness (dominant players exploiting their position)

It defines the requirements to be considered a gatekeeper and their obligations.

27
New cards

What are the requirements to qualify as gatekeeper?

  1. provide a core platform service (search engine, social network, cloud, online advertising);

  2. have a strong quantitative market position: annual EU turnover of at least €7 billion, company value of at least €75 billion, and at least 45 million monthly active EU users;

  3. serve as an important gateway for businesses to reach end users;

  4. hold that position durably.
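
The quantitative part of the test (point 2) can be expressed as a simple check. This is an illustrative sketch only, using the figures from the card; the function name and data shape are invented, and the full legal test also includes the qualitative criteria above.

```python
def meets_quantitative_thresholds(turnover_eur: float,
                                  market_value_eur: float,
                                  monthly_active_eu_users: int) -> bool:
    """Check the DMA's quantitative gatekeeper criteria (point 2 above).

    Thresholds taken from the card: EUR 7 billion annual EU turnover,
    EUR 75 billion company value, 45 million monthly active EU users.
    """
    return (turnover_eur >= 7e9
            and market_value_eur >= 75e9
            and monthly_active_eu_users >= 45_000_000)

# A platform below any one threshold does not meet the quantitative test.
print(meets_quantitative_thresholds(8e9, 80e9, 50_000_000))  # True
print(meets_quantitative_thresholds(8e9, 80e9, 10_000_000))  # False
```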

28
New cards

What are gatekeepers' obligations?

  1. allowing users to access other services without going through them;

  2. not requiring subscription to access other services;

  3. providing advertisers with transparent data on ad performance.

29
New cards

What are the transparency obligations imposed on platforms to protect business users?

  1. Terms and conditions must be understandable; platforms must disclose links to third-party services and data collection practices.

  2. Differentiated treatment (ranking, fees, data access favouring the platform's own products) must be disclosed.

  3. Mandatory complaint-handling systems on durable media; this is additional to judicial remedies.

  4. Conditions that can lead to suspension must be stated clearly in advance. Before terminating, the platform must send a motivated notice via a durable medium (not a pop-up). Business users get 30 days to organise before termination.

30
New cards

When does a liability exemption apply?

Platforms are generally not liable for user-posted content, provided they have no knowledge of illegal activity. The rationale: imposing liability would incentivise censorship and harm innovation.

31
New cards

What are the types of intermediary services?

  1. Mere conduit: transmitting information provided by users, with no modification or storage.

  2. Caching: temporarily storing content to make transmission more efficient.

  3. Hosting: storing user-provided content — this is where most liability questions arise.

32
New cards

When does the “actual knowledge” rule arise in platform liability?

  • Public or private notice

  • The platform proactively discovers illegal content

Platforms that proactively look for illegal content do not become liable for having done so — this is to encourage moderation.

Trusted flaggers: independent organisations with expertise in specific areas (e.g. child protection, cybersecurity) whose notices are processed with priority.

33
New cards

What is intellectual property?

It protects intangible property with economic value derived from intellectual work. Two interests in tension:

  • the author's interest (moral: attribution; economic: profit)

  • the collective interest in access to creative works.

Main categories: copyright, patents, trademarks. All involve a paid public domain — works are available to the community under certain conditions.

34
New cards

What are trademarks?

Signs that distinguish one company's goods/services from another's. To be registered, a trademark must be new and distinctive — it should not be a common descriptive word for the product.

Its functions are:

  1. origin and quality assurance (exclusive use of identical signs),

  2. protection against unfair competition (confusing similarity),

  3. protection against misappropriation (using a similar sign for unrelated products).

Protection extends to online domain names (cybersquatting/typosquatting) and to keyword advertising.

35
New cards

What are patents?

They protect solutions to technical problems.

Requirements: novelty and an inventive step (the solution must not be obvious).

Duration: 20 years from registration.

Licensing is permitted (including compulsory licences in exceptional cases such as public health crises).

36
New cards

What is copyright?

It protects the form of expression of creative works — not the underlying idea.

No registration required; protection arises from the act of creation.

Duration: 70 years after the author's death (Italy).

Moral rights (attribution, right to refuse modification) can never be transferred. Economic rights (reproduction, distribution, public performance) can be licensed or assigned.

37
New cards

Can software be protected by copyright?

It is excluded from patentability for policy reasons (monopoly risk), so it is protected by copyright instead; decompilation is prohibited to prevent trivial circumvention, with a narrow exception for interoperability.

Exception: software with a further technical effect beyond normal hardware–software interaction may be patentable.

38
New cards

What is accountability?

Controllers must be able to demonstrate compliance if a supervisory authority asks. They should conduct a risk assessment considering the probability and consequences of something going wrong for data subjects' rights and freedoms. A data protection officer may be required in certain cases.

Some tools include:

  1. maintaining records of processing activities;

  2. appointing a Data Protection Officer (mandatory for public bodies and high-risk processing);

  3. codes of conduct (voluntary rules adopted by industry associations that demonstrate compliance);

  4. certification schemes, a set of specific criteria and requirements approved by a national supervisory authority.

39
New cards

What must be proven to get damages from private law remedies?

The data subject must show a link between the damage suffered and the controller's conduct. A mere technical violation without actual harm does not automatically give rise to damages — but a loss of control over personal information is considered sufficient.

Non-material damages (e.g. distress, reputational harm) are recognised without requiring a minimum threshold of seriousness, per the European Court of Justice — unlike the Italian general approach, which would normally require the harm to meet a minimum level of seriousness.

You can go to the supervisory authority for injunctions (free, no lawyer needed), to a judge for damages, or pursue both in sequence.

40
New cards

What is Web 1.0?

During this era, the internet was primarily an information portal where a small number of creators published content for a vast majority of users who simply consumed it.

Platforms extended the reach of existing businesses to a wider audience without changing the underlying business model. This was followed by the dot-com crisis, which revealed the economic fragility of early internet businesses.

41
New cards

What are multi-sided markets?

Platforms serve two or more distinct user groups simultaneously (e.g. consumers and advertisers, buyers and sellers). This enables the transformation of goods into services (streaming vs. buying DVDs).

42
New cards

What is the network effect?

A platform's size becomes its own growth driver.

Each new user adds value for others at very low marginal cost (unlike manufacturing, where more units = more cost).

43
New cards

What is the lock-in effect?

Once you are on a platform, switching is costly — your audience, data, and connections are all there. This reinforces competitive advantages and data accumulation.


45
New cards

What are IP violations specific to domain names?

For domain names, they can be:

  1. cybersquatting, registering a domain in bad faith to profit from someone else's trademark

  2. typosquatting, registering misspelled versions of well-known domains

46
New cards

What is Text and Data Mining (TDM) and when does it require authorisation?

The automated analysis of large amounts of digital information to find patterns, trends, and knowledge — the basis of AI training.

Technically, you need a copy of the document, which triggers copyright. Exceptions exist for scientific research and for content to which you have lawful access.

Rights holders can prohibit it, but only if they do so in an effective, machine-readable way — burying an objection in terms and conditions is not sufficient.


48
New cards

What are orphan works and out-of-commerce works?

  • Orphan works: works still protected by copyright, but whose author cannot be found. A diligent search must be made before using them.

  • Out-of-commerce works: works still under copyright but no longer commercially available. A single EU-level portal (EUIPO) facilitates the exchange of information about such works.

49
New cards

What counts as personal data?

Any information relating to an identified or identifiable natural person — the 'data subject'. A person is identifiable if they can be identified directly or indirectly, in particular by reference to an identifier such as a name, an ID number, location data, an online identifier (e.g. an IP address or cookie), or one or more factors specific to their physical, physiological, genetic, mental, economic, cultural, or social identity. Legal persons are excluded.

The definition is intentionally broad. Even data that does not name someone directly can be personal data if it can be combined with other information to identify them. Pseudonymised data (where the identifier is replaced with a code) is still personal data, because re-identification is possible. Anonymised data — where identification is genuinely impossible — falls outside the GDPR.
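
The pseudonymisation/anonymisation distinction can be illustrated with a toy sketch. All names and data structures here are invented for illustration: pseudonymisation keeps a re-identification key, so the output remains personal data, while anonymisation destroys the link entirely.

```python
import secrets

def pseudonymise(records):
    """Replace names with random codes but KEEP the mapping.

    Because the key table allows re-identification, the output
    is still personal data under the GDPR.
    """
    key_table = {}
    out = []
    for rec in records:
        code = key_table.setdefault(rec["name"], secrets.token_hex(4))
        out.append({"id": code, "city": rec["city"]})
    return out, key_table

def anonymise(records):
    """Drop identifiers entirely; if no one can re-link the rows,
    the result falls outside the GDPR."""
    return [{"city": rec["city"]} for rec in records]

people = [{"name": "Ada", "city": "Turin"}, {"name": "Ben", "city": "Milan"}]
pseudo, key = pseudonymise(people)
print(pseudo)             # names replaced by codes; mapping retained in `key`
print(anonymise(people))  # [{'city': 'Turin'}, {'city': 'Milan'}]
```

In practice the legal question is whether re-identification is reasonably possible, not merely whether a name column was dropped.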

50
New cards

When does data “relate to” a person?

Like the definition of what constitutes personal data, this is also intentionally broad: data relates to a person if it concerns them in terms of

  • content (it is about them),

  • purpose (it is used to assess or influence them),

  • result (its processing affects them).

51
New cards

What is the European Data Strategy?

Adopted by the European Commission in 2020, it aims to create a single European data space — a genuine internal market for data where data can flow freely within the EU and across sectors.

It rests on two observations:

  1. vast amounts of valuable data (especially industrial and public sector data) are currently not being used or shared;

  2. Europe risks falling behind the US and China in the data economy if it does not act.

The underlying philosophy is that data should be available to many — not locked up with a small number of large platforms — while individuals retain control over their personal data under the GDPR framework.

52
New cards

What are the main instruments of the European Data Strategy?

  1. Re-use of Public Administration Data: Public bodies must make their large amounts of data available to private companies in an anonymised and aggregated form, usually for a low fee or no fee.

  2. Data Intermediation & Altruism: Promoting "data cooperatives" or intermediary services to share data, and encouraging "data altruism" (voluntary sharing for public interests like healthcare or climate change).

  3. Connected Products (IoT): Makers of connected products must share data so third parties can offer repair and maintenance services, which prevents monopolies.

  4. Gatekeepers: Dominant platforms are obliged to provide real-time transaction data to their business users.

  5. Exceptional Public Need: Private companies can be required to share data with public authorities in emergencies, like a pandemic.

  6. European Data Spaces: The creation of common, European pools of data in strategic sectors like healthcare, finance, and mobility.

53
New cards

What is digital protectionism?

The use of data regulation — particularly rules on data localisation and cross-border data transfers — to keep data within a jurisdiction, effectively creating barriers to the free international flow of information.

The EU has pushed back against the protectionism label by arguing that its rules are rights-based, not market-based. However, the practical effect — particularly for transfers to the US — has been to create real friction for international data flows.

54
New cards

What are special categories of data?

Formerly called 'sensitive data': health data, genetic data, biometric data (when used to uniquely identify someone), political opinions, religious beliefs, sexual orientation, trade union membership. These have historically been used to discriminate — hence stricter rules.

Biometric data: personal data resulting from specific technical processes related to physical, physiological, or behavioural characteristics that allow unique identification. Photographs may become biometric data if processed to extract identifying characteristics — otherwise they are not automatically special-category.

Browsing habits, surfing data, or health-related searches can constitute special-category data if they reveal sensitive characteristics or are used to profile someone on that basis (e.g. targeted pharmaceutical advertising based on a health search). Context matters. Special-category data have a more restrictive, enumerated list of legitimate bases for processing — there is no general clause; the legislator lists them exhaustively.

55
New cards

Can data be re-used in the public sector?

Yes: the Open Data Directive requires certain public data to be made available in machine-readable formats, free of charge where possible.

High-value datasets must be freely accessible. Public administrations cannot make exclusive deals with individual companies. Personal data must be anonymised and aggregated before release.

56
New cards

Can private actors share data?

Yes, in several cases they are obliged to:

(1) gatekeepers must provide business users with real-time data on transactions involving them;

(2) IoT/connected product makers must share data so third parties can offer repair and maintenance services, preserving competition;

(3) in cases of exceptional public need (e.g. pandemic monitoring), private companies may be required to share data with public authorities. Recipients of shared data cannot use it to build competing products.

57
New cards

What are some risks of AI systems?

  1. Bias in input data: two types.

    1. simply incorrect data — correctable.

    2. data that accurately reflects a biased society — the model will replicate and reinforce existing patterns of discrimination even if the data is technically accurate. Legislative response differs depending on which type of bias is involved.

  2. Manipulation and nudging: AI systems can detect vulnerabilities in individuals through statistical profiling and exploit them to influence behaviour. Most marketing AI does not cause 'significant harm', but edge cases exist (e.g. a music service that recommends depressive music to someone already struggling).

  3. Black box problem: the path from human input to AI output is often unclear — even to technicians. This makes decisions difficult to explain or challenge, and creates a reliability bias: outputs are perceived as objective and therefore over-trusted.

58
New cards

What are high-risk AI systems?

Two groups:

  1. AI systems used as a safety component in products already subject to EU safety legislation (e.g. medical devices, aviation, vehicles, toys, elevators, industrial machinery). These require the AI to meet the same safety standards as the product itself.

  2. AI used to take or support decisions with significant impact on people in eight specific domains:

    1. education — admission, monitoring, evaluation of students;

    2. employment — recruitment, contract decisions, performance evaluation;

    3. essential private and public services — banking, insurance, healthcare, public benefits eligibility;

    4. migration, asylum, and border control;

    5. justice and democratic processes — judicial decisions, elections;

    6. law enforcement generally (including lie detectors);

    7. any remote biometric identification system;

    8. AI as components in critical infrastructure (e.g. traffic management).

59
New cards

What are the obligations for high-risk AI systems?

  1. conformity procedure (CE marking) before going to market;

  2. a risk management system adopted by both providers and deployers, maintained throughout the lifecycle;

  3. data quality requirements for training, validation, and testing datasets;

  4. logging/traceability — systems must record events during operation;

  5. transparency to users — sufficient information to correctly assess and understand the limits of outputs;

  6. AI literacy — users must be given adequate information about how the system works and its limitations (employers are responsible for training employees);

  7. human oversight — interface tools enabling effective human oversight must be built in;

  8. accuracy, robustness, and cybersecurity requirements.

60
New cards

What is the data quality requirement for high-risk AI systems?

Training, validation, and testing datasets must be complete, representative, accurate, and free of errors to the extent possible for the intended purpose. Known biases that could produce discriminatory outputs must be identified and addressed.

Representativeness: if training data over-represents certain groups, the model performs better for those groups. A facial recognition system trained predominantly on light-skinned men will have higher error rates for women and darker-skinned individuals — a form of algorithmic discrimination.

Data quality also means not encoding historical biases: if past hiring decisions were discriminatory, training an AI to replicate them perpetuates that discrimination even if the data is 'accurate'.

It is not always easy to assess whether datasets meet these quality requirements — standards bodies are currently developing practical guidance.

61
New cards

What are regulatory sandboxes?

Early-stage legislation risks slowing development. Regulatory sandboxes allow companies to develop and test AI systems in a controlled environment under the supervision of a public authority, which approves the plan, sets limits, and controls the process. This prevents full regulatory requirements from blocking development before a product is ready. A separate procedure also exists for testing in real-world conditions.

62
New cards

What rights are retained in lawful automated decision-making?

  1. right to meaningful human review (not just nominal human presence);

  2. right to express one's point of view before a final decision;

  3. right to an explanation of the algorithm's logic and significance of the output (the 'right to explanation' in the AI Act, covering fields like employment, education, and law enforcement where significant impacts occur).

63
New cards

When is automated processing allowed under the GDPR?

As a general rule, no decision producing legal or similarly significant effects may be based solely on automated processing — unless the processing is:

  1. required or authorised by law;

  2. based on explicit consent;

  3. necessary for entering into or performing a contract (e.g. a small company processing thousands of CVs, where the volume makes manual review impracticable).

64
New cards

What are the consumer transparency obligations?

  1. Platforms must not use dark patterns in UI design;

  2. cancellation must be as easy as subscription;

  3. incessant prompts are prohibited.

  4. At least one option not based on user profiling must be available (for very large platforms, an opt-out from profiling).

  5. Ranking obligations: parameters for search results must be disclosed to consumers; sponsored content must be explicitly labelled; if the platform is also a commercial actor whose products appear in rankings, this must be disclosed.

  6. Review transparency: platforms must disclose whether or not they verify reviews — not obliged to verify, but cannot misrepresent their practice. (TripAdvisor was sanctioned for claiming its reviews were verified while employing only five reviewers in Europe.) Platforms may check whether a reviewer is a real person (not a bot), but cannot represent unchecked reviews as genuine.

65
New cards

What are VLOPs and what are their obligations?

Very large online platforms and very large online search engines (45 million+ average monthly EU users) face the most stringent obligations under the Digital Services Act (in full application since February 2024):

  1. Annual systemic risk assessment: identify, analyse, and assess systemic risks — dissemination of illegal content, negative effects on fundamental rights, threats to public security or electoral processes.

  2. Risk mitigation measures: adjust recommendation algorithms, restrict advertising in certain contexts, strengthen moderation.

  3. Independent annual audits: at their expense.

  4. Data access for researchers: vetted researchers and public authorities may access data to scrutinise systemic risks.

  5. Crisis response: the European Commission can impose specific obligations during public emergencies to limit spread of harmful content.

66
New cards

What are the exemptions for free use of copyright?

  1. quotation for criticism/commentary (with attribution, proportionate extent);

  2. parody, caricature, pastiche (mandatory EU exceptions — no authorisation or attribution required, must be noticeably different from and not discriminatory toward the original);

  3. news reporting;

  4. educational use in secure online environments (e.g. posting articles to a university Moodle).

67
New cards

What is database protection?

They receive two independent layers of EU protection:

  1. copyright protects the structure, if the selection or arrangement of contents reflects genuine intellectual creativity. This requires originality, not just investment

  2. the sui generis database right — uniquely European, independent of copyright — protects the substantial investment in obtaining, verifying, or presenting the contents of a database, regardless of whether the result is creative. Prevents extraction or re-utilisation of a substantial part without consent. Duration: 15 years (resets if substantial new investment is made).

Spinoff databases (created as a by-product of the main business, e.g. Ryanair's flight schedules) are not protected by the sui generis right, as building them was not the primary investment.

68
New cards

What are the requirements for private and public notices to make a platform aware of illegal content?

  • Under the EU Digital Services Act (DSA), a valid private notice must include:

    1. Explanation: Clear, substantiated reasons why the content is considered illegal.

    2. Exact Location: Specific URL(s) enabling the platform to find the content.

    3. Contact Info: Name and email address of the notifier (except for cases involving child sexual abuse material).

    4. Good Faith Statement: A declaration confirming that the notifier believes the provided information is accurate and complete.

  • Public entities, watchdogs, or NGOs can be designated as "Trusted Flaggers". Their notices must be processed by platforms with priority and without delay, but they still have to fulfil the same requirements as private notices.

69
New cards

What is content moderation?

Under the DSA, all activities undertaken by online platforms to monitor and manage user-generated content.

Moderation can be done using automated tools (like algorithmic scanning and filtering) or through human review. It can also be proactive (the platform looking for issues) or reactive (responding to notices or orders).

70
New cards

What are the obligations for internal complaint handling systems?

  1. Statements of Reasons: the platform must provide the user with a clear, specific explanation of why a moderation decision was made.

  2. Appeals and Redress: it must provide an easy-to-use, free internal complaint-handling system so users can challenge and appeal decisions.

  3. Fair Enforcement: it must enforce its Terms and Conditions objectively, proportionately, and non-arbitrarily.

  4. Transparency Reporting: it must publicly report detailed data on its moderation activities.

71
New cards

What are “notice and action mechanisms” regarding the content moderation activity of platforms?

The term refers to the mandatory reporting pipeline that platforms must implement to handle illegal content:

  • The Notice: Platforms must provide an easily accessible, user-friendly, and strictly electronic channel that allows any individual or entity to report (flag) content they consider illegal.

  • The Action: Once a valid notice is submitted, the platform is considered to have "actual knowledge" of the potentially illegal content.

  • Closing the Loop: The platform must confirm receipt of the notice to the person who flagged it and eventually inform them of the final decision, including their options for appealing it.

72
New cards

Do platforms have an obligation to monitor user-uploaded content?

No. Under the DSA, there is no general monitoring obligation. Platforms are not required to proactively search for illegal content — this would be technically burdensome and risk over-censorship.

However, they must act once they acquire actual knowledge (through a valid notice or proactive discovery), and VLOPs must conduct systemic risk assessments.

73
New cards

What are the requirements of orders from a public authority to a platform to act against illegal content?

Under the DSA, public authority orders must:

  1. be issued by a judicial or administrative authority of a Member State;

  2. state the legal basis clearly;

  3. identify the specific content concerned (URL);

  4. explain the reasons why the content is illegal;

  5. be addressed to the platform in a language it declared it accepts;

  6. allow the platform to challenge the order before a court.

The platform must inform the affected user that their content has been removed and of their right to redress.

74
New cards

What does it mean that gatekeepers must "reallocate the value of data"?

Gatekeepers accumulate vast amounts of data generated by business users and end users on their platforms. The Digital Markets Act requires them to share this data with the business users who generated it — in real time, free of charge, in machine-readable format.

This prevents gatekeepers from hoarding data as a competitive advantage and rebalances the relationship with business users who depend on the platform.

75
New cards

What protections do businesses have against gatekeepers?

The DMA introduces several protections:

  1. Interoperability: users can use competing services and connect them to the gatekeeper's platform;

  2. No self-preferencing: gatekeepers cannot rank their own products/services more favourably;

  3. No tying: access to a core platform service cannot be made conditional on subscribing to other services;

  4. No exclusivity: business users are free to offer the same products/services on competing platforms at different prices;

  5. No use of third-party data: gatekeepers cannot use data from business users to compete against them.

76
New cards

Are trademarks protected regarding the use of the sign as an online search keyword?

Yes, but within limits.

The CJEU has addressed this through Google AdWords cases. Using a competitor's trademark as a paid search keyword is infringing only if the resulting ad does not allow an average internet user to easily determine the commercial origin of the goods or services. If the ad clearly identifies a different origin, it is generally permitted — it is a form of comparative advertising rather than misappropriation.

For example, if a generic bag company buys the keyword "Vuitton bag" so its ad appears when a user searches for Louis Vuitton, the use is lawful only if the ad makes clear that the bags do not originate from Louis Vuitton.

77
New cards

What are the different interests underlying patent protection?

  • The inventor's interest: receiving a temporary monopoly as a reward for their innovation and investment.

  • The public interest: gaining access to detailed disclosure of the invention (published in the patent register), which advances scientific knowledge and — after expiry — allows free use of the invention.

This bargain — monopoly in exchange for disclosure — is the philosophical core of the patent system.

78
New cards

What exclusive rights does the patent holder have?

  1. make, use, sell, import, and export the patented invention;

  2. license the patent to third parties (voluntarily, or in exceptional cases via compulsory licence);

  3. prevent any third party from exploiting the invention without authorisation.

79
New cards

Is it legitimate to link and embed copyright-protected material?

  • Hyperlinking to freely accessible content is generally permitted — no new communication to the public occurs.

  • Hyperlinking to paywalled or access-restricted content without authorisation may constitute infringement, as it circumvents a technical restriction.

  • Embedding (inline framing) follows the same logic: if the embedded content was uploaded with the rights holder's authorisation and remains freely accessible, embedding is generally lawful. If the content was uploaded without authorisation, embedding it may constitute an independent act of infringement, even if the embedder did not upload it.

80
New cards

How is the online press protected against news aggregators?

  • The General Rule: News aggregators (like Google News) cannot freely publish news; they must obtain a licence and pay to use the content.

  • The Legal Protection: Press publishers are granted a neighbouring (related) right over the online use of their publications. (Note: This is under Article 15 of the DSM Directive, lasting 2 years from publication).

  • The Reason: To fix a structural imbalance. Aggregators were taking advantage of the value and work created by online newspapers without bearing the costs of journalism.

  • The Exception: Minimal snippets (individual words or very short extracts) are generally allowed or require only a small payment, depending on the member state.

81
New cards

Can news aggregators publish news taken from online newspapers?

Only under licence. The publisher's right allows press publishers to authorise or prohibit the online reproduction of their journalistic content by aggregators.

The right lasts 2 years from publication. Minimal snippets (individual words, very short extracts used to identify the source) remain free.

This was a response to the structural imbalance between large aggregators (Google News) and traditional publishers who bear the cost of journalism.

82
New cards

How is copyrighted material protected on online content-sharing platforms?

Article 17 of the Digital Single Market Directive creates a specific regime: platforms that store and give public access to large amounts of user-uploaded content perform an act of communication to the public and therefore need a licence from rights holders. If no licence is obtained, the platform avoids liability only if it can demonstrate it:

  1. made best efforts to obtain authorisation;

  2. made best efforts to ensure the unavailability of the specific unauthorised content (in practice, a mandatory upload filter);

  3. acted expeditiously to remove or disable access upon receiving a valid notice, and made best efforts to prevent future uploads (in practice, re-upload filters).

Failure to satisfy all three conditions makes the platform directly liable for infringement.

83
New cards

Are content-sharing platforms liable for unauthorised copyright uploads?

Yes, as a default under Article 17 DSM — platforms are now treated as performing their own act of communication to the public, not merely hosting.

They bear the burden of proving they qualify for the exemption (best efforts to license + best efforts to block + swift takedown).

This is a significant shift from the earlier hosting safe harbour, where liability required actual knowledge.

84
New cards

When can users upload copyrighted material without authorisation?

  • Quotation for purposes of criticism or review;

  • Parody, caricature, and pastiche — no attribution required, work must be recognisably different;

  • Educational use in secure online environments.

85
New cards

Can cultural heritage institutions copy works in their collections?

Yes. Under the DSM Directive, libraries, museums, archives, and similar institutions may make copies of any work permanently in their collection for the purpose of preservation — without authorisation and without payment.

This covers digitisation. The copies may not be made publicly available beyond what is permitted by other exceptions (e.g. research/study terminals on the premises).

86
New cards

Are AI-generated creations copyrightable?

As a general rule, no. Copyright requires a human author — it protects the expression of human intellectual creativity. A work produced entirely by an AI without meaningful human creative input does not meet this requirement and falls into the public domain immediately.

The harder question is AI-assisted works: where a human makes sufficiently original creative choices in prompting, selecting, or arranging AI outputs, copyright may arise in the human's contribution. The threshold of what counts as sufficient human creativity is still being worked out by courts and regulators.
