difference between the GDPR and the 1995 Data Protection Directive
GDPR addresses the internet (social networks, websites, links, …), focusing on challenges that had not yet emerged in 1995, but with no mention yet of AI
art 3 GDPR
territorial scope
regulation applies to:
processing of personal data, inside or outside the Union, in the context of the activities of an establishment of a controller or a processor in the Union
processing of personal data of subjects in the union by controller or processor not established in the Union when processing activities related to: offering of goods or services to these data subjects or monitoring behaviour of data subjects taking place within the Union
art 4 definition personal data
personal data = information relating to an identified or identifiable natural person (the data subject)
art 4 identifiability
=> identifiable natural person = can be identified directly/indirectly by reference to identifier such as name, identification number, location data, online identifier or one or more factors specific to physical, physiological, genetic, mental, economic, cultural or social identity of that natural person
conditions under which a piece of data that is not explicitly linked to a person still counts as personal data, since the possibility exists to identify the person concerned
availability of means reasonably likely to be used for successful reidentification = take into account cost and amount of time, available technology and technological developments
art 4 definition pseudonymisation
data items that identify a person are substituted with a pseudonym
link between the pseudonym and the identifying data items can be retraced through additional information
still personal data because information on identifiable natural person
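The mechanism can be sketched in code: a minimal keyed pseudonymisation in which the secret key plays the role of the "additional information" that must be kept separately. The record, field names and keyed-hash scheme are illustrative assumptions, not prescribed by the GDPR.

```python
import hashlib
import secrets

# Hypothetical record; names and fields are invented for illustration.
record = {"name": "Jan Janssens", "diagnosis": "hypertension"}

# The key is the separately stored "additional information": whoever
# holds it can retrace the link between pseudonym and person.
key = secrets.token_bytes(16)

def pseudonymise(identifier: str, key: bytes) -> str:
    # Keyed hash: without the key the pseudonym cannot be recomputed
    # from the name, but the controller holding the key can re-derive
    # it, so the data remain personal data.
    return hashlib.sha256(key + identifier.encode()).hexdigest()[:16]

pseudonymised = {"pseudonym": pseudonymise(record["name"], key),
                 "diagnosis": record["diagnosis"]}
print(pseudonymised)
```

Because the mapping is re-traceable via the key, the pseudonymised record still concerns an identifiable natural person.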
Regulation (EU) 2018/1807 of the European Parliament and of the Council of 14 November 2018
on a framework for the free flow of non-personal data in the European Union
Recital (9) of Regulation 2018/1807
If technological developments make it possible to turn anonymised data into personal data, such data are to be treated as personal data, and Regulation (EU) 2016/679 is to apply accordingly.
non-personal data
not positively defined in EU legislation
e.g. aggregate and anonymised datasets used for Big Data analytics, data on precision farming that can help to monitor and optimise the use of pesticides and water, or data on maintenance needs for industrial machines
2 issues AI and GDPR personal data definition
reidentification = re-personalisation of anonymous data
computational statistics = nonidentified data can be connected to concerned individuals
statistical correlations between nonidentified data and personal data concerning the same individuals => a de-identified hospital admission record contains birthday, sex and ZIP code; so does an identified set of personal data (name, address and phone)
doesn’t need to be identified with absolute certainty, a degree of probability may be sufficient
"in any 'reasonable' setting there is a piece of information that is in itself innocent, yet in conjunction with even a modified (noisy) version of the data yields a privacy breach.”
examples: in 2016 journalists reidentified German politicians in an anonymized browsing-history dataset, uncovering medical info and sexual preferences; Australia published de-identified medical records which were reidentified 6 weeks later; anonymized taxi trajectories in NYC; bike-sharing trips in London; subway data in Riga
solutions = deidentifying data so that it is more difficult to reidentify, and implementing security processes and measures for the release of data contributing to the outcome
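The linkage attack described above (matching de-identified records to an identified dataset through shared quasi-identifiers such as birthday, sex and ZIP code) can be sketched as follows; all records and field names are invented for illustration:

```python
# De-identified dataset: no names, but quasi-identifiers remain.
deidentified_admissions = [
    {"birthday": "1965-03-12", "sex": "F", "zip": "1000", "diagnosis": "asthma"},
    {"birthday": "1978-07-30", "sex": "M", "zip": "9000", "diagnosis": "diabetes"},
]

# Identified dataset sharing the same quasi-identifiers.
voter_register = [
    {"name": "A. Peeters", "birthday": "1965-03-12", "sex": "F", "zip": "1000"},
    {"name": "B. Claes", "birthday": "1980-01-02", "sex": "M", "zip": "9000"},
]

QUASI_IDENTIFIERS = ("birthday", "sex", "zip")

def link(anonymous, identified):
    """Join the two datasets on the shared quasi-identifiers."""
    reidentified = []
    for a in anonymous:
        for i in identified:
            if all(a[q] == i[q] for q in QUASI_IDENTIFIERS):
                # The "anonymous" medical record is now attached to a name.
                reidentified.append({"name": i["name"], **a})
    return reidentified

print(link(deidentified_admissions, voter_register))
```

Even this naive join re-personalises one record, which is why deidentification alone, without controlling what auxiliary datasets are available, is a weak safeguard.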
inference of further information from available personal data
application of algorithmic models to personal data
should inferred info be considered new personal data or part of original data => if it is considered new personal data then inferences trigger all the consequences that processing of personal data entails according to GDPR => legal status inferred data?
ECJ Joined Cases C-141/12 and C-372/12 = denied that the legal analysis, by the competent officer, of an application for a residence permit could be deemed personal data => only the data on which the analysis was based (the input data about the applicant) as well as the final conclusion of the analysis (the holding that the application was to be denied) were to be regarded as personal data => intermediate conclusions (human-inferred assessments) are not personal data
ECJ Case C-434/16 = candidate’s request to exercise data protection rights relative to an exam script and the examiners’ comments => the examiners’ comments, too, were personal data => the examinee has a right of access both to the exam data (the exam responses) and the reasoning based on such data (the comments), but he or she does not have a right to correct the examiners’ inferences (the reasoning) or the final result
The view that inferred data are personal data was endorsed by the Article 29 WP = in case of automated inference (profiling) data subjects have the right to access both the input data and the (final or intermediate) conclusions automatically inferred from such data.
art 4 definition processing
any (set of) operation(s) performed on personal data or sets of personal data, whether or not by automated means
art 4(4) definition profiling
data processing consisting in using data concerning a person to infer info on further aspects of that person
evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements
art 29 WP => aim of profiling = to evaluate certain personal aspects relating to a natural person (the same list as in the definition above)
AI classifier system trained to link predictors (certain features of individuals) to a target (another feature of the same individuals) => creation of an algorithmic model that can be applied to new cases => ex: using health records, social conditions and habits to infer the likelihood of heart disease of applicants for insurance; financial history, online activity and social condition to infer the creditworthiness of a loan applicant; the risk of reoffending of a convicted person based on criminal history, character and personal background
possibly automated determinations of the price of health insurance, the granting of a loan, and release on parole
A learned correlation may also concern a person’s propensity to respond in certain ways to certain stimuli. This would enable the transition from prediction to behaviour modification
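As a rough sketch of such a classifier, a toy logistic model applies hypothetical learned weights to a loan applicant's description. The features, weights and bias are invented; a real system would learn them from historical personal data.

```python
import math

# Hypothetical "learned" weights linking predictors to default risk.
WEIGHTS = {"missed_payments": 0.9, "debt_to_income": 1.4, "years_employed": -0.3}
BIAS = -1.0

def default_risk(applicant: dict) -> float:
    """Apply the learned model to one applicant's description.
    Both the input description and the inferred score are personal data."""
    z = BIAS + sum(WEIGHTS[f] * applicant[f] for f in WEIGHTS)
    return 1 / (1 + math.exp(-z))  # logistic link -> probability in (0, 1)

# Description of a new applicant (the input data).
applicant = {"missed_payments": 2, "debt_to_income": 0.6, "years_employed": 4}
print(round(default_risk(applicant), 3))
```

The weight table itself encodes only general correlations (group-level knowledge); personal data arise again when the model is applied to a particular applicant, yielding an inferred score about that person.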
inferences and personal data
need to distinguish general correlations captured by a learned algorithmic model and the results of applying that model to the description of a particular individual
personal data are used to train the model, but the learned model’s correlations apply to all individuals sharing similar characteristics, so they can be viewed as group data concerning that set of individuals => if the algorithmic model is applied to input data (the description of a new applicant) => both the description of the applicant and the default risk attributed by the model represent personal data (collected and inferred)
rights over inferences
access
inferred data concerning individuals also are personal data => data protection rights should in principle also apply, though concurrent remedies and interests have to be taken into account
art 29 WP = data subjects have a right to access both the personal data used as input for the inference, and the personal data obtained as (final or intermediate) inferred output
rectification
applies to limited extent
processed by public authority => review procedures already exist which provide for access and control
processing by private controllers => right to rectify the data should be balanced with the respect for autonomy of private assessments and decisions
art 29 WP = data subjects have a right to rectification of inferred information not only when the inferred information is “verifiable” (its correctness can be objectively determined), but also when it is the outcome of unverifiable or probabilistic inferences (e.g., the likelihood of developing heart disease in the future)
in the latter case, rectification may be needed not only when the statistical inference was mistaken, but also when the data subject provides specific additional data that support a different, more specific, statistical conclusion
proposed general right to reasonable inference = any assessment or decision affecting subjects must be obtained through automated inferences that are reasonable, respecting both ethical and epistemic standards
entitled to challenge inferences made by AI systems, not just decisions based on inferences
inferences meant to lead to assessments and decisions affecting the data subject
criteria = acceptability (of input data, exclusion of prohibited features), relevance (inferred info relevant to the purpose of the decision) and reliability (both input data and processing methods accurate and statistically reliable)
controllers are prohibited from basing assessments or decisions on unreasonable inferences, and have the obligation to demonstrate the reasonableness of their inferences
art 4 definition controller
the natural or legal person, public authority, agency or other body which alone or jointly determines purposes and means of the processing of personal data
art 4 definition processor
natural or legal person, public authority, agency or other body which processes personal data on behalf of controller
art 5
principles relating to processing of personal data
lawfulness (art 6), fairness and transparency
purpose limitation
data minimisation
data accuracy
storage limitation
integrity and confidentiality
accountability principle
art 6
lawfulness of processing conditions
consent = data subject has given consent to processing for one or more specific purposes
necessity for performance of contract to which data subject is party or to take steps prior to entering into contract
legal obligation to which controller is subject
protect vital interests of data subject or other natural person
public interest or exercise official authority vested in controller
legitimate interests pursued by controller or third party unless overridden by interests or fundamental rights and freedoms data subject
art 4 definition consent
any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her
art 7 conditions for consent
controller must be able to demonstrate the data subject’s consent
if consent is given in a written declaration also concerning other matters => the request for consent must be clearly distinguishable from the other matters, intelligible and easily accessible, in clear and plain language
right to withdraw consent at any time => does not affect processing based on consent before withdrawal; withdrawing must be as easy as giving consent
when assessing whether consent is freely given, account is taken of whether performance of a contract is made conditional on consent to the processing of personal data that is not necessary for the performance of that contract
art 13-14 + recital 42 + art 29 WP Guidelines on consent info to be provided to data subject
identity of controller (+representative) + contact details
contact details of data protection officer
purposes of processing
legal basis of processing
categories of personal data concerned
period for which personal data will be stored
right to request from controller access to and rectification or erasure of personal data + right to data portability
right to lodge complaint with supervisory authority
source from which personal data originate
existence of automated decision-making including profiling
art 17
right to erasure
grounds:
personal data no longer necessary in relation to purposes for which they were collected or otherwise processed
subject withdraws consent and no other legal ground for processing
data subject objects to processing (art 21) and no overriding legitimate grounds
unlawful processing
compliance with legal obligation
…
if data made public, controller has to take reasonable steps to inform controllers processing data that subject has requested erasure
no erasure if processing necessary for
exercising right to freedom of expression and information
compliance with legal obligation or public interest or exercise of official authority controller
public interest in the area of public health
public interest, scientific or historical research purposes or statistical purposes
exercise or defence of legal claims
art 9
processing of special categories of personal data
data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership + genetic data, biometric data + data concerning health or data concerning a natural person's sex life or sexual orientation => prohibited
allowed if:
explicit consent for one or more specified purposes
necessary for carrying out obligations and specific rights of the controller or data subject in the fields of employment, social security and social protection law
processing necessary to protect vital interests data subject or other natural person where data subject physically or legally incapable of giving consent
legitimate activities with appropriate safeguards by foundation, association or any other non-profit body with political, philosophical, religious or trade union aim on condition that the processing relates solely to the members
personal data manifestly made public by data subject
necessary for establishment, exercise or defence of legal claims or whenever courts are acting in their judicial capacity
substantial public interest
preventive or occupational medicine
public interest in area of public health
achieving purposes in the public interest, scientific or historical research purposes or statistical purposes
art 22
automated individual decision-making, including profiling
The data subject shall have the general right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
four conditions for the application of art 22(1)
a decision must be taken = a stance towards a person that is likely to be acted on
it must solely be based on automated processing = humans do not exercise real influence on outcome decision-making process
it must include profiling
it must have legal or anyway significant effect = effects cannot be merely emotional, and usually they are not caused by targeted advertising, unless “advertising involves blatantly unfair discrimination in the form of web-lining and the discrimination has non-trivial economic consequences”
not if:
necessary for entering into or performance of a contract between data subject and data controller (exception to the general right not to be subject to completely automated decisions significantly affecting the data subject)
authorised by union or MS law to which controller is subject and which lays down suitable measures to safeguard data subject’s rights and freedoms and legitimate interests
based on data subject’s explicit consent
art 21
(1) right to object also applies to profiling
(2) profiling in the context of direct marketing = personal data are processed for direct marketing purposes, the data subject shall have the right to object at any time to processing of personal data concerning him or her for such marketing, which includes profiling to the extent that it is related to such direct marketing
=> data subject does not need to invoke specific grounds when objecting to processing for direct marketing purposes, purposes cannot be “compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject”
=> Controllers should be required to provide easy, intuitive and standardised ways to facilitate the exercise of this right.
art 13 and 14 GDPR
controller has the obligation to provide
information on “the existence of automated decision-making, including profiling, referred to in Article 22(1)
meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject
research on explainable AI
=> explanatory techniques and models developed within computer science are intended for technological experts and assume ample access to the system being explained
model explanation = the global explanation of an opaque AI system through an interpretable and transparent model that fully captures the logic of the opaque system (decision tree or set of rules whose activation exactly reproduces the functioning of a neural network)
model inspection = a representation that makes it possible to understand some specific properties of an opaque model or of its predictions
outcome explanation = account of the outcome of an opaque AI system in a particular instance, e.g. a specific decision concerning an individual can be explained by listing the choices that lead to that conclusion in a decision tree
=> objective of making explanations accessible to lay people, ex-post explanations of specific decisions by a system
contrastive explanation = specifying what input values made a difference, determining the adoption of a certain decision (e.g., refusing a loan) rather than possible alternatives (granting the loan)
selective explanation = focusing on those factors that are most relevant according to human judgement
causal explanation = focusing on causes, rather than on merely statistical correlations (e.g., a refusal of a loan can be causally explained by the financial situation of the applicant, not by the kind of Facebook activity that is common for unreliable borrowers)
social explanation = adopting an interactive and conversational approach in which information is tailored according to the recipient’s beliefs and comprehension capacities
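An outcome explanation and a contrastive explanation for a toy, hand-written decision model might look as follows; the rules, thresholds and feature names are invented for illustration:

```python
def decide(applicant: dict):
    """Return the decision plus the path of choices that led to it
    (an outcome explanation for this transparent toy model)."""
    path = []
    if applicant["income"] < 30000:
        path.append("income < 30000")
        return "refuse", path
    path.append("income >= 30000")
    if applicant["missed_payments"] > 1:
        path.append("missed_payments > 1")
        return "refuse", path
    path.append("missed_payments <= 1")
    return "grant", path

def contrastive(applicant: dict):
    """Which single input change would flip the decision?
    (contrastive explanation: what made the difference)"""
    base, _ = decide(applicant)
    changes = []
    for feature, alternative in [("income", 30000), ("missed_payments", 0)]:
        variant = {**applicant, feature: alternative}
        if decide(variant)[0] != base:
            changes.append(f"{feature} -> {alternative}")
    return changes

applicant = {"income": 25000, "missed_payments": 0}
decision, path = decide(applicant)
print(decision, path)          # outcome explanation: decision + path taken
print(contrastive(applicant))  # contrastive: the input that made the difference
```

For genuinely opaque models the same kinds of explanation have to be approximated rather than read off the model directly, which is what the techniques listed above aim to do.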
=> ex-ante info for user
input data taken into consideration, which will favor/disfavor the outcome
target values that system is meant to compute
The envisaged consequence of the automated assessment/decision
overall purposes
safeguards to data subjects in case of automated decisions (recital 71)
• specific information
• the right to obtain human intervention,
• the right to express his or her point of view,
• the right to obtain an explanation of the decision reached after such assessment
• the right to challenge the decision
art 22 safeguards to data subjects in case of automated decisions
at least:
right to obtain human intervention
right to express his or her point of view
right to challenge the decision
2 interpretations right to explanation
only including the request for specific explanation in the recitals and not in the articles of the GDPR => double message
exclude an enforceable legal obligation to provide individual explanations
recommending data controllers to provide explanations when convenient according to discretionary determinations = good practice not legally enforceable requirement
establish an enforceable legal obligation to provide individual explanation, though without unduly burdening controllers => hinted at by qualifier “at least” => explanation would be legally needed, whenever it is practically possible (technologies, costs and business practices)
=> cautioned against overemphasising a right to individualised explanations as a general remedy to the biases, malfunctions, and inappropriate applications of AI & Big Data technologies => likely to remain underused by data subjects, might lack sufficient understanding of technologies and applicable normative standards
art 25
data protection by design = controller implements appropriate technical and organisational measures in the processing, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation
==> taking into account state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing
data protection by default = controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed
==> amount of personal data collected, extent of processing, period of storage and accessibility
art 32
security of processing
controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate
pseudonymisation and encryption personal data
ensure ongoing confidentiality, integrity, availability and resilience of processing systems and services
restore availability and access to personal data in a timely manner in the event of a physical or technical incident
regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing
art 68 and 70 european data protection board and european data protection supervisor
art 68
legal personality as body of the union
represented by Chair
head of one supervisory authority of each MS + European data protection supervisor or respective representatives => joint representative in case of more than one supervisory authority
commission has the right to participate in activities and meetings of the board, without voting right, through a designated representative
art 70 = tasks board
ensure consistent application GDPR on own initiative or request commission
without prejudice to the tasks of national supervisory authorities
advise the Commission on any issue related to the protection of personal data in the Union
examine, on its own initiative, on request of one of its members or on request of the Commission, any question covering the application of this Regulation and issue guidelines, recommendations and best practices in order to encourage consistent application of this Regulation