Core Values and Value Conflicts in Cybersecurity
Values and Value Pluralism
Values can be seen as varieties of goodness, naturally leading to the assumption of a plurality of values.
Value monism posits one overarching value (e.g., human happiness or dignity) to which all other values are related or reduced.
Value pluralism, the opposite thesis, assumes a variety of irreducible values.
It is debatable whether there's a limit to the number of values we can discern, or if it's always possible to discern additional ones.
Value Clusters
A value cluster is a range of values expressing similar moral concerns.
Values in a value cluster correspond to similar moral reasons for action or similar norms.
These values are typically articulated in response to similar morally problematic situations.
The notion of value cluster is relative to a particular domain or societal activity, such as cybersecurity.
Value Clusters in Cybersecurity
Security
A first value cluster in relation to cybersecurity is that of security.
Security can be understood in specific ways (individual security, national security).
Values closely related or instrumental to security include cybersecurity, information security, and the confidentiality, integrity, and availability of data.
This value cluster corresponds to the protection of humans and valuable entities against harm.
It addresses morally problematic situations like data breaches, loss of data integrity, cybercrime, and cyberwarfare.
Privacy
A second relevant value cluster is privacy.
This cluster contains privacy, moral autonomy, human dignity, identity, personhood, liberty, anonymity, and confidentiality.
Values in this cluster correspond to reasons and norms such as treating others with dignity, respecting moral autonomy, and obtaining informed consent for data use.
It addresses morally problematic situations like secret collection of personal data or unauthorized transfer of personal data.
Fairness
A third cluster is fairness.
This consists of values such as justice, fairness, equality, accessibility, freedom from bias, non-discrimination, democracy, and the protection of civil liberties.
This cluster responds to the unequal effects of cybersecurity threats or measures.
These values respond to the potential undermining of democracy or civil rights and liberties by cybersecurity measures.
It emphasizes treating people fairly and equally, and upholding democratic and civil rights.
Accountability
The fourth value cluster is accountability.
Values in this cluster include transparency, openness, and explainability.
This cluster is relevant because cybersecurity measures can harm others, requiring accountability.
Accountability is particularly relevant because cybersecurity measures often require balancing conflicting values.
Reasons related to accountability include the obligation to account for actions, being blamed for unjustified behavior, and paying damages for harm.
Security
Security can be understood as the state of computer systems being free from cyber threats.
Varieties of security include personal security, national security, and the security of businesses.
These different types of security often correspond to distinct values that may conflict.
The various security values belong to one value cluster because they fit the same general conceptualization of security.
They are responses to situations in which something valuable is threatened by an external danger.
They correspond to moral reasons for protecting what is of value against an external threat or danger.
General Terms of Security
Security is the state of being free from danger or threat.
The security of X from Y is the state of an entity X being free from danger or threat of kind Y.
X can refer to an individual agent, a person, or collective social entities.
X may also refer to a technical system, such as a computer system.
Y can refer to specific types of danger or threat.
Personal physical security: Y refers to physical dangers or threats.
National security: Y may refer to terrorist attacks or an invasion by a foreign country, but nowadays also to (foreign) cyberattacks.
Safety vs. Security
Safety is protection against accidental danger (e.g., a collapsing bridge).
Security is protection against intended harm (e.g., theft or a terrorist attack).
Cybersecurity refers to the protection of networks and information systems against human mistakes, natural disasters, technical failures, or malicious attacks.
This includes unintentional as well as intentional harm.
Negative vs. Positive Security
Security can be understood as the absence of danger or threat.
Personal security may also be understood as a certain peace of mind and the presence of preconditions for a meaningful and happy life.
Negative freedom (absence of interference) vs. positive freedom (presence of opportunities).
Here we adhere to the negative characterization of security, as that seems most important when it comes to cybersecurity.
Nevertheless, the positive aspect remains important for understanding the moral importance of security in certain contexts.
Instrumental and Intrinsic Values
Instrumental values are valuable because they contribute to something else that is valuable.
Intrinsic values are believed to be good in themselves.
Cybersecurity is in most cases best described as an instrumental value.
Computer systems are valuable because of their functions in society or for individuals and groups, and because of their economic value.
Computer systems may also be used for bad purposes, and, in such cases, cybersecurity may even be deemed undesirable.
Information Security
A value that is closely related to cybersecurity is information security.
This value is often understood in terms of the confidentiality, integrity and availability of information.
Confidentiality prevents unauthorized access to information, which is often essential in maintaining privacy.
The integrity and availability of information are instrumental for the purpose of the information system.
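The confidentiality and integrity components of this triad can be made concrete in a minimal sketch. The class and field names below (Record, authorized_readers) are invented for illustration; availability, the third component, concerns keeping the system responsive and is not modeled here.

```python
import hashlib

class Record:
    """A toy record illustrating confidentiality and integrity checks."""

    def __init__(self, content: str, authorized_readers: set[str]):
        self.content = content
        self.authorized_readers = authorized_readers
        # Integrity: store a digest of the content at write time.
        self.digest = hashlib.sha256(content.encode()).hexdigest()

    def read(self, reader: str) -> str:
        # Confidentiality: only authorized readers may access the content.
        if reader not in self.authorized_readers:
            raise PermissionError(f"{reader} is not authorized")
        # Integrity: verify the content still matches its stored digest.
        if hashlib.sha256(self.content.encode()).hexdigest() != self.digest:
            raise ValueError("integrity check failed: content was altered")
        return self.content

record = Record("patient heart-rate log", authorized_readers={"dr_adams"})
print(record.read("dr_adams"))  # authorized and unaltered: returns the content
```

An unauthorized reader, or a reader of content that was tampered with after writing, would trigger an exception, mirroring the idea that confidentiality prevents unauthorized access while integrity protects against alteration.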
Moral Importance of Cybersecurity
Cybersecurity threats to heart monitoring devices in hospitals or aviation systems may lead to a loss of human lives.
Cybersecurity is important for the protection of a large number of human and moral values.
For some contexts, cybersecurity may be a sine qua non for upholding other values with great moral importance, including values of personal security and health.
What is a means in one context may well be an end in another (and vice versa).
Personal Security
Several authors have argued that personal security is an intrinsic value.
Without some degree of personal security, individual people do not have a life at all, let alone a meaningful and happy one.
Some degree of security is required for individuals to live a good life.
One might object, however, that security is merely an enabling value, i.e. a value that is necessary for people to have a meaningful life and to acquire other values, but not intrinsically valuable: a life that merely consists of the absence of threat seems hardly worth living.
Collectivist notions of security such as national security or business and organisational security would seem to be instrumental values.
National Security
Discussions of national security may create a slippery slope, as they allow certain political groups to claim the moral importance of restrictive measures that in practice restrict individual values, including personal security, rather than support them.
Collectivist notions of security such as national security seem to derive their moral importance from how they eventually impact the security, but also other values such as privacy or liberty, of individuals rather than being intrinsically valuable.
Privacy
Privacy is generally seen as an important value in relation to cybersecurity.
Proposed understandings include such notions as “the right to be let alone”, “informational control”, an extension of personality and personhood and an act of self-care.
Privacy also has several dimensions.
Koops et al. (2017) distinguish between bodily, intellectual, spatial, decisional, communicational, associational, proprietary and behavioural privacy and view informational privacy as crosscutting through these categories.
Where cybersecurity is concerned, privacy is usually understood in informational terms.
Such informational privacy is about what information about a person is (not) known to, or shared with, others.
A further distinction is between notions of privacy stressing the confidentiality or secrecy of data (and information) and those stressing control over what data (or information) is shared with whom.
Confidentiality vs. Control
Confidentiality conception of privacy: to enhance privacy, it might be best not to collect and store personal data in the first place.
Control conception of privacy: the collecting, storing and sharing of data is not always problematic, rather privacy is about giving people control over the collection, storage and sharing of their own personal data.
Informed consent means that the collecting, storing and sharing of personal data require the deliberate and informed consent of the data subject.
People may thus also deliberately decide to share information about themselves with others.
For both the confidentiality and the control notion, privacy breaches may result from unauthorized access to data and, in this sense, cybersecurity is instrumental, if not crucial, to protecting privacy.
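The control conception, with its informed-consent requirement, can be sketched as a simple policy check: processing personal data for a given purpose is permitted only if the data subject has deliberately consented to it, and consent can be withdrawn. The names below (ConsentRegistry, may_process) are hypothetical, not a real API.

```python
class ConsentRegistry:
    """Toy sketch of the control conception of privacy:
    data about a person may be processed for a purpose only
    with that person's informed consent."""

    def __init__(self):
        # (subject, purpose) -> consent currently given?
        self._consents: dict[tuple[str, str], bool] = {}

    def give_consent(self, subject: str, purpose: str) -> None:
        self._consents[(subject, purpose)] = True

    def withdraw_consent(self, subject: str, purpose: str) -> None:
        self._consents[(subject, purpose)] = False

    def may_process(self, subject: str, purpose: str) -> bool:
        # Default is refusal: absence of consent means no permission.
        return self._consents.get((subject, purpose), False)

registry = ConsentRegistry()
registry.give_consent("alice", "share_with_research_partner")
print(registry.may_process("alice", "share_with_research_partner"))  # True
print(registry.may_process("alice", "targeted_advertising"))         # False
```

Under the confidentiality conception, by contrast, the default response would be not to collect the data at all rather than to gate its use on consent.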
Contextual Integrity
What information is appropriate to share with whom may not only be dependent on the autonomous choices of individuals (as the control notion of privacy stresses) but also be different for various social spheres.
This idea is captured in the notion of privacy as contextual integrity (Nissenbaum 2004).
Privacy as Intrinsic vs. Instrumental Value
Some authors have argued that privacy is an intrinsic value, whereas others see it primarily as an instrumental one.
Those who tend to see it as an intrinsic value may point out that some degree of privacy is indispensable for (moral) autonomy.
Those who conceive of privacy as an instrumental value may object that what is valued here is not so much privacy in itself but rather what it allows or enables.
We might therefore conceive of privacy as an enabling value, i.e. as a value that is necessary as a precondition for a good life, but one that is not necessarily itself intrinsically valuable.
Reductionist vs. Non-Reductionist Accounts of Privacy
According to reductionist accounts, the moral importance of privacy is based on other values such as autonomy, human dignity and liberty.
Conversely, non-reductionists do not need to deny that privacy is related to a range of other values and part of a broader value cluster as we have called it, but they at least maintain that the value of privacy articulates moral considerations and corresponds to moral reasons that cannot, or at least cannot fully, be expressed by other values.
Van den Hoven, for example, has argued that privacy derives its moral importance from four types of moral considerations:
(1) prevention of information-based harm,
(2) prevention of informational inequality,
(3) prevention of informational injustice, and
(4) respect for moral autonomy
It is often helpful to unpack the other values and reasons that are implied when the value of privacy is articulated in concrete situations and debates.
This is so because it is frequently the case that what is at stake in such situations is not just the threat of unauthorized access to personal data but rather a range of broader moral concerns related to such values as autonomy, identity and liberty.
As indicated before, the value cluster of privacy also contains such values as moral autonomy, human dignity, identity, personhood, liberty, anonymity and confidentiality.
Some of these values have a more justificatory relationship to privacy, i.e. they articulate why privacy is morally important (such as moral autonomy, human dignity, identity, personhood and liberty), whereas others (such as anonymity, confidentiality and control) seem more instrumental to preserving privacy.
Conceptions of Privacy
There is a mutual relationship between how privacy is exactly understood and conceptualised and what other values are (more closely) related to it.
It has been argued that in the US context, privacy is mainly understood (and laid down in laws) in relation to liberty, and in particular in relation to moral concerns about government infringements in the personal life sphere of citizens.
This is contrasted with the European, primarily French and German, tradition, in which privacy is more closely linked to human dignity and which stresses the relationship between people, so that privacy is also a concern between individuals, or between individuals and companies, rather than only between citizens and the state.
Arguably, in the current age of information systems and big data, both conceptions are important when it comes to privacy concerns.
Fairness
The third value cluster relevant to cybersecurity is that of fairness.
This is a relevant value because both cybersecurity threats and measures to increase cybersecurity impact people differently, which may raise fairness issues.
This is connected to a range of other values such as equality, justice, non-discrimination and freedom from bias.
In addition, democracy is a relevant value because some cybersecurity measures may be so consequential and invasive that they require democratic legitimation rather than being left to the authority of private actors such as companies.
Justice and fairness are important values because cybersecurity measures typically come with costs and benefits that may be unequally distributed across the various actors involved.
Parts of these costs and benefits are financial and economic in nature, and a first question that will therefore arise is whether a certain proposed cybersecurity measure is worth the cost.
Strictly speaking, this is more a question about efficiency (i.e. the ratio between benefits and costs) than a question of justice and fairness (i.e. the distribution of costs and benefits).
The fact that costs and benefits are usually not equally distributed implies that even if from a societal point of view it is efficient or cost-effective to take certain cybersecurity measures, it is possible that for none of the actors involved are such measures also individually cost-effective.
Cost Externalization
As long as the costs (and other harm) due to cyberattacks can be externalised (for example, to the users of its services), it may not be cost-effective for a company to take certain cybersecurity measures.
However, such externalisation of costs may be considered unfair, which in turn may lead to the introduction of a legal obligation (by the government) for the company to compensate its customers for damages due to avoidable cybersecurity breaches.
Fairness and justice considerations do not only apply to distributional effects but may also imply that people have a right to some minimal level of information access or even access to ICT services.
Given the crucial importance of information, and also of certain ICT services, in today’s society, we may question whether access to such goods and services should not become a basic right.
Cybersecurity and Human Rights
Perhaps, now or in the future, we should grant everybody the right to affordable, secure and accessible ICT services.
If such rights were introduced, it would also have implications for the minimal level of cybersecurity that should be guaranteed for everybody.
Fairness and justice may require impartiality but they would not seem to require that people are always or necessarily treated equally.
Non-discrimination may be a particularly important value for cybersecurity because it is known that ICT technologies may be vulnerable to bias, i.e. they may unjustifiably treat people differently on the basis of, for example, gender, race or marital status.
Such bias may be intentional, but it is often the unintended result of how such systems are designed and used.
Sources of Bias
Friedman and Nissenbaum (1996) discuss three sources of such bias:
pre-existing bias in human practices, institutions, and attitudes that is reified in computer systems;
technical bias (resulting from technical requirements and constraints);
emergent bias that arises from the use of the system (e.g. use in another context than originally foreseen).
The increased use of big data and of self-learning algorithms has further increased the problem of bias.
Algorithmic bias may, in particular, result when algorithms are trained with biased data sets, or on a limited group of people or cases.
Large-scale data collection for cybersecurity, therefore, is likely to also be vulnerable to bias if non-discrimination is not from the start considered in the design, training and use of relevant algorithms.
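How training on biased data reproduces bias can be shown with a deliberately simple sketch: a toy "threat flagging" rule learns the historical flag rate per group and then flags everyone from groups that were over-flagged in the past, regardless of individual behaviour. The data and group labels are invented for illustration.

```python
from collections import defaultdict

# Historical records: (group, was_flagged). Group B was over-flagged in the past.
history = [("A", False)] * 90 + [("A", True)] * 10 + \
          [("B", False)] * 50 + [("B", True)] * 50

def train_flag_rates(records):
    """Learn the historical flag rate per group from the training data."""
    counts = defaultdict(lambda: [0, 0])  # group -> [flagged, total]
    for group, flagged in records:
        counts[group][0] += int(flagged)
        counts[group][1] += 1
    return {g: flagged / total for g, (flagged, total) in counts.items()}

rates = train_flag_rates(history)

def flag(group, threshold=0.3):
    # The learned rule flags anyone from a group whose historical flag
    # rate exceeds the threshold, irrespective of individual behaviour.
    return rates[group] > threshold

print(flag("A"))  # False: group A's historical flag rate is 0.10
print(flag("B"))  # True: group B's historical flag rate is 0.50
```

The pre-existing bias in the historical data is reified in the learned rule, which is why non-discrimination must be considered from the start in the design, training and use of such algorithms.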
Democracy
The value of democracy is relevant to cybersecurity in a number of ways.
Cyberattacks may undermine the democratic process, as suggested by the 2016 US presidential elections, which witnessed the hacking of the Democratic Party, trolling and the spread of fake news.
It has also been suggested that cybersecurity measures, such as end-to-end encryption, may protect democratic liberties such as freedom of speech.
However, cybersecurity measures may occasionally also undermine democracy. A particular concern is the strategic use of cybersecurity by national governments for national security aims.
One is that it may undermine the civil liberties of citizens.
Second, because such use is by its nature often secretive, there may be a lack of democratic legitimacy.
A further concern is that government agencies that find cybersecurity weaknesses may strategically keep these secret in order to use them against other countries (or even against their own population).
This also raises fairness concerns, because those societal actors then have to bear the costs of cybersecurity threats that have not been revealed by government agencies.
Accountability
The value of accountability (and related values such as transparency, openness and explainability) is particularly relevant to cybersecurity in two types of situations.
One concerns situations in which someone (allegedly) harms someone else, or infringes on the rights of that person. In such situations, we typically hold the (alleged) perpetrator accountable.
The other concerns situations in which there is a power imbalance between two agents and in which the more powerful one is in a position to introduce rules or measures that may harm the less powerful.
For example, governments and companies may be accountable to citizens and consumers for what cybersecurity measures they take even if there is not (yet) a suspicion of undue harm.
Responsibility
In the first type of situation, accountability is closely related to responsibility and its different meanings, such as blameworthiness, liability and obligation-responsibility.
An agent may be said to be accountable if there is a reasonable suspicion that that agent did something wrong or caused undue harm.
Accountability here implies an obligation to account for one’s actions and their consequences.
What sets the second type of situation apart from the first is that there is not (yet) a reasonable suspicion of wrongdoing.
Rather, the need for accountability is based on power imbalances.
Although such power imbalances exist in any society, they seem to be aggravated in today’s information society by the unequal access to large amounts of data and information.
Moreover, citizens and consumers seem increasingly dependent on government and large commercial organisations for the secure storage of (personal) data.
This would seem to imply that such powerful organisations are accountable for what cybersecurity measures they take.
Such accountability would imply some degree of transparency about what cybersecurity measures are taken.
In addition to such transparency, it would also imply a willingness and ability to account for the decisions on which such measures are based.
Accountability here implies a certain traceability of how decisions are made but also the articulation of the reasons and motivations underlying such decisions.
Value Conflicts in Cybersecurity
It is often said that some of the values relevant to cybersecurity are in conflict with each other.
The most frequently mentioned conflict is that between security and privacy, but this is certainly not the only possible value conflict in the domain of cybersecurity.
Moreover, as already indicated in the introduction, it is not the case that (cyber)security and privacy are always in conflict.
Conflict Definition
What does it mean to say that two values are conflicting? If values are varieties of goodness and are used for (moral) evaluation, then one interpretation of a value conflict is that two (or more) values are conflicting if (and only if) they provide opposite or contradictory evaluations of the same state-of-affairs (or object or policy).
Therefore, if something is evaluated as good on the basis of one of the values it should, by definition, be bad on the basis of the other value.
In cybersecurity, the values of transparency (or openness) versus confidentiality may provide an example.
What is transparent is not confidential, and vice versa.
Practical Implications
Such value conflicts that seem to derive from oppositions at the semantic level of values are, however, relatively rare.
More often, value conflicts seem to derive from the practical implications of values.
Under this interpretation, values conflict if they express or correspond to contradictory norms or reasons for actions.
For example, if a value such as privacy would require that a certain piece of information is kept confidential, whereas transparency would require that same piece of information to be made public, then the values of privacy and transparency are conflicting.
Interpretation and Judgment
It should be noted that the question of to which reasons a value corresponds is one of interpretation and judgment, and depends both on the value at stake and the specific context.
More specifically, it depends on how the values at stake are conceptualised and specified.
Conceptualisation is “the providing of a definition, analysis or description of a value that clarifies its meaning and often its applicability” (Van de Poel 2013: 261).
For example, privacy may be conceptualized in terms of confidentiality as well as in terms of control over information.
Moreover, whether values conflict will also depend on their specification. Specification may be understood as the translation of values into more specific norms and requirements.
Specification of Values
If privacy is conceptualised in terms of confidentiality, a specification would further specify what (personal) information should exactly stay confidential, and to whom.
This means that on some specifications of privacy as confidentiality, privacy and transparency would conflict whereas on other specifications, the values would not conflict.
Values are conflicting for a particular X, in context C, if it is practically impossible to respond properly to all values that are relevant to X in context C simultaneously.
Here X can be a state-of-affairs but also (and more relevant to the current discussion) a certain (technical or institutional) cybersecurity measure.
For example, for a particular cybersecurity policy it may turn out to be impossible to respect the value of privacy, where respecting is the proper response to that value.
Policy Conflicts
If X is a cybersecurity policy (or measure), the natural response to such value conflicts may be to look for another policy, or measure, that does properly respond to all relevant values.
Van den Hoven, Lokhorst, and Van de Poel (2012) argue that in such situations of value conflict (or a moral dilemma), there is a second-order obligation to look for options that help to avoid the value conflict, now or in the future.
This may be done through technical or institutional innovation or design, as such innovation or design may extend what is feasible and so allow options that overcome the initial value conflict.
Value Conflict Definition
A value conflict exists when:
(1) a choice has to be made between at least two options for which at least two values are relevant as choice criteria;
(2) at least two different values select at least two different options as best; and
(3) there is no single value that trumps all others as choice criterion (if one value trumps another, any (small) amount of the first value is worth more than any (large) amount of the second value).
It is this type of value conflict that we are concerned with.
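These conditions can be made concrete in a small sketch: each option is scored on each value, each value "selects as best" the option(s) maximising it, and (assuming no value trumps the others) a conflict obtains when no single option is best by every value. The policy names and scores below are hypothetical.

```python
def best_options(scores: dict[str, dict[str, float]], value: str) -> set[str]:
    """Return the options that maximise the given value."""
    top = max(s[value] for s in scores.values())
    return {opt for opt, s in scores.items() if s[value] == top}

def values_conflict(scores: dict[str, dict[str, float]]) -> bool:
    """Conflict: at least two values select different options as best,
    so no single option is optimal for every value."""
    values = next(iter(scores.values())).keys()
    bests = [best_options(scores, v) for v in values]
    return set.intersection(*bests) == set()

# Hypothetical scores for two cybersecurity policies on two values.
policies = {
    "deep_packet_inspection": {"security": 0.9, "privacy": 0.2},
    "end_to_end_encryption":  {"security": 0.6, "privacy": 0.9},
}
print(values_conflict(policies))  # True: each value selects a different policy
```

If one policy were best on both security and privacy, the intersection would be non-empty and no conflict would be reported, which matches condition (2) above; condition (3), the absence of trumping, is assumed rather than modeled here.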