4: Death of Privacy

Ethics

  • Ethics: a system of principles guiding decisions and behaviours

    • right vs. wrong; moral reasoning; universal human goodness.

    • humans as moral agents

    • Dalai Lama: survival is based on universal human goodness.

      • religion can inspire ethical behaviour, but these are behaviours that anyone can exhibit.

  • Information ethics: relationship between creation, organization, dissemination, and use of information and the ethical standards and moral codes governing human conduct in society.

    • An ethical framework is critical as decisions can be based on corporate interests instead of morality.

  • New information technology creates new ethical considerations:

    • creates social change but also threatens existing distributions of power, money, rights and obligations

    • creates social progress but also crime (cyber crime)

    • produces benefits for some and costs for others.

      • digital divide: gaps between rural and urban areas, and between Canada and other nations. Those excluded cannot participate in the information economy, which affects their access to information, their autonomy and their health.

GM Ignition Coverup

  • GM covered up faulty ignition switches, leading to 120+ deaths. Company controlled the flow of information — deciding who could see reports and how incidents were categorized.

    • Information systems can be instrumental in fraud: here they were used to perpetuate harm by concealing information, an example of corporate information control.

  • Information ethics has become an area of great interest:

    • 1997: UNESCO congress on the Ethical, Legal and Societal Aspects of Digital Information

      • drew attention to information ethics and highlighted that it is a universal human problem.

    • 2003: World Summit on the Information Society

      • The UN came together to discuss the digital divide: how can we work towards an inclusive information society? Internet access should be a basic human right.

    • 2007: First African Conference on Information Ethics

      • ensure that African perspectives and values were included in the international dialogue on information ethics.

Modern Information Ethics Issues in Tech

  • Pre-ITS (information technology and systems), laws and expectations of conduct were well known and widely followed.

    • New information technologies and systems have created ripples in modern society: new situations arise that old rules do not cover, and laws and politics have not yet worked out how to respond.

      • What are the expectations? What counts as social responsibility? Which rules are approved? It takes years to develop responses, and real harm may have to occur first. In the meantime, everyone lives in a destabilized, grey-area society.

  • ITS both affects and is affected by individual, social, political and ethical life.

    • information rights and obligations

      • what information are individuals and organizations entitled to, and responsible for?

    • property rights and obligations

      • what traditional property rights should we protect in digital society? intellectual property rights?

    • system quality

      • what standards of data and system quality should we demand to protect individual rights and the safety of society?

    • accountability and control

      • who is responsible for the harm done by information systems? 

    • quality of life

      • what values should we preserve in an information and knowledge-based economy? social values and cultural effects of technology.

  • Technology trends

    • increasing computing power: society becomes more reliant upon technology and information systems, which makes us more vulnerable. Hacking has immediate effects, e.g. the 2016 UofC ransomware attack, or attacks on retail.

    • declining cloud storage costs: more detailed records can be kept more cost-effectively and retained longer. This raises privacy questions.

    • data analysis advancement: you are constantly generating information about yourself. Organizations obtain this information to create profiles and target individuals (data mining, machine learning, data brokers, AI), raising questions of consent, surveillance and privacy.

  • Institutions, organizations and individuals choose to use technology, and are therefore responsible for its consequences.

    • In an ethical democracy, individuals can recover damages done to them through laws and due process.

    • Must exhibit social responsibility in the digital world.

Privacy

Richard Mason PAPA Framework (TEST)

Four ethical issues of the information age, developed in 1986. The basis for information technology ethics studies.

  • Privacy: pertains to the boundaries of others and oneself, public and private lives. What information should a person be forced to reveal about themselves? How is that information safeguarded?

  • Accuracy: who is responsible for ensuring that information is authentic and correct?

  • Property: who owns information? how are people compensated?

  • Accessibility: how can we ensure equal access to information?

IT must be used to enhance human dignity. A new social contract is necessary, with the fate of future generations at stake.

A 2006 follow-up study determined that privacy was the most critical issue of the 21st century; all four ethical issues remained relevant.

Luxemburg Privacy Video

  • “Something I’m willing to do if no one is watching”

  • Snowden’s NSA leak: mass surveillance. → is there any real harm?

  • The “nothing to hide, nothing to fear” argument. The surveillance society.

  • Privacy: what people say vs. what they actually do.

  • When people are being watched, their behaviours change. They conform because of shame. What expectations do other people have for us?

  • The idea of the panopticon tower: because people could be watching us at any time, we have to assume they always are.

  • Surveillance states where every sound you made was overheard and every movement scrutinized.

  • Reminds me of Fahrenheit 451. Surveillance renders certain behavioural choices off-limits.

  • Subjectivity of “bad persons”

  • How are dissidents treated in a society?

History of Privacy

  • Ancient Roots: concept traced back to Roman law and biblical texts

  • 18th century US: focus on freedom from physical intrusion, reflected in the Fourth Amendment.

  • 20th century: expanded to decisional privacy: the ability to make decisions without state interference.

  • 21st century: shift towards information privacy, rise of global data protection laws (e.g. GDPR)

  • Privacy underpins freedom of expression (UN Universal Declaration of Human Rights), beliefs and activities.

  • Vital for the development and maintenance of autonomy, dignity, civil rights, democratic participation, liberties and freedoms.

    • It’s different than tangible possessions or valuables. We don’t own it — it’s intrinsic.

Radical changes in IT have led to extreme quantities of individual information being collected, stored and analyzed for new uses. This explosion of information has made consumers into producers of highly personal data. Where are the boundaries between the physical and the digital?

  • Trails are now being generated and captured. Personal and professional activities are being traced.

  • Gatekeeper companies are in control. Links and tracks behaviours of billions of users. Actions, interests, desires and attentions are being collected by third parties such as advertisers. Usually, we don’t know or consent. 

    • This collected data is highly valuable economically. The most valuable resource in our information economy. Becomes a business asset that is used to target individuals. Age? Income? Gender? Ads which they interact with?

Invasive practices: face scans, drug testing, traffic surveillance, GPS tracking of rental cars. Biometric datapoints.

Threats to Privacy

  • Data-gathering: collect and record personal information, e.g. the census

    • Modern collection differs from older practice in scale, speed and invisibility: data is collected continuously and automatically, often without knowledge or consent.

  • Data-exchanging: transfer and exchange personal data between computer databases, again, without knowledge and consent.

  • Data-mining: analyzing large datasets to create consumer profiles and predict behaviour.

  • Older generations have higher privacy concerns; younger generations are more willing to post information online. Privacy has universal appeal, but societies differ in how they value individual privacy.

    • Westin: Western societies place high value on individual privacy, whereas China has placed less value on individual privacy and more on collective social values and security.

    • It’s difficult to derive universal agreement on privacy safeguarding.
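The data-mining threat above can be made concrete with a minimal sketch. All names, users and categories here are invented for illustration; real data brokers operate at vastly larger scale, but the basic move is the same: aggregate behavioural records into a per-person profile that can then be used for targeting, usually without the person's knowledge or consent.

```python
# Hypothetical sketch of data-mining for consumer profiling.
# The records and categories are invented; only the technique is real.
from collections import Counter

purchases = [
    {"user": "u1", "category": "baby"},
    {"user": "u1", "category": "baby"},
    {"user": "u1", "category": "grocery"},
    {"user": "u2", "category": "electronics"},
]

def build_profile(records, user):
    """Count category frequencies for one user: a crude interest profile."""
    counts = Counter(r["category"] for r in records if r["user"] == user)
    return counts.most_common()  # categories sorted by frequency

# The top category can then drive ad targeting, the consent and
# surveillance concern raised in these notes.
profile = build_profile(purchases, "u1")
print(profile[0][0])  # → 'baby'
```

Even this toy version shows why scale matters: each individual record seems harmless, but aggregated records reveal life circumstances the person never chose to disclose.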

In North America:

  • concern exists about how tech invades privacy.

  • Regan: we can’t conceptualize the idea of privacy in a way that sustains public interest and support. We don’t have a shared definition of what privacy is.

    • we don’t understand its value

    • we can’t create strong privacy laws

    • governments and companies can justify weakening privacy.

  • DeCew: a shift exists between looking at privacy as a right, towards a cost-benefit reasoning. We give up our privacy for benefits like societal safety and convenience.

  • Shifts in privacy interpretation:

    • Third Party Doctrine: no reasonable expectation of privacy for information you voluntarily share with a third party.

    • Notice and Consent Regimes: click and agree privacy policies

    • 9/11: if you’ve done nothing wrong, you have nothing to hide. Give up your privacy for increased safety. Accelerated cost-benefit logic.

Privacy then and now:

  • In the 1990s, governments and ISPs viewed privacy as a fundamental human right that needed to be regulated.

  • Now, privacy is an individual burden: you should have known better! But this framing assumes prior knowledge, oversimplifies privacy issues, has enabled mass surveillance, and creates problems with privacy policies.

Protecting Privacy

  • Can you meaningfully opt out of mass surveillance? Sorj: technology is a requirement for citizenship.

  • We can limit the personal info we share online.

  • We can educate others.

  • Support stronger privacy laws.

  • Tell companies to respect privacy or lose business.

Digital citizenship: encompasses all aspects of online life, including ethical conduct and engagement in protecting private information.