Avoiding Impediments to Ethical Engineering Practice
Absence of a Culture of Ethics
- What is Culture?
- Customs, arts, social institutions, and achievements of a particular country.
- Shared attitudes, values, goals, practices.
- In biology, a culture is a cultivated environment that enables the growth and maintenance of an organism.
- In social sciences, culture might focus on interpersonal relationships and how people learn to function in these relationships.
- In visual and performing arts, culture includes the forms used to convey different ideas and messages.
- All of these definitions and ideas have applicability to engineering ethical culture.
- Relationship to Basic Engineering Ethical Principles
- Engineers shall hold paramount the safety, health, and welfare of the public.
- Engineers shall issue public statements only in an objective and truthful manner.
- Engineers shall act for each employer or client as faithful agents or trustees.
- Engineers shall perform services only in the areas of their competence.
- Engineers shall uphold and enhance the honor, integrity, and dignity of the profession.
- A culture of engineering ethics supports all of these principles by creating an environment where they are recognized, prioritized, and made real through attitudes and actions.
- What is an Engineering Ethics Culture?
- Focus on health, safety, and well-being.
- Open communication highlighting issues.
- No fear of retribution or retaliation.
- Professional: Individuals can make suggestions without getting personal.
- What is NOT an Ethics Culture?
- Prioritizing other issues above engineering ethical requirements
- Example: The night before the Space Shuttle Challenger disaster, a Morton-Thiokol executive asked his head of engineering “To take off his engineering hat and to put on his management hat” – i.e., prioritize the business concern to please a customer (NASA) over safety.
- Intimidating or punishing engineers who make critical statements or decisions on ethical issues
- Example: Roger Boisjoly was an engineer at Morton-Thiokol who had tried on many occasions before the Challenger disaster to alert NASA to the hazards. His records and testimony were crucial to determining the cause of the disaster in the ensuing investigation. However, the company refused to allow him to participate in the redesign of the rockets, and he was eventually forced out of the company.
- How to create an ethical culture?
- Set priorities of ethics above other issues and ensure everyone understands these priorities.
- Encourage communication, including critical questions, without fear of retaliation.
- Treat ethical concern as a positive aspect of the organization, not a negative one.
Confusion of Responsibility Versus Accountability
- What exactly do we mean by “Responsibility” and “Accountability”?
- Often treated as synonyms, and dictionary definitions usually reference each other.
- “Responsibility” refers to the explicit expectation that a person will perform a task. It is imposed externally on the individual by some authority.
- “Accountability” is the assumption of consequences for actions and judgments that occur in your organization. It is internally generated and is a key component of leadership.
- Responsibility can be delegated, but accountability cannot.
- Relationship to Basic Engineering Ethical Principles
- Engineers shall hold paramount the safety, health, and welfare of the public.
- Accountable engineers realize that even small tasks are part of bigger projects with big potential impacts on the general public.
- Engineers shall act for each employer or client as faithful agents or trustees.
- Responsible engineers do what the client or employer tells them to do. Accountable engineers also think about the bigger picture of what’s best for the client or employer.
- Engineers shall uphold and enhance the honor, integrity, and dignity of the profession.
- Avoiding accountability and hiding behind “I fulfilled my responsibility” demeans the profession and encourages others to regard engineers as “just another group of workers.”
- What does this look like in engineering practice?
- Example 1
- Each and every member of an engineering team completes his/her assigned tasks on time, but the team doesn’t deliver the project on time because a subcontractor wasn’t informed of schedule changes. The failure to communicate with the subcontractor could be due to poor management procedures in place, a responsible team member leaving in the middle of the project, an internal issue with the subcontractor, or some other reason. However, no one on the team ensured that necessary communications were happening (regardless of whether it was their job or not). Everyone on the team felt responsible, but no one felt accountable.
- Example 2 (a real case)
- A company makes a bestselling safety harness for high-rise building window washers. The company’s chief engineer notices in some field visits that many workers, although legally required to wear the harnesses, remove them frequently because they interfere with quick raising and lowering of the scaffold. The engineer institutes a design revision process (which she will oversee) to make the harness more user-friendly. This engineer is both responsible and accountable.
- Example 3
- An engineering manager assumes the lead for a project started 3 years ago. After about 6 months he realizes that fundamental data needed for the project to succeed was never developed by the previous manager. Although it will delay completion and greatly reduce profitability (and his chance for a bonus and promotion soon), he resets the project timeline to wait on data development. This engineer is not responsible for the prior poor management, but he is now both responsible and accountable for the project’s ultimate success.
- What can you do to prevent negative consequences in engineering practice?
- Remember that accountability is internally determined – Don’t just assume that a person accepts accountability. He/she may not realize that others expect it of him/her.
- Communicate in your organization so that everyone knows who is accountable – Once this is known, others better understand requests for help that might seem outside this person’s responsibility.
- Know when to and when not to assume accountability – Every effort needs both leaders and followers, and you will act in both roles. Your career will include opportunities to do both, but these depend on experience, knowledge, reputation, networks, and other factors.
- Remember that you may not be directly accountable, but you can always help the person who is.
Communication Failures
- What exactly do we mean by a “communication failure”?
- Important information relevant to an organization’s ethical obligations or overall performance is known in one part of the organization but does not appropriately influence decisions in another part of the organization.
- “X” knows something important for “Y’s” decision but does not or cannot bring it to Y’s attention.
- Relationship to Basic Engineering Ethical Principles
- Engineers shall hold paramount the safety, health, and welfare of the public.
- Failure to get relevant information to decision-makers can lead to death, serious injury or illness, or large economic losses.
- Engineers shall issue public statements only in an objective and truthful manner.
- Implication that we don’t withhold or fail to deliver important information even if it’s “bad news.”
- Engineers shall act for each employer or client as faithful agents or trustees.
- Communication should ensure that your employer’s/client’s best interests are served and harm is avoided.
- How do communication failures happen in engineering practice?
- Suppression of open communication
- Bad news, criticism, questions, and information outside expectations are regarded as negatives by the organization.
- Example: Leading up to Volkswagen’s emissions scandal, both its chairman and its CEO “created [a]… culture in which dissent and criticism weren’t tolerated…. ‘Workers and managers are afraid to speak the truth…. Dissenting opinions are at best ignored and at worst suppressed.’”
- Lack of critical links
- Organizational structure does not allow or facilitate communication between persons having important information and persons needing it.
- Example: 1994 Moura, Australia, mine disaster (11 deaths), where investigation found “There appeared to be no one who was a single and responsible recipient of a series of apparently disconnected but vital pieces of information.”
- Message attenuation
- Information is passed repeatedly between persons with changes each time, typically diminishing important aspects of the message.
- Example: Investigation into Space Shuttle Columbia accident found: “The Mission Management Team Chairʼs position in the hierarchy governed what information she would or would not receive. Information was lost as it traveled up the hierarchy…. The uncertainties and assumptions that signaled danger dropped out of the information chain…”
- Jargon, semantic mismatch, and other linguistic problems
- Senders and receivers of communication don’t “speak the same language.” Engineers tend to use abbreviations and precise technical terms that others don’t use.
- Example: Terms like “100 year flood” used in floodplain mapping and stormwater engineering are often misunderstood by the public. Replacing this with “special flood hazard area” does little to help.
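- The misunderstanding is quantitative: a “100 year flood” is an event with a 1% chance of occurring in any given year, not a flood that arrives once per century. A minimal sketch (the function name here is illustrative, not from any standard library) of why the term misleads:

```python
# A "100-year flood" has a 1% annual exceedance probability.
# The chance of experiencing at least one such flood over n years
# is 1 - (1 - p)^n, which is far from negligible over a few decades.
def prob_at_least_one_flood(annual_prob: float, years: int) -> float:
    """Probability of at least one exceedance event over a span of years."""
    return 1.0 - (1.0 - annual_prob) ** years

# Over a typical 30-year mortgage, the odds are roughly 1 in 4:
print(f"{prob_at_least_one_flood(0.01, 30):.1%}")  # → 26.0%
```

A homeowner who hears “100 year flood” and assumes the hazard is a once-a-lifetime event has absorbed exactly the wrong message, which is why careful phrasing of the annual probability matters more than renaming the zone.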
- What can you do to prevent negative consequences of “communication failure” in engineering practice?
- Create a culture of open communication
- Don’t penalize anyone for sharing bad news or asking critical questions
- Assess what communication channels are needed and establish them
- Implement redundant links to avoid single points of failure
- Be an active listener
- Practice message verification (a.k.a. “parroting back”) to ensure that information is being passed on reliably. Upon receiving an important message, respond to the speaker, “What I think you’re saying is ____. Is that right?” When answering a question, end your answer by asking back, “Does that answer your question?”
- Encourage everyone to say “I don’t understand” when needed.
- Jargon-proof your exchanges by having listeners and readers feel comfortable asking for clarification
- Recognize communication’s overlap with issues such as culture, accountability, bias, and normalization of deviance.
Normalization of Deviance
- What is “normalization of deviance”?
- In her book, The Challenger Launch Decision, Dr. Diane Vaughan defines the “normalization of deviance” as: “The gradual process through which unacceptable practice or standards become acceptable. As the deviant behavior is repeated without catastrophic results, it becomes the social norm for the organization.”
- Why might this occur within a technical organization?
- Technical issues become accepted within the organization because they occur without consequences.
- The motivation to understand the root cause of the issue, which can incur both financial and schedule costs, wanes because it is not viewed as presenting a significant risk.
- Cultural differences may exist between engineering and operations teams and the safety organization.
- The former views anomalies in the nominal context in which they generally occur, while the latter is trained to evaluate the potential worst-case outcome of the anomaly.
- What are some examples of this occurring?
- Space Shuttle Challenger Disaster
- O-rings sealing joints in the solid rocket boosters were never intended to be damaged by high-temperature gas ejections.
- After several launches where such damage occurred without impairing operations, engineers and managers gradually began to regard this as acceptable.
- The January 28, 1986, launch experienced catastrophic O-ring failure, resulting in an explosion and the loss of all 7 astronauts aboard.
- Crash of ValuJet Flight 592
- May 11, 1996, crash of DC-9 aircraft 10 minutes after takeoff from Miami. Death of all 110 aboard.
- Crash caused by cargo compartment fire started by chemical oxygen generators.
- Oxygen generators were improperly packed and labeled by a maintenance contractor. Investigation found that confusion over complex procedures and complacency (driven by no observed problems) led to “pencil-whipping” (falsification) of records and dangerous practices.
- Space Shuttle Columbia Disaster
- Debris, including insulating foam from the External Tank, was liberated during almost all previous Space Shuttle launches.
- Over time, foam losses from the External Tank became an “accepted risk” despite a lack of any requirement for the Orbiters to be tolerant of debris impacts.
- On the morning of January 16, 2003, during the ascent phase of the STS-107 mission, insulating foam separated from the External Tank at ~81 seconds and struck the lower half of the Orbiter left wing.
- Although there was clear video and photographic evidence that debris had struck Columbia, there was no direct evidence of the extent of the resulting damage to the left wing.
- Based on an erroneous computational model, NASA’s Debris Assessment Team concluded some localized heating damage would most likely occur during entry, but they could not definitively state that structural damage would result.
- During Columbia’s entry on February 1, 2003, a large breach in the left wing structure, caused by the impact with the foam during ascent, allowed superheated air to penetrate the wing.
- With a weakened wing structure, the aerodynamic forces during entry caused the Orbiter to disintegrate, resulting in the loss of the seven crewmembers.
- Relationship to Basic Engineering Ethical Principles
- Engineers shall hold paramount the safety, health, and welfare of the public.
- Normalization of deviance often allows hazardous conditions to be accepted as “allowable” or “normal”.
- Engineers shall issue public statements only in an objective and truthful manner.
- The normalization process leads to engineers losing their objectivity. Information or performance that is objectively “out-of-bounds” is ignored or treated inappropriately.
- Engineers shall perform services only in the areas of their competence.
- Normalization of deviance can appear to justify engineering work in an unfamiliar setting. A “nothing will go wrong” attitude rationalizes decisions that are not competent.
- How can you prevent normalization of deviance?
- Prevent “groupthink”; know and avoid its symptoms.
- Proactively solicit alternate positions.
- Never use past success as the sole justification to redefine acceptable system performance.
- Adopt management strategies requiring engineers to “prove systems are safe and reliable” rather than “prove they are unsafe or unreliable.”
- Employ a rigorous systems engineering process. Be extremely vigilant in monitoring system performance and trends.
- Encourage/facilitate critical peer reviews of engineering analysis and test results.
- Ensure unimpeded information flow from engineers to decision-makers.
Bias Affecting Judgment
- What is “Bias”?
- Attitudes, stereotypes, and habits of thinking that affect our understanding, actions, and decision-making in an unconscious manner.
- There are many, many, many kinds of bias: implicit, confirmation, anchoring, overconfidence, groupthink, halo effect, sunk cost, framing, intertemporal, loss aversion, and many others.
- Scientists who study biases have found that they are tied to coping mechanisms that people adopt to help deal with complex flows of information. Instead of following a complete review of all available data and a comprehensive rational decision-making process, we take “shortcuts” (a.k.a., heuristics) that are vulnerable to bias.
- What does bias look like in engineering practice?
- Implicit bias
- An engineer considers a person and/or the work and information provided by him to be unduly positive or negative based on an unrelated personal characteristic: gender, age, education, wealth, race, etc.
- Confirmation bias
- An engineer believes that a condition exists and accepts information that supports the belief but rejects disproving information.
- Anchoring bias
- The first quantity or quality that an engineer hears (even if it has no justification) becomes an assumed value, and it becomes difficult to believe that reality is far from that “anchor” value.
- Overconfidence bias
- An engineer gains some experience with a situation and then believes she understands it better than she really does. The possibility of the situation departing from past experience is downplayed.
- Groupthink bias
- An engineer learns that several others believe something and becomes hesitant to challenge that belief – if the group believes it, then it must be true.
- Framing bias
- An engineer’s tendency to answer a question is affected by how the question is framed. It’s natural to respond differently to “Should we confirm this design parameter to ensure user safety?” versus “Should we force the project to go further over-budget and behind schedule to consider a low-probability scenario?”
- Intertemporal bias
- Greater weight is given to short-term issues than long-term ones. This can apply to both possible failures and possible rewards.
- Loss aversion bias
- An engineer considers it more impactful to lose something she considers “in hand” than to gain the same net quantity from nothing. For example, believing that it’s worse to state a project will be delivered in 4 weeks and be a week late than to state that it will be delivered in 5 weeks and deliver it on time; the net result is the same, but many will prefer the latter over the former.
- Relationship to Basic Engineering Ethical Principles
- Engineers shall hold paramount the safety, health, and welfare of the public.
- People on the receiving end of unconscious blame can be hurt emotionally, which may erode trust, focus, and attentiveness and lead to biased or rash decision-making.
- Engineers shall issue public statements only in an objective and truthful manner.
- What you say and how you say it matters. Avoid explicit and implicit biases in both written and oral communications.
- Correct misconceptions, misinformation, and misgivings in a timely manner to avoid further complications.
- Engineers shall perform services only in the areas of their competence.
- Exaggerating one’s own or others’ expertise based on background, faith, culture, or race can lead to taking engineering jobs that are not suitable – implicit bias can be at play here.
- Engineers shall uphold and enhance the honor, integrity, and dignity of the profession.
- Objective decision-making has been the hallmark of the engineering profession; we do not ignore information, but weigh what is most important and relevant so that our designs and decisions reflect both risks and rewards.
- Can you avoid bias?
- The simple answer is “No,” but you can recognize and manage it.
- More likely to pop up in pressure situations:
- Stress and time pressure in completing tasks
- Lack of proper and timely information
- Pressure to make decisions in a particular way
- Some causes of bias:
- Associations over time
- Cultural norms
- Organizational practices
- Respect diversity of thought.
- Analyze situations and data objectively.
- Remember that bias can be bad for morale and productivity.
- Small interventions can make a large difference.
- Ways to avoid negative effects of bias
- Avoid ethical lapses resulting from bias by documenting procedures grounded in objective, science-based evaluation of processes and systems.
- Accountability and responsibility
- Understand cultural norms, implications, and educate others.
- Be mindful of how you ask questions.
- Stereotypical statements such as “her achievements are exceptional for being a woman” are not helpful (speaker may assume that it is complimentary, but this falls into implicit bias).
- Don’t phrase questions to lead to desired answers (framing and confirmation bias).
- Perspective-taking is helpful – “put yourself in other people’s shoes.”
- Possible to get objective, verifiable, and quantitative data in most engineering situations.
- Use well-established engineering and scientific principles to establish facts.
Conclusion
- Impediments to Ethical Practice
- You may have noticed that as each of these 5 issues was discussed, there were frequent references to the other impediments. In reality, these impediments to ethical practice almost never happen in isolation. Organizations with poor ethical culture usually have problems with communication and accountability. Normalization of deviance and bias often reinforce each other. There’s a lot of overlap in the Venn diagram circles for these factors.
- Fortunately, this often means that the same habits of mindfulness and deliberation can address many of these impediments together. No engineer walks into the office thinking, “Today, I think I’ll be unethical,” but ethical lapses occur nonetheless. By understanding these vulnerabilities of individuals and organizations, you can ensure that your career will be one of continual ethical practice that makes life better for everyone.