CIA Triad
Confidentiality
Data Confidentiality
Privacy
Integrity
Data Integrity
System Integrity
Availability
Timely services to authorised individuals
Extended CIA Triad
Confidentiality
Integrity
Availability
Authenticity
Accountability
Non-repudiation
Cannot deny a previous commitment (like a contract). Can be considered part of integrity.
Authenticity
Messages are valid and trusted
Accountability
Actions can be traced
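The accountability idea above can be sketched as an append-only audit trail: every sensitive action is recorded with who did what and when, so it can be traced later. A minimal illustrative sketch (names and actions are hypothetical, not from the notes):

```python
import datetime

# Append-only record of actions; in a real system this would be
# tamper-evident storage, not an in-memory list.
AUDIT_LOG: list[str] = []

def audited(user: str, action: str) -> None:
    """Record who performed what action, with a UTC timestamp."""
    stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
    AUDIT_LOG.append(f"{stamp} {user} {action}")

audited("alice", "read payroll.db")
audited("bob", "delete backup-2021.tar")
print(len(AUDIT_LOG))  # 2
```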
Threat
Circumstance that can negatively affect an organisation/user, e.g.:
Unauthorised access
Unintended data disclosure/manipulation
Denial of Service (DoS)
Threat model
Collection of threats deemed important, which dictates a set of security requirements.
Asset
Valued resources in a system.
Can be system resources (hardware, software, data or network infrastructure) or human resources (trust, time, confidence).
Risk
A measure of the extent to which an asset is threatened. Typically a function of the impact of a threat and its likelihood.
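The "function of impact and likelihood" can be made concrete with a simple multiplicative score (a sketch, assuming a hypothetical 1–5 scale for each factor; real risk frameworks use their own scales and matrices):

```python
def risk_score(impact: int, likelihood: int) -> int:
    """Multiplicative risk model on a hypothetical 1-5 scale: higher = more urgent."""
    assert 1 <= impact <= 5 and 1 <= likelihood <= 5
    return impact * likelihood

# A high-impact but rare threat vs. a moderate but frequent one:
print(risk_score(impact=5, likelihood=1))  # 5
print(risk_score(impact=3, likelihood=4))  # 12
```

Note how a moderate threat that occurs often can outrank a severe but rare one.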
Adversary
An entity trying to circumvent security infrastructure.
Vulnerability
A system artefact that exposes users, data or the system to a threat.
System Outcomes of a Vulnerability
Corrupted - Incorrect response or behaviour
Leaky - Information disclosed to unauthorised individuals
Unavailable - Fails to respond quickly or at all
Sources of vulnerabilities
Flaws in software/hardware
Flaws in design and requirements
Flawed policies or misconfigurations
System misuse
Types of vulnerabilities
Technological (weaknesses in protocol, OS, network equipment)
Configuration (User accounts, misconfigured/default internet or network equipment)
Security Policy (lack of a written policy, lack of authentication continuity, unapplied access controls, no recovery plan)
Countermeasure
A security control used by asset owners to protect resources, reduce the likelihood of a threat, or reduce its consequences.
Security Policy
Set of criteria for the provision of security services. It defines what the services should provide and enforce, and how they are implemented.
Participant
An expected system entity.
Includes hardware, agents (software), people, enterprises.
All parties need to be trusted.
Trust
The degree to which an entity/participant has freedom to behave in the system.
Permissions and obligations, described using a trust model.
Trust Model
Model to describe which participant is trusted for what actions in a certain environment.
Trust Boundary
A point in a system where the level of trust changes
Attack
Process to realise a threat. Can be passive or active, and originate from inside or outside.
Passive Attack
Attempting to learn or use information that doesn’t affect system resources, e.g. eavesdropping
Active Attack
Attempting to alter system resources or affect system operation, e.g. password guessing
Security Perimeter
The domain for which an organisation has administrative control
Attack Surface
Set of reachable and exploitable vulnerabilities of a system, e.g. open ports or employees.
Attack vector
The specific means by which an attack is enacted, e.g. key logger
Threat Consequences
Unauthorised Disclosure (vs Confidentiality)
Deception (vs Integrity)
Disruption (vs Availability)
Usurpation (vs system Integrity)
Unauthorised Disclosure (threat consequence)
Exposure, interception, inference or intrusion of sensitive information
Deception (threat consequence)
Masquerading as authorised entity, falsification of data, or repudiation
Disruption (threat consequence)
Incapacitation, corruption, or obstruction of a system or its resources/messages.
Usurpation (threat consequence)
Misappropriation of a service, or unauthorised use/misuse of a system.
Security Design Principles
Widely-regarded ideas that inform the design of security mechanisms
Saltzer and Schroeder Principles
Access Control:
Fail-safe defaults - deny access unless explicitly granted
Complete Mediation - every access is checked
Separation of Privilege - divide access rights among entities
Least Privilege
Other:
Economy of Mechanism - smaller, simpler designs have fewer flaws
Open Design - allow scrutiny from experts
Least Common Mechanism - minimise mechanisms shared between users
Psychological Acceptability - security measures are transparent and easy to use
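The access-control principles above can be sketched in a few lines: one function guards every access (complete mediation), an unknown subject gets no rights (fail-safe defaults), and each subject is granted only the rights it needs (least privilege). A hypothetical minimal example; the user and resource names are illustrative:

```python
# Least privilege: each (user, resource) pair carries only the
# actions that role actually needs.
PERMISSIONS = {
    ("alice", "payroll.db"): {"read"},
    ("bob", "payroll.db"): {"read", "write"},
}

def is_allowed(user: str, resource: str, action: str) -> bool:
    # Complete mediation: every access goes through this one check.
    # Fail-safe default: an unlisted (user, resource) pair yields the
    # empty set, so the answer is "deny".
    return action in PERMISSIONS.get((user, resource), set())

print(is_allowed("alice", "payroll.db", "read"))    # True
print(is_allowed("alice", "payroll.db", "write"))   # False (not granted)
print(is_allowed("mallory", "payroll.db", "read"))  # False (denied by default)
```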
Important Other Security Principles
Isolation - restrict critical resources, isolate files and processes
Modularity
Layering - defence in depth
Minimised trust surface - assume zero trust between users and components