Chapter 2 Notes: Authentication, Access Control, and Cryptography
2.1 Authentication
Core idea: identification is the assertion of an identity (who are you?); authentication is the proof that the asserted identity is genuine (are you who you claim to be?).
Three bases of authentication (factors):
Something you know (passwords, PINs, passphrases)
Something you are (biometrics: fingerprint, voice, face, etc.)
Something you have (tokens, badges, cards, keys)
Two- and multi-factor authentication (MFA): combining factors increases confidence; common example: a password (something you know) plus a one-time code generated or received on a device you possess (something you have).
Passwords and their weaknesses:
Dictionary and brute-force attacks; guessable passwords like “password” or “123456” fall immediately.
Rainbow tables (precomputed tables of password hashes) speed up cracking; per-user salts thwart them.
Password storage uses one-way concealment (hashing) rather than plaintext, so a leaked credential table does not directly reveal passwords.
Salt: per-user random data added to the password before hashing to prevent identical hashes across users.
Authentication checks a candidate password by concealing it and comparing against the stored concealed value; because concealment is one-way, a forgotten password cannot be recovered, only reset (see the salted-hash sketch below).
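A minimal sketch of salted one-way password storage, using only Python's standard library; the scrypt parameters and example passwords are illustrative, not a vetted configuration:

```python
import hashlib
import hmac
import os

def conceal(password: str, salt: bytes = b"") -> tuple[bytes, bytes]:
    """Return (salt, digest); a fresh per-user salt defeats rainbow tables."""
    salt = salt or os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-conceal the candidate and compare in constant time."""
    _, candidate = conceal(password, salt)
    return hmac.compare_digest(candidate, stored)

salt, stored = conceal("correct horse battery staple")  # stored instead of plaintext
assert verify("correct horse battery staple", salt, stored)
assert not verify("123456", salt, stored)
```

Because only (salt, digest) is stored, a leaked table yields no plaintext, and the same password under different salts produces different digests.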
Attack types and defenses:
Exhaustive brute-force attack: the work factor grows exponentially with password length and alphabet size, so longer and more complex passwords raise the attacker's cost (see the arithmetic sketch after this list).
Rainbow tables: precomputed tables of common passwords and their concealed (hashed) values; per-user salts defeat rainbow tables.
Password reuse and social engineering: users should avoid predictable patterns and cross-site reuse, and never disclose passwords to others.
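Back-of-the-envelope work-factor arithmetic; the guess rate below is an assumed figure purely for illustration:

```python
# Exhaustive search must try up to alphabet_size ** length candidates.
GUESSES_PER_SECOND = 1e10  # assumed attacker speed, for illustration only

for alphabet, length in [(26, 8), (62, 8), (94, 12)]:
    guesses = alphabet ** length
    years = guesses / GUESSES_PER_SECOND / (3600 * 24 * 365)
    print(f"{alphabet}^{length} = {guesses:.2e} guesses ~ {years:.1e} years")
```

An 8-character lowercase password falls in seconds at this rate, while a 12-character mixed-symbol password pushes the search into astronomical time.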
Password usability and management:
Usability in the small vs. usability in the large: one password for one site is manageable, but users must juggle credentials across many sites; password managers help manage credentials securely.
Authentication variants beyond passwords:
Location as partial authenticator (e.g., card-present at a location).
Security questions: weak authenticators due to public information; not ideal for strong security.
Biometrics: something you are; advantages: cannot be easily lost or shared; drawbacks: false positives/negatives, spoofing, enrollment/template issues, sampling speed, and potential privacy concerns.
Biometrics: measurement, matching tolerance, and the notion of a template; authentication succeeds if the current sample falls within tolerance of the stored template (a matching sketch follows this list).
Biometrics in practice: accuracy depends on quality of the sample; false positives/false negatives must be balanced (ROC curves).
Biometric vulnerabilities and forgery risks: masterprints, fake fingers, iris contact lenses; combination of biometrics (e.g., image + voice) improves accuracy.
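A toy sketch of template matching within a tolerance; the feature vectors and threshold are invented, and real systems use far richer features and matchers:

```python
import math

def matches(sample: list[float], template: list[float], tolerance: float) -> bool:
    """Accept iff the sample lies within `tolerance` of the enrolled template."""
    return math.dist(sample, template) <= tolerance

template = [0.61, 0.34, 0.88]  # stored at enrollment
assert matches([0.60, 0.35, 0.87], template, tolerance=0.05)      # genuine user
assert not matches([0.10, 0.90, 0.20], template, tolerance=0.05)  # impostor
```

Raising the tolerance reduces false rejections but raises false acceptances, which is exactly the ROC trade-off noted above.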
Tokens and dynamic vs. static authentication:
Static tokens (e.g., badges, key fobs) do not change; dynamic tokens (e.g., SecurID) generate a one-time value that changes regularly, improving security against interception.
Static tokens are vulnerable to skimming (copying) and replay of captured values; dynamic values expire before they can be reused.
Time-based tokens rely on synchronized clocks and nonce-like counters to prevent replay (a TOTP-style sketch follows).
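A compact sketch of a time-based one-time code in the spirit of RFC 6238 (TOTP), standard library only; the shared secret is a placeholder:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """Derive a short code from the shared secret and the current time step."""
    counter = int(time.time()) // interval
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation per RFC 4226
    value = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

# Client and server compute the same code within a 30-second window,
# so an intercepted code is useless once the window passes.
print(totp(b"shared-secret"))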
Federated Identity Management and Single Sign-On (SSO):
Federated identity management: one identity across multiple systems; authentication is performed in one place and tokens propagate to services.
SSO: an umbrella authentication process that allows access to multiple applications after a single login.
Multiple factors and security trade-offs:
More factors can increase security but also increase user friction; the optimal number of factors (n) depends on the risk and usability requirements.
Two-factor authentication (2FA) examples: password plus a one-time code delivered via text or email; 2FA improves security even if one factor is compromised.
Usability challenges and real-world issues:
Password fatigue and the risk of weak, reused passwords; password managers reduce risk and improve usability for many sites.
Overly complex authentication can frustrate users, while too-simple schemes can be insecure.
Authentication concepts beyond passwords:
GrIDSure and patterns as knowledge-based authentication; usability vs. security trade-offs remain under study.
Tokens and location-based checks as supplemental authenticators.
Biometric performance and reliability:
Accuracy metrics: sensitivity (true positive rate) and specificity (true negative rate); ROC curves illustrate the trade-off.
False positives and false negatives have different consequences; systems tune decision thresholds to the risk at hand (a threshold-sweep sketch follows this list).
Biometric systems can fail due to enrollment issues, sensor quality, lighting/noise, or environmental factors.
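A small threshold sweep over made-up match scores, showing how tuning the threshold trades sensitivity against false acceptance:

```python
genuine = [0.91, 0.86, 0.78, 0.95, 0.70]   # scores for legitimate users (made up)
impostor = [0.40, 0.55, 0.62, 0.30, 0.74]  # scores for impostors (made up)

for threshold in (0.5, 0.7, 0.9):
    tpr = sum(s >= threshold for s in genuine) / len(genuine)    # sensitivity
    fpr = sum(s >= threshold for s in impostor) / len(impostor)  # 1 - specificity
    print(f"threshold={threshold}: TPR={tpr:.2f}, FPR={fpr:.2f}")
# Plotting TPR against FPR across all thresholds yields the ROC curve.
```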
Security in practice and threat awareness:
The distinction between identification (who are you?) and authentication (are you who you claim to be?) is critical in system design.
Digital signatures, certificates, and trust frameworks build a bridge between identity and cryptographic proof.
2.2 Access Control
Purpose: protect general objects (files, devices, network connections) by enforcing policy that grants only allowed accesses.
Core model (Graham and Denning): A subject is permitted to access an object in a specific mode, and only such accesses are allowed.
Subjects: users or surrogate programs acting on behalf of users.
Objects: resources such as files, tables, memory, devices, or even processes.
Access modes: read, write, modify, delete, execute, create, destroy, etc.
Access policy and enforcement:
A high-level security policy governs who can access what and how; enforcement must be principled and auditable.
Check every access; enforce least privilege; verify acceptable usage (e.g., time, location, or context).
Reference Monitor concept:
A theoretical construct that is always invoked, tamper-proof, and verifiable; ensures all access controls are consistently enforced.
Representations of access control:
Directory per subject: lists the objects a subject can access; revoking one subject's right means deleting one entry, but revoking an object's accessibility requires searching every subject's directory, and lists grow large while propagated rights are hard to track.
Access Control List (ACL): one list per object listing subjects and rights; scalable for many objects; supports default rights; allows wildcards (e.g., *).
Access Control Matrix: a matrix with subjects as rows and objects as columns; entry is the set of rights; often sparse; can be represented as triples ⟨subject, object, rights⟩.
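A minimal sketch of a sparse access control matrix stored as triples, with a check on every access in the spirit of the reference monitor; the subjects, objects, and modes are invented:

```python
# Sparse access control matrix as <subject, object, right> triples.
rights = {
    ("alice", "payroll.db", "read"),
    ("alice", "payroll.db", "write"),
    ("bob",   "payroll.db", "read"),
}

def check_access(subject: str, obj: str, mode: str) -> bool:
    """Deny by default: allow only if an explicit right exists."""
    return (subject, obj, mode) in rights

assert check_access("alice", "payroll.db", "write")
assert not check_access("bob", "payroll.db", "write")  # least privilege
```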
Typical protection structures:
Privilege lists (a.k.a. directories) for each subject.
Capability-based protection: unforgeable tokens (capabilities) tied to a subject and an object; can be transferred with controlled rights; can be revoked.
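One way to make a capability unforgeable is to bind it to a secret held only by the system, for example an HMAC over the (subject, object, rights) claim; this is an illustrative construction, not the only one:

```python
import hashlib
import hmac

MONITOR_KEY = b"known-only-to-the-reference-monitor"  # placeholder secret

def mint_capability(subject: str, obj: str, rights: str) -> bytes:
    """Issue a token only the holder of MONITOR_KEY could have produced."""
    claim = f"{subject}|{obj}|{rights}".encode()
    return hmac.new(MONITOR_KEY, claim, hashlib.sha256).digest()

def capability_valid(subject: str, obj: str, rights: str, token: bytes) -> bool:
    return hmac.compare_digest(token, mint_capability(subject, obj, rights))

cap = mint_capability("alice", "printer", "execute")
assert capability_valid("alice", "printer", "execute", cap)
assert not capability_valid("mallory", "printer", "execute", cap)  # forgery fails
```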
Granularity and performance:
Granularity ranges from fine-grained (per byte) to coarse-grained (entire device); finer granularity implies more checks and higher overhead.
Real systems often balance granularity with performance and manageability.
Additional concepts:
Propagation of access rights: rights granted to one subject may be passed to others; revocation must track propagation.
Wildcards in ACLs allow concise policy but require careful precedence rules (specific entries take precedence over wildcards).
Privilege management: least privilege principle; restricts access to only what is necessary.
Audit logging: record access decisions and events for forensics and policy enforcement.
Special access models:
Procedure-oriented access control: protection via trusted interfaces/procedures that mediate access to an object.
Role-Based Access Control (RBAC): privilege sets tied to roles (e.g., administrator vs. user) rather than individuals; simplifies administration.
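A small RBAC sketch: rights attach to roles, and users gain rights only through role membership; the roles and permissions are invented:

```python
role_permissions = {
    "administrator": {"create_user", "delete_user", "read_logs"},
    "user": {"read_own_files", "write_own_files"},
}
user_roles = {"dana": {"administrator"}, "eve": {"user"}}

def permitted(user: str, action: str) -> bool:
    """A user may act only if some assigned role carries the permission."""
    return any(action in role_permissions[role]
               for role in user_roles.get(user, set()))

assert permitted("dana", "read_logs")
assert not permitted("eve", "delete_user")
# Administration is simplified: changing a role updates every member at once.
```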
Practicalities and trade-offs:
Implementations across OS, databases, and networks require coordination; hardware limitations can shift some control to applications or appliances.
Access control is inseparable from security policy and system design; misconfigurations can create security gaps.
2.3 Cryptography
Core notions:
Encryption is the process of encoding data to protect confidentiality; decryption recovers the original plaintext.
Cryptosystem: a set of algorithms for encryption and decryption; often uses a key K.
Plaintext P, ciphertext C; encryption E and decryption D with key(s): C = E(K, P), P = D(K, C).
Symmetric vs. public-key cryptography:
Symmetric (secret-key) encryption: a single key K is used for both encryption and decryption; fastest and used as the main workhorse; key distribution is a major challenge.
Example relation: P = D(K, E(K, P))
Asymmetric (public-key) encryption: a pair of keys (public key K_PUB, private key K_PRIV); one key encrypts, the other decrypts; enables secure key exchange and digital signatures.
Example: C = E(K_PUB, P) and P = D(K_PRIV, C). With RSA the two operations commute, P = (P^e)^d mod n = (P^d)^e mod n, so P = D(K_PUB, E(K_PRIV, P)) also holds; this commutativity is what enables signing.
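Both relations, sketched with the third-party `cryptography` package (pip install cryptography); the keys and messages are throwaway examples:

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Symmetric: one shared key, P = D(K, E(K, P)).
k = Fernet.generate_key()
f = Fernet(k)
assert f.decrypt(f.encrypt(b"attack at dawn")) == b"attack at dawn"

# Asymmetric: encrypt with the public key, decrypt with the private key.
priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
c = priv.public_key().encrypt(b"attack at dawn", oaep)
assert priv.decrypt(c, oaep) == b"attack at dawn"
```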
Common algorithms and roles:
DES (56-bit key) and AES (128/192/256-bit keys) are symmetric ciphers; AES is now the standard for strong symmetric encryption.
DES life cycle and limitations led to updates: Double DES, Triple DES (3-key and 2-key variants).
RSA is a widely used public-key algorithm based on the difficulty of factoring large integers; slower than symmetric ciphers but essential for secure key exchange and digital signatures.
Block vs. stream ciphers: stream ciphers encrypt bit/byte-by-byte; block ciphers operate on fixed-size blocks (e.g., 64, 128 bits).
Key management and distribution:
Symmetric: requires secure key distribution; number of keys grows as n(n−1)/2 for n users.
Public-key: each user has a public key and a private key; only two keys per user; public keys can be distributed openly to enable secure communications with unknown parties.
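The scaling difference in one line of arithmetic:

```python
# Pairwise shared secrets vs. one key pair per user.
for n in (10, 100, 1000):
    print(f"n={n}: symmetric keys = {n * (n - 1) // 2}, public-key keys = {2 * n}")
# n=1000 needs 499,500 shared secrets, but only 2,000 keys with key pairs.
```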
Public-key cryptography and key exchange:
Public-key cryptography enables exchange of symmetric session keys securely between parties with no prior shared secret.
MITM risks exist in naive key-exchange protocols; mitigations include using half-key exchanges and nonce challenges to prevent replay and impersonation.
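A sketch of deriving a shared session key with no prior secret, using an X25519 Diffie-Hellman exchange from the third-party `cryptography` package; note that this exchange alone is unauthenticated, which is precisely where MITM attacks live, so real protocols authenticate the public values (e.g., with certificates):

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

alice = X25519PrivateKey.generate()  # Alice's ephemeral key pair
bob = X25519PrivateKey.generate()    # Bob's ephemeral key pair

# Only the public halves cross the open channel.
alice_secret = alice.exchange(bob.public_key())
bob_secret = bob.exchange(alice.public_key())
assert alice_secret == bob_secret    # both sides derive the same session key
```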
Digital signatures and certificates:
Digital signature = hash of data signed with the sender’s private key; provides authenticity and integrity.
Process typically uses a hash function H(M) and asymmetric signing: Sig = E(K_PRIV, H(M)); verification checks H(M) ?= D(K_PUB, Sig) using the public key (a signing sketch follows this list).
Certificates bind a public key to an identity; issued by a Certificate Authority (CA) and chained in a Public Key Infrastructure (PKI).
PKI includes root CAs and intermediate CAs; trust is based on a chain of signatures from a trusted root.
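A signing and verification sketch with the third-party `cryptography` package; RSA-PSS is one common scheme, and the library hashes the message internally, matching Sig = E(K_PRIV, H(M)):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

message = b"release build 1.4.2"                      # example payload
signature = priv.sign(message, pss, hashes.SHA256())  # sign H(M) with K_PRIV

try:
    priv.public_key().verify(signature, message, pss, hashes.SHA256())
    print("signature valid")                          # authenticity + integrity
except InvalidSignature:
    print("signature invalid")                        # M or Sig was altered
```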
Certificates and trust in networks:
Certificate chains enable cross-organization trust; root CAs can be compromised (e.g., DigiNotar 2011) with wide impact.
Root certificates and certificate revocation mechanisms are essential for maintaining trust.
Code signing and integrity:
Digital signatures are used to attest to authenticity and to ensure code integrity; often used in software distribution and secure updates.
Hash functions and integrity mechanisms:
Cryptographic hash functions produce a digest that is effectively unique for a given input (collisions are computationally infeasible to find); used for integrity checks and as part of digital signatures.
Hash collisions are possible in theory; cryptographic hash functions aim to be collision-resistant (and/or preimage-resistant) to prevent forgery.
Cryptographic checksums (keyed hashes, i.e., message authentication codes) provide stronger tamper protection than plain checksums.
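Plain digest versus keyed digest, standard library only; the key is a placeholder:

```python
import hashlib
import hmac

data = b"transfer $100 to account 42"

# Plain cryptographic hash: detects change, but anyone can recompute it.
print(hashlib.sha256(data).hexdigest())

# Keyed hash (HMAC): producing a matching tag requires the secret key,
# so deliberate tampering is detectable, not just accidental corruption.
key = b"shared-secret"  # placeholder
print(hmac.new(key, data, hashlib.sha256).hexdigest())
```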
Attacks and defenses:
Important to consider work factor: the effort required to break an encryption; AES and RSA are designed with high work factors to deter attacks.
Attacks include man-in-the-middle (MITM) in key exchange; robust protocols and the use of certificates mitigate these risks.
Practical notes and historical context:
DES had a life cycle from 56-bit DES to 3DES and AES; AES (Rijndael) selected as successor due to stronger security and efficiency.
RSA remains slower but vital for key exchange, digital signatures, and PKI; symmetric ciphers handle bulk data efficiently.
Cryptography is often used in combination: symmetric encryption for data, public-key cryptography for key exchange, and digital signatures for authenticity and integrity.
Summary of cryptographic tools (quick reference):
Secret key encryption: confidentiality/integrity; key management is the main challenge. C = E(K, P)
Public key encryption: secure key distribution; supports digital signatures. C = E(K_PUB, P); P = D(K_PRIV, C)
Hash functions / message digests: integrity; non-reversible; used in signatures. H(M)
Digital signatures: authenticity and non-repudiation. Sig = E(K_PRIV, H(M)); verify with the public key.
Certificates and PKI: trust bindings between identities and public keys; chains of trust across organizations.
Key exchange protocols: mechanisms to establish shared secrets securely (mitigate MITM with nonces, pre-shared trust, etc.).
2.4 Conclusion
The three core tools of computer security are interdependent:
Authentication verifies a claimed identity, using knowledge, biometrics, or tokens (often in combination).
Access control enforces what authenticated subjects can do with which objects, guided by policy and implemented by the operating system, databases, and networks.
Cryptography protects data confidentiality, integrity, and authenticity, enabling secure storage and communication.
Practical security design requires balancing usability and security, selecting appropriate authentication factors, appropriate access-control granularity, and robust cryptographic techniques.
In the next chapter, these three tools are applied together to address security issues in programs, operating systems, networks, and services.
2.5 Review Questions and Answers
Define confidentiality and explain why it matters.
Confidentiality is the protection of data from unauthorized disclosure: information is accessed only by authorized parties. The notes define encryption as "encoding data to protect confidentiality," which makes confidentiality the property that encryption exists to preserve. It matters because, without it, sensitive information (personal data, organizational secrets, national intelligence) can be exposed, leading to privacy breaches, financial loss, reputational damage, and even threats to national security.
Identify where and how data must be protected during storage, processing, and transmission.
Data must be protected across its entire lifecycle:
During Storage: Data resides in files, databases, and memory. Protection comes from cryptography, such as "one-way concealment (hashing)" for password storage and encryption "enabling secure storage" generally. Access controls (e.g., Access Control Lists, Role-Based Access Control) ensure that only authorized subjects can reach stored objects such as files or database entries.
During Processing: Data is protected within applications and system processes. Access control mechanisms are used, guided by the "least privilege principle," which restricts access to only what is necessary. A "Reference Monitor" concept ensures "all access controls are consistently enforced" during interactions with objects like memory or processes. "Procedure-oriented access control" mediates access through trusted interfaces.
During Transmission: Data is protected over network connections. Cryptography is the primary tool, with "symmetric encryption" (e.g., AES) used for efficient bulk data protection, and "public-key encryption" (e.g., RSA) used for secure key exchange to establish these symmetric keys. Robust protocols and techniques like nonces are employed to mitigate risks such as "Man-in-the-Middle (MITM)" attacks during key exchange.
Understand and describe controls used to maintain confidentiality, such as encryption and access controls.
Encryption: The notes describe encryption as the "process of encoding data to protect confidentiality." It involves algorithms (cryptosystems) and keys to transform plaintext into ciphertext. There are two main types:
Symmetric (Secret-Key) Encryption: Uses a "single key K" for both encryption and decryption (e.g., DES, AES). It is fast and efficient for large amounts of data, though "key distribution is a major challenge."
Asymmetric (Public-Key) Encryption: Uses a "pair of keys (public key K_PUB, private key K_PRIV)" where one encrypts and the other decrypts (e.g., RSA). It is slower but crucial for secure "key exchange and digital signatures."
Additionally, "hash functions," while not directly encrypting, provide integrity checks which are complementary to confidentiality by detecting unauthorized modifications.
Access Controls: These mechanisms enforce "policy that grants only allowed accesses" to objects (files, devices, memory) by subjects (users, programs). Key aspects include:
Reference Monitor: A conceptual, always-invoked, tamper-proof component that ensures consistent enforcement of access policies.
Representations: This can be done via "Access Control Lists (ACL)" (lists per object, defining who can access and how), "Access Control Matrix" (a conceptual matrix of subjects vs. objects with rights), or "Role-Based Access Control (RBAC)" where permissions are granted to roles rather than individual users, simplifying management.
Principles: Adheres to the "least privilege principle" and requires constant "audit logging" for accountability.
Recognize attacks that target confidentiality, including interception and social engineering.
Attacks targeting confidentiality aim to gain unauthorized access to information:
Interception: Capturing data as it is communicated or stored. The notes mention "Man-in-the-Middle (MITM)" attacks as a risk in key-exchange protocols, where an attacker intercepts and potentially modifies communication. The discussion of static versus dynamic tokens also points to interception, since static tokens can be copied (skimmed) and their captured values reused (replayed).
Social Engineering: This involves manipulating individuals into divulging confidential information or granting access to systems. The notes explicitly state "Password reuse and social engineering: users should avoid predictable patterns; avoid telling others." Additionally, "security questions" are deemed "weak authenticators due to public information," making them susceptible to social engineering tactics.
Other attacks, such as "dictionary and brute-force attacks" or the use of "rainbow tables" against passwords, while primarily targeting authentication, can directly lead to a loss of confidentiality if successful, by granting an attacker unauthorized access to protected systems and data.
Explain the consequences of losing confidentiality from individual privacy breaches to national security threats.
While the notes detail the mechanisms for protecting confidentiality more than they enumerate specific consequences, the multi-layered defenses they describe underscore the severe impact of confidentiality loss. Consequences range from:
Individual Privacy Breaches: Exposure of personally identifiable information (PII) leading to identity theft, blackmail, or other personal harm.
Financial Loss: Compromise of banking details, credit card numbers, or proprietary business strategies, resulting in direct monetary theft or competitive disadvantage.
Reputational Damage: For individuals, organizations, or even governments, a loss of trust following data breaches.
Legal and Regulatory Penalties: Non-compliance with data protection laws (e.g., GDPR, HIPAA) can result in significant fines.
National Security Threats: Compromise of classified government information, military intelligence, or critical infrastructure control systems, which could endanger national defense and public safety.