0-day (Zero-day)
We treated the issue like a 0-day until a vendor fix was available
0wned / Pwned
After the credential leak, several accounts were reported as pwned
2FA Fatigue / MFA Fatigue
MFA fatigue attacks are harder when users verify prompts and use number matching
A
Account Takeover (ATO)
We reset sessions to contain a suspected account takeover
Allowlist
We allowlisted only the build server IPs for the admin portal
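An allowlist denies by default and permits only known-good entries. A minimal sketch in Python (the IP addresses are illustrative placeholders, not real build servers):

```python
# Deny by default; permit only addresses on the allowlist.
# The IPs below are illustrative placeholders.
ALLOWED_IPS = {"10.0.5.10", "10.0.5.11"}

def is_allowed(client_ip: str) -> bool:
    """Return True only for clients on the allowlist."""
    return client_ip in ALLOWED_IPS
```

The key property is the default: an address not on the list is rejected without any further logic.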
APT (Advanced Persistent Threat)
The report described the campaign as an APT-style intrusion
Attack Graph
The attack graph showed a path from a low-privilege VM to the database
Attack Surface
Disabling unused endpoints reduced our attack surface
Attack Vector
The likely attack vector was a stolen password reused across services
B
Beaconing
We flagged the host for regular beaconing to an unfamiliar domain
BEC (Business Email Compromise)
Finance used a call-back process to prevent BEC payment fraud
Bikeshedding
We stopped bikeshedding about button colors and focused on the security bug
Black Hat
The conference talk contrasted black hat tactics with defensive testing
Blacklist / Blocklist
We blocklisted the malicious domain across DNS and proxy controls
Blast Radius
Segmenting the network limited the blast radius of the incident
Blue Team
The blue team wrote a new detection rule based on the incident
Bohrbug
This crash was a Bohrbug that reproduced consistently with the same inputs
Bot Mitigation
Rate limits and CAPTCHAs are part of our bot mitigation strategy
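Rate limiting is one common bot-mitigation building block. A minimal token-bucket sketch, assuming one token per request (the rate and capacity values are illustrative):

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: refills at `rate` tokens/second
    up to `capacity`; each allowed request spends one token."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A burst up to `capacity` is allowed immediately; sustained traffic is throttled to `rate` requests per second.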
Botnet
The provider warned that a botnet was driving the traffic spike
Brute Force
We used a brute force search as a baseline before optimizing
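A brute-force baseline tries every candidate and keeps the best, trading speed for obvious correctness. A sketch using the maximum-subarray problem as the example task:

```python
def max_subarray_brute(nums):
    """O(n^2) brute force: sum every contiguous subarray, keep the best.
    Useful as a correctness baseline before writing an optimized version."""
    best = nums[0]
    for i in range(len(nums)):
        total = 0
        for j in range(i, len(nums)):
            total += nums[j]
            best = max(best, total)
    return best
```

The optimized version (e.g. Kadane's algorithm) can then be validated against this baseline on random inputs.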
Bug Bounty Hunter
A bug bounty hunter responsibly disclosed the issue to our security team
Butt-Driven Development (BDD)
He used Butt-Driven Development to confirm the code path was running
C
C2 / C&C (Command and Control)
EDR flagged traffic consistent with command-and-control behavior
Callback
The malware tried a callback every five minutes
Canary
Our canary token alerted us that the file share was being probed
Cargo Cult Programming
We removed the cargo cult retry loop and added real error handling
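The cargo-cult version of a retry loop copies the pattern without the reasoning: it swallows every exception and retries unconditionally. A sketch of the real thing, which retries only on a transient error and re-raises after the last attempt (the helper name and error class are illustrative):

```python
import time

def fetch_with_retry(fetch, attempts=3, delay=0.1):
    """Retry only transient failures; re-raise after the final attempt
    instead of silently swallowing every exception."""
    for attempt in range(1, attempts + 1):
        try:
            return fetch()
        except ConnectionError:  # transient: worth retrying
            if attempt == attempts:
                raise
            time.sleep(delay)
```

Non-transient errors (bad input, auth failures) propagate immediately, which is the part the cargo-cult copy usually gets wrong.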
CSPM
CSPM reported several storage buckets were publicly accessible
CTF (Capture the Flag)
We used a CTF exercise to practice incident triage
CVE
The patch note referenced a CVE affecting the library
CVSS
We prioritized fixes using CVSS plus our own threat model
CWPP
CWPP helped us spot a suspicious process in a container
D
DAST
DAST caught an auth bypass that static analysis missed
DDoS
The site slowed during a DDoS, but the CDN absorbed most of it
Defense in Depth
Defense in depth meant the stolen password still needed MFA to work
Detections-as-Code
We reviewed detections-as-code changes in pull requests
DFIR
DFIR collected memory images for deeper analysis
DMZ
Public web servers live in the DMZ, not on the internal network
Drift
Config drift caused the firewall rule to reappear after a reboot
Dropper
The dropper was blocked before it could fetch the next stage
Duck Debugging (Rubberducking)
While rubberducking, she noticed the variable was never initialized
E
East-West Traffic
Microsegmentation helped control east-west traffic
Edge Case
The bug only appeared on an edge case input with an empty list
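The empty-list case is a classic edge case: code that works on every non-empty input can still divide by zero on an empty one. A minimal sketch:

```python
def average(values):
    """Handle the empty-list edge case explicitly
    instead of crashing on division by zero."""
    if not values:
        return 0.0
    return sum(values) / len(values)
```

Whether the right answer for an empty list is 0.0, None, or an exception depends on the caller; the point is deciding it deliberately.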
EDR / XDR
EDR quarantined the host after detecting a suspicious binary
Eradication
After containment, eradication focused on removing persistence
Exfiltration (Exfil)
DLP alerts suggested possible exfiltration to a personal email
Exploit
The vendor said no exploit was seen in the wild yet
Exploit Chain
The incident used an exploit chain combining misconfig and a known CVE
F
Fail Closed
We chose fail-closed behavior for authorization checks
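Fail-closed means any error in the check results in denial, never in access. A minimal sketch, where `policy_lookup` stands in for whatever policy service performs the real check:

```python
def is_authorized(user, resource, policy_lookup):
    """Fail closed: an error or ambiguous answer from the policy
    check denies access rather than granting it."""
    try:
        return policy_lookup(user, resource) is True
    except Exception:
        return False  # a broken policy service must not grant access
```

The `is True` comparison also treats None or other unexpected return values as a denial.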
Fail Open
The service failed open during an outage, which was a serious flaw
False Negative
We added more telemetry to reduce false negatives in detections
False Positive
We tuned the rule to cut down on false positives
Flaky Test
We quarantined the flaky test until it was fixed
Footgun
Default admin access is a footgun for new deployments
G
God Object
The API client became a god object with hundreds of methods
Gray Hat
The discussion explained why gray hat testing can still cause harm
H
Hands-on-Keyboard
The logs showed hands-on-keyboard commands after the initial alert
Happy Path
The demo worked on the happy path but failed on bad inputs
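Code that only handles the happy path assumes every input is well-formed. A sketch of the same logic with the unhappy paths handled (the function and its checks are illustrative):

```python
def parse_port(raw: str) -> int:
    """Beyond the happy path: reject non-numeric and out-of-range
    values instead of assuming well-formed input."""
    if not raw.isdigit():
        raise ValueError(f"not a number: {raw!r}")
    port = int(raw)
    if not 1 <= port <= 65535:
        raise ValueError(f"port out of range: {port}")
    return port
```

The happy-path demo only ever exercises the final `return`; the two `raise` branches are what production inputs hit.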
Hardcoding
We removed hardcoded secrets and loaded them from a secrets manager
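The fix for a hardcoded secret is to read it from the environment or a secrets manager at runtime. A minimal sketch using an environment variable (the variable name `DB_PASSWORD` is illustrative):

```python
import os

def get_db_password() -> str:
    """Load the secret from the environment rather than
    hardcoding it in source control."""
    password = os.environ.get("DB_PASSWORD")  # name is illustrative
    if password is None:
        raise RuntimeError("DB_PASSWORD is not set")
    return password
```

Failing loudly when the variable is missing is deliberate: a silent default would reintroduce a hardcoded value.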
Hardening
OS hardening disabled unused services and weak ciphers
Heisenbug
Adding debug prints made the Heisenbug disappear
Honeypot
The honeypot recorded brute-force attempts on SSH
Honeytoken
We planted a honeytoken in the folder to detect unauthorized access
Hooker Code
That unhandled callback was hooker code that crashed the service
I
IaC (Infrastructure as Code)
IaC made it easy to review firewall changes
IAM
We fixed the issue by tightening IAM permissions
IdP (Identity Provider)
SSO was down because the IdP had an outage
Indicators of Attack (IOAs)
One IOA was a series of failed logins followed by a successful one
Indicators of Compromise (IOCs)
We searched logs for the published IOCs
Initial Access
Phishing was the suspected initial access method
Insider Threat
The policy addresses insider threat and accidental data leaks
IR (Incident Response)
During IR we focused on containment before cleanup
J
Jenga Code
No one touched the ancient module because it was Jenga code
K
Kill Chain
We mapped our detections to the kill chain stages
L
Lasagna Code
The service had lasagna code with too many layers of wrappers
Lateral Movement
Network logs suggested lateral movement via remote admin tools
Least Privilege
Least privilege prevented the compromised account from accessing billing
Living off the Land (LotL / LOLBins)
The attacker lived off the land using native admin utilities
Loader
The loader was blocked when it tried to download its next component
M
Magic Number / Magic String
We replaced the magic number with a named constant
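The before/after of that replacement is tiny but the readability gain is the whole point (the constant name and value are illustrative):

```python
# Before: `if retries > 5:` -- the reader must guess what 5 means.
MAX_RETRIES = 5  # named constant documents intent

def should_give_up(retries: int) -> bool:
    return retries > MAX_RETRIES
```

The name also gives the value a single definition site, so changing the limit later is a one-line edit.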
Malware
Our scanner detected malware in the email attachment
Misconfiguration
The data exposure was caused by a storage misconfiguration
MITRE ATT&CK
We mapped the incident to MITRE ATT&CK techniques
N
N-day
The breach exploited an N-day because the server was not patched
Noise
We tuned the dashboard to reduce noise
North-South Traffic
The proxy logs showed unusual north-south traffic at midnight
O
OpSec
Good OpSec includes rotating credentials and limiting exposed metadata
OSINT
OSINT helped us find the leaked key in a public repository
Over-permissioned
The service account was over-permissioned and needed cleanup
Owned
The admin laptop was likely owned based on the forensic results
P
Packers
The binary looked packed, so we ran it in a safe sandbox
Patch Tuesday
We schedule maintenance after Patch Tuesday to apply updates
Payload
The payload attempted to encrypt local files, so we isolated the host
Pentester
The pentester reported a privilege issue with clear remediation steps
Persistence
We looked for persistence mechanisms in startup tasks and services
Phishing
The training taught employees how to spot phishing emails
Pivot
The attacker tried to pivot from a dev VM into production
PoC (Proof of Concept)
We validated the report with a PoC in a test environment
Postmortem / RCA
The postmortem identified missing monitoring as a key factor
Pretexting
The caller used pretexting to impersonate IT support
Privilege Escalation (PrivEsc)
Patching the kernel closed the privilege escalation path