Psychological Assessment – Exam Quick Notes
Psychological Assessment vs. Psychological Testing
Psychological Assessment
Integrative process: combines data from tests, interviews, case studies, observation, special apparatus.
Goal: answer referral question / solve problem / aid decision-making.
Individualized; evaluator selects tools & interprets data.
Outcome: reasoned psychological evaluation.
Psychological Testing
Measures specific variables via standardized devices/procedures.
Goal: obtain a quantitative gauge (e.g., score).
Can be individual or group; examiners may be interchangeable (requires technician-level skill rather than clinical judgment).
Outcome: numerical test score.
Forms of Assessment
Collaborative: assessor + client partner throughout process.
Therapeutic: assessment itself facilitates self-discovery.
Dynamic: evaluation → intervention → re-evaluation; focuses on learning from intervention.
Psychological Tests
Defined: devices or procedures designed to measure variables related to psychological functioning.
Key differences
Administration: one-on-one vs. large-group; examiner present vs. absent.
Scoring & interpretation: manual, self-scoring, computer; yields various score types (e.g., cut scores).
Psychometric soundness: reliability & validity; utility = practical value.
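The idea of a cut score above can be shown with a toy sketch: a single threshold converts raw scores into a pass/fail decision. The cut score of 70 and the examinee scores here are invented for illustration.

```python
# Hypothetical example: applying a cut score to raw test scores.
# The cut score (70) and the scores below are invented for illustration.

CUT_SCORE = 70

def classify(raw_score: int) -> str:
    """Return 'pass' if the raw score meets or exceeds the cut score."""
    return "pass" if raw_score >= CUT_SCORE else "fail"

scores = {"A": 85, "B": 69, "C": 70}
results = {examinee: classify(s) for examinee, s in scores.items()}
print(results)  # {'A': 'pass', 'B': 'fail', 'C': 'pass'}
```

Real cut scores are set through standard-setting methods (e.g., expert judgment), not chosen arbitrarily.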
Other Core Assessment Tools
Interview
Face-to-face: note verbal & non-verbal cues, appearance.
Phone/virtual: note voice changes, pauses, emotional signs.
Motivational interviewing blends info-gathering with intervention.
Portfolio: collection of work samples demonstrating abilities.
Case History Data: archival records illuminating past & present functioning.
Behavioral Observation
Naturalistic or structured; qualitative/quantitative recording.
Role-Play Tests: simulated situations to elicit behaviors/thoughts.
Computer-Based Applications
Local processing: scoring/interpretation on-site; central processing: test data sent to a central facility for scoring; teleprocessing: data transmitted and processed remotely over a network.
Reports
Simple score report: scores only.
Extended: scores + statistics.
Interpretative: narrative explanation (descriptive, screening, consultative).
Computer-Assisted Assessment: builds & scores tests automatically.
Computerized Adaptive Testing: item selection adapts to prior responses.
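The adaptive idea behind CAT can be sketched in a few lines. Operational CAT uses item response theory; this simplified staircase version (item difficulties, step size, and the simulated examinee are all invented) only illustrates the core mechanism: each item is chosen to match the current ability estimate, which moves up after a correct response and down after an incorrect one.

```python
# Minimal sketch of computerized adaptive testing (CAT).
# Real CAT uses item response theory; this staircase version only
# illustrates the core idea: item selection adapts to prior responses.

def run_cat(item_bank, answer_fn, n_items=5, start=0.0, step=0.5):
    """item_bank: list of item difficulties; answer_fn(difficulty) -> bool."""
    remaining = sorted(item_bank)
    ability = start
    for _ in range(min(n_items, len(remaining))):
        # Pick the unused item whose difficulty is closest to the estimate.
        item = min(remaining, key=lambda d: abs(d - ability))
        remaining.remove(item)
        correct = answer_fn(item)
        ability += step if correct else -step  # nudge the estimate
    return ability

# Simulated examinee who answers correctly whenever difficulty <= 1.0.
estimate = run_cat(item_bank=[-2, -1, 0, 1, 2], answer_fn=lambda d: d <= 1.0)
print(estimate)  # 1.5
```

Because harder items follow correct answers and easier items follow errors, the estimate converges near the examinee's true level with fewer items than a fixed-form test.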
Key Roles
Test Developer: designs & publishes tests.
Test User: selects, administers, scores, interprets.
Test Taker: subject of assessment (living or deceased – psychological autopsy).
Assessment Settings
Educational: achievement & diagnostic testing, informal evaluation.
Clinical: screening & diagnosis of behavioral issues.
Counseling: schools, prisons, agencies; aim = improved adjustment/productivity.
Geriatric: focus on quality-of-life variables.
Business/Military/Government: selection, credentialing, organizational decisions.
Ethical & Procedural Responsibilities
Pre-administration: choose appropriate test, secure materials, ensure qualified staff & proper environment.
During: establish rapport, follow standardized protocols.
Post-administration: protect protocols, score per criteria, communicate results clearly, document irregularities.
Assessment of People with Disabilities
Accommodation: adapt test/procedure (e.g., Braille) to maintain construct validity.
Alternate Assessment: non-standard procedure when standard testing is unsuitable.
Selection of accommodation depends on assessee capability, assessment purpose, score meaning, assessor capability.
Classification of Tests
Human Ability
Achievement: measures prior learning.
Aptitude: predicts potential to learn/acquire skill.
Intelligence: assesses general problem-solving & adaptive potential.
Individual vs. Group formats.
Personality & Behavior
Structured Personality Tests: fixed self-report items with set responses.
Projective Tests: ambiguous stimuli → spontaneous responses.
Overt Behavior Tests: observable acts (e.g., behavioral checklists).
Covert Behavior Tests: internal states (e.g., projective, neuropsychological).
Quick Reference Ranges (example)
Brief Resilience Scale: total score is the average of the item responses.
= Low resilience
= Normal resilience
= High resilience
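Since the BRS total is an average score, its computation can be sketched as follows. This assumes one common convention for the scale (six items rated 1-5, with items 2, 4, and 6 reverse-scored before averaging); verify the exact scoring rules and cutoffs against the scale's manual.

```python
# Sketch of Brief Resilience Scale (BRS) scoring, assuming the convention:
# six items rated 1-5, items 2, 4, and 6 reverse-scored, total = mean of
# the recoded items. Check the scale's manual before relying on this.

REVERSE_ITEMS = {2, 4, 6}  # 1-based positions assumed to be reverse-worded

def brs_total(responses):
    """responses: list of six ratings (1-5), in item order 1..6."""
    if len(responses) != 6 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected six ratings between 1 and 5")
    recoded = [6 - r if i in REVERSE_ITEMS else r
               for i, r in enumerate(responses, start=1)]
    return sum(recoded) / 6

print(brs_total([4, 2, 4, 2, 5, 1]))  # consistently resilient responses -> high mean
```

The returned mean is then compared against the published low/normal/high ranges to label the respondent's resilience level.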