purpose of a criterion-referenced assessment
to evaluate proficiency, functionality, or mastery of a skill
performance standard
score interpretations are based on a comparison to a ___ ___ NOT a normed sample group
scores often given as percentages, pass/fail or mastery levels
fixed standard
uses a ___ ____ rather than a percentile ranking
raw scores
greater emphasis on ___ _____
diagnosis; impairment
Not a stand-alone means to determine a _____ or _____
should be viewed as a complement to other assessment tools
can/cannot do
how they compare to others
criterion-referenced tests tell you specifically what a client ____ or ____ ___, while a norm-referenced test tells you ____ ____ ____ ___ ____
when to use a criterion-referenced assessment
norm-referenced test is not available/appropriate to gather the desired info
client is not able to participate in standardized, normative testing
more info is needed to support or disprove a diagnosis
monitoring progress over time (baselining)
to assess more subjective functional communication skills
other diagnostic results do not correlate or explain presenting problems, and qualitative description is needed
to identify specific skill gaps
to obtain more individualized, useful info for treatment planning
foundation for criterion referenced data gathering
based on observation
based on report (client, parent, caregiver, teacher)
based on elicitation of targeted behavior
benefits of criterion-referenced assessment
more flexible administration
allows in depth look at specific communication behaviors
optimal vs typical performance
individualizes the assessment for the client
skills being assessed are uniquely relevant to the client
allows observation of skills across a variety of contexts
allows for a more reliable sample of skill
can test below the basal and above the ceiling (or outside the age range)
can analyze missed items by considering whether influence from cultural or linguistic differences could have impacted the response
challenges to criterion-referenced assessment
does not allow for comparing the performance of clients in a particular location with national norms
can be time-consuming and complex to develop
can cost more money, time, and effort
criterion-based assessment models
observation and functional communication data
skill probes and rubrics
patient reported outcome measures (PROMs)
developmental milestone checklists
performance inventories
language sampling with performance analysis
curriculum-based analysis (CBA)
where to get criterion-referenced assessments
commercial products
published measures
clinician created
less formal
when conducting, criterion-referenced testing feels ____ ____ than a standardized, normed test
the absence of teaching
criterion-referenced testing can often feel like a therapy session. What is the main difference between therapy and criterion-referenced assessment?
steps to develop CRA
identify question to be answered
select stimulus items
identify desired responses to stimulus items
formulate instructions for task
develop decision-making guidelines
administer informal procedure
evaluate effectiveness
revise if needed and re-administer
save procedure for later use
normative sample
PROMs assess a client’s personal experiences, perceptions, and functional abilities rather than comparing their performance to a _____ ______
PROM criteria obtained for interpretation
each PROM has specific scoring criteria to evaluate symptoms, participation, or quality of life based on a predetermined scale (e.g. severity ratings, Likert scales)
focus of PROMs
assess functional change over time
measure a client’s progress by comparing their current status to their baseline, rather than against population norms
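The Likert-based scoring and baseline comparison described above can be sketched in a few lines. This is a minimal illustration, not a published PROM: the 5-point scale, the item ratings, and the function name are all hypothetical.

```python
# Hypothetical 5-point Likert PROM (1 = severe problem, 5 = no problem).
def prom_total(item_ratings, scale_max=5):
    """Sum item ratings on a predetermined Likert scale."""
    if any(r < 1 or r > scale_max for r in item_ratings):
        raise ValueError("rating outside the predetermined scale")
    return sum(item_ratings)

# Progress is judged against the client's own baseline, not population norms:
baseline = prom_total([2, 3, 2, 1])   # 8
follow_up = prom_total([4, 4, 3, 3])  # 14
change = follow_up - baseline         # +6
```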
screenings
typically a form of criterion-referenced assessment
any area/skill within SLP scope of practice can be screened
time and cost efficient assessment
decreases unwanted referrals
can be used in a full evaluation for certain domains
SLPs can design, conduct, and interpret
SLP roles in screening
selecting and using appropriate screening instruments
developing screening procedures and tools informed by current evidence
coordinating and conducting screenings across different settings
participating in team meetings to review data and recommend interventions that align with federal and state regulations
reviewing and analyzing relevant records (e.g. medical, educational)
interpreting screening results and making appropriate referrals
consulting with other professionals on outcomes
articulation and phonological processes screening
purpose is to quickly identify individuals who may have a communicative disorder related to their speech sound system
not an in-depth assessment
commercially available tools or self-made
stimulability screenings
purpose is to assess a client's ability to correctly produce an erred phoneme
provides prognostic information
orofacial exam/screening
purpose is to identify or rule out structural or functional factors contributing to communication difficulties
observe at rest
assess face and jaw
examine lips
assess tongue
check hard and soft palate
diadochokinetic rates (DDK)
additional checks (swallowing, voice, resonance)
Assessing DDK rates
instruct client to repeat target syllable
model sequence
say “go” and start stopwatch
say “stop” and stop the stopwatch after 20 reps
redo the sequence if client stops/slows down
after assessing syllables independently, evaluate client for 10 reps of “puh-tuh-kuh”
record findings on worksheet
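The timed-repetition steps above reduce to a simple rate calculation (repetitions per second). A minimal sketch; the 4.0-second timing is a hypothetical example, not a norm:

```python
def ddk_rate(repetitions, seconds):
    """Diadochokinetic rate: syllable repetitions per second."""
    if seconds <= 0:
        raise ValueError("elapsed time must be positive")
    return repetitions / seconds

# Hypothetical timing: 20 reps of "puh" completed in 4.0 seconds
rate = ddk_rate(20, 4.0)  # 5.0 syllables per second
```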
voice screening
can be accomplished with a few quick and easy tasks
imitate words and phrases
count to 20
read a short passage
talk conversationally
prolong vowels
produce sustained /s/ and /z/ (production times should be approximately equal)
make note of any potential concerns related to pitch, loudness, respiration, endurance, tone, resonance
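The sustained /s/ and /z/ task is often summarized as an s/z ratio; since the card says production times should be the same, a ratio near 1.0 is expected. A minimal sketch; the durations are hypothetical, and any clinical cutoff for flagging an elevated ratio is an assumption beyond these notes:

```python
def sz_ratio(s_seconds, z_seconds):
    """Ratio of maximum sustained /s/ duration to /z/ duration."""
    if z_seconds <= 0:
        raise ValueError("/z/ duration must be positive")
    return s_seconds / z_seconds

# Hypothetical durations: /s/ held 18 s, /z/ held 12 s
ratio = sz_ratio(18.0, 12.0)  # 1.5, well above the ~1.0 expected
```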
hearing screening
should be included in all speech assessments
can use audiometer or informally observe client’s responsiveness to voice and sounds at varied intensity and distances
importance:
rule out hearing loss as a contributing factor
ensure valid interpretation of assessment results
supports interdisciplinary care
required by best practice guidelines
helps ensure that SLP services are targeted and accessible
CRA data collection
any behavior can be tracked or scored
many systematic ways to chart skills and collect data during assessment
important to determine the most appropriate way to gather data for your target skill
help ensure consistency, accuracy, and usefulness for tracking a client’s skill proficiency
trial-by-trial data collection
each client response to a stimulus is recorded during the session
Ex: 10 opportunities for /k/ initial words → document correct/incorrect for each
very detailed, easy to calculate % accuracy
time-consuming; harder during conversational tasks; doesn’t capture details about types of errors
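The percent-accuracy calculation that makes trial-by-trial data so easy to summarize can be sketched directly; the trial record below is a hypothetical version of the /k/-initial example above:

```python
def percent_accuracy(trials):
    """Percent correct from a trial-by-trial record (True = correct)."""
    if not trials:
        raise ValueError("no trials recorded")
    return 100.0 * sum(trials) / len(trials)

# Hypothetical /k/-initial word probe: 10 opportunities, 7 correct
trials = [True, True, False, True, True, False, True, True, False, True]
percent_accuracy(trials)  # 70.0
</imports>```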
rubric-based scoring
a numeric scoring scale
used with distortion-type sound errors and skills that are not easily scored right vs wrong (e.g. expressive language)
Ex: Vocalic /r/ productions are rated from 0-2 for each production
captures more nuance than correct/incorrect
requires calibration to the qualitative descriptions for consistent scoring; higher degree of subjectivity
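Summarizing rubric scores usually means averaging them against the scale maximum. A minimal sketch using the 0-2 vocalic /r/ example above; the level descriptions in the comments are hypothetical:

```python
# Hypothetical 0-2 rubric for vocalic /r/: 0 = off target,
# 1 = distorted approximation, 2 = accurate production.
def rubric_percent(scores, max_score=2):
    """Mean rubric score expressed as a percent of the maximum possible."""
    if not scores:
        raise ValueError("no scores recorded")
    if any(s < 0 or s > max_score for s in scores):
        raise ValueError("score outside the rubric range")
    return 100.0 * sum(scores) / (max_score * len(scores))

rubric_percent([2, 1, 1, 0, 2])  # 60.0
```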
tally or checkmark system
a tally mark is made each time the target behavior occurs; non-occurrences are not tracked
Ex: tally mark for each time a client produces a 2-word phrase
quick and efficient
doesn’t capture details about number of attempts or opportunities
probe data vs teaching data
probe data: collected without support/cues (measures true independent performance)
teaching data: collected while modeling, prompting, or cueing
Ex: present progressive verb conjugation is sampled without any assistance, then assessed with varying level of cues
differentiates learning vs mastery
needs clear planning to separate trials
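The planning needed to separate probe trials from teaching trials can be as simple as tagging each trial with whether cueing was given. A sketch with hypothetical field names:

```python
from dataclasses import dataclass

@dataclass
class Trial:
    correct: bool
    cued: bool  # True = teaching trial (modeled/prompted), False = probe

def probe_accuracy(trials):
    """Percent correct on independent (uncued) probe trials only."""
    probes = [t for t in trials if not t.cued]
    if not probes:
        raise ValueError("no probe trials recorded")
    return 100.0 * sum(t.correct for t in probes) / len(probes)

# Hypothetical session: two probe trials, then two cued teaching trials
session = [Trial(True, False), Trial(False, False),
           Trial(True, True), Trial(True, True)]
probe_accuracy(session)  # 50.0, teaching trials are excluded
```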
interval recording
collect data during specific time intervals or segments instead of every trial
Ex: record every 5th response in a 20-minute session
efficient, less disruptive during natural conversation
may miss variability across the session
often done in schools
must be reflected in the documentation
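The "record every 5th response" example above amounts to sampling a response stream at a fixed stride. A minimal sketch; the response list is hypothetical:

```python
def sample_every_nth(responses, n=5):
    """Keep every nth response, as in 'record every 5th response'."""
    return responses[n - 1::n]

# Hypothetical 10-response stretch of a session
responses = [True, False, True, True, False,
             True, False, True, True, True]
sample_every_nth(responses)  # [False, True] (the 5th and 10th responses)
```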
rating scales
qualitative scales for effort, independence, or level of cueing; Likert scales
Ex: an object description is rated deficient/satisfactory/excellent
captures functional performance, especially for language
less quantitative than % correct, more subjective