Evaluation
assesses a process or program to provide evidence and feedback about the program.
Research
is an organized process using the scientific method for investigating problems. Can be conducted with the intent to generalize findings from a sample to a larger population. Does not always aim for, or achieve, evaluative conclusions, and it is restricted to empirical (rather than evaluative) data. Bases observed, measured, or calculated conclusions on that data.
Reliability
the consistency, dependability, and stability of the measurement process.
Validity
the degree to which a test or assessment measures what it is intended to measure.
Variables
operational forms of a construct. Designate how the construct will be measured in designated scenarios.
Formative Evaluation
looks at an ongoing process of evaluation from planning through implementation. Identifies and assesses the strengths and weaknesses of the way a health educator implements a program. Allows for continual assessment, monitoring of progress, troubleshooting, and corrective actions.
Process Evaluation
any combination of measures that occur as a program is implemented to assure or improve the quality of performance or delivery
Summative Evaluation
often associated with measures of judgments that enable the investigator to draw conclusions. It is also commonly associated with impact and outcome evaluations. Focuses on the outcomes or products
Impact Evaluations
focuses on immediate and observable effects of a program leading to the desired outcomes.
Outcome Evaluation
focused on the ultimate goal, product or policy. Often measured in terms of morbidity and mortality.
Purpose Statement
identifies in detail what the health education specialist wants to learn over the course of an evaluation or research project. Usually a sentence or two written with specificity and detail. Helps to focus and guide efforts involved with data collection and analysis.
Evaluation Questions
specifically developed questions that help establish boundaries for the evaluation by stating which aspects of the program will be addressed. Creating them encourages stakeholders to reveal what they believe the evaluation should answer. Used to monitor and measure processes, activities, outputs, and expected outcomes.
Search Strategies
typically require health education specialists to: identify key search terms; identify a period of time to conduct the search; identify characteristics of the target population; and identify the health conditions of interest.
Systematic Reviews
a published qualitative review that comprehensively synthesizes the publications on a particular topic.
Meta-analyses
a systematic method of evaluating statistical data based on results of several independent studies of the same problem.
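The core of a meta-analysis can be illustrated with a fixed-effect, inverse-variance pooling of study estimates: more precise studies get more weight in the combined result. A minimal sketch in Python; the effect sizes and standard errors are made-up illustration values, not from any real study.

```python
# Minimal sketch of a fixed-effect (inverse-variance) meta-analysis.
# All numbers below are hypothetical illustration values.
import math

def pooled_effect(effects, std_errors):
    """Combine independent study estimates into one precision-weighted summary."""
    weights = [1 / se ** 2 for se in std_errors]           # precision weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

# Three hypothetical studies of the same intervention effect:
effects = [0.30, 0.45, 0.25]
ses = [0.10, 0.15, 0.08]
est, se = pooled_effect(effects, ses)
print(f"pooled estimate = {est:.3f}, SE = {se:.3f}")
```

Note the pooled estimate lands closest to the most precise study (the one with the smallest standard error), which is the defining behavior of inverse-variance weighting.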
Pooled analyses
a method for collecting all the individual data from a group of studies, combining them into one large set of data, and then analyzing the data as if it came from one big study.
Quantitative Methodology
focuses on quantifying, or measuring, things related to health education programs through the use of numerical data to help describe, explain, or predict phenomena.
Qualitative Methodology
descriptive in nature and attempts to discover meaning or interpret why phenomena are occurring.
Mixed Methods Approach
combines quantitative and qualitative data collection to "tell the story" and describe classifications, as well as to indicate why a phenomenon is occurring within a population.
Health and Psychosocial Instruments (HaPI) database
helps health education specialists identify useful existing data collection instruments. The database collects rating scales, questionnaires, checklists, tests, interview schedules, and coding schemes/manuals for the health and social sciences. Instruments in the database have been used and/or published in the literature and often report information on reliability and validity. Used for assessment and/or evaluation purposes.
Logic Model
used in evaluation to assist in describing key aspects of programs in terms of a simple flow chart.
Inputs
resources, contributions, and other investments that go into a program. Human, fiscal, physical, and intellectual resources needed to address the objectives of a program.
Outputs
the activities, services, and products that will reach the participants of a program. Activities, products and services that will influence short-term outcomes.
Outcomes
are often depicted as short-term, intermediate, or long-term.
Short-term Outcomes
often described as quantifiable changes in knowledge, skills or access to resources that happen if planned activities are successfully carried out. Changes in knowledge or skills among participants of the program.
Intermediate Outcomes
measured in terms of changes in behaviors that result from achievement of the short-term outcomes. Changes in behaviors or policy.
Long-term Outcomes
measured in terms of fundamental changes in conditions leading to morbidity or mortality. Changes in morbidity or mortality.
Data Analysis Plan
begins with the planning of a program. Determines whether outcomes were different than expected. The goal is to reduce, synthesize, organize, and summarize information to make sense of it.
Quantitative
closed-ended items - respondents make selections that represent their knowledge, attitude or self-reported behavior from predetermined lists, scales or categories. Participants choose a response predetermined by the researcher; they may be multiple choice, categorical, Likert-scale, ordinal or numerical. Lend themselves more readily to mathematical operations and advanced statistical analysis.
Qualitative
open-ended items solicit written or verbal responses to items that cannot be adequately answered with a single word or phrase. Participants offer in their own words and provide descriptive information. Enables the researcher to describe the phenomena of interest in great detail and in the original language of the research participants.
Content validity
(also called face validity) considers whether the instrument's items cover the relevant areas of interest.
Criterion Validity
refers to one measure's correlation to another measure of a variable.
Construct Validity
ensures that the concepts of an instrument relate to the concepts of a particular theory.
Reliability
assesses whether the instrument measures concepts consistently.
Internal Consistency
considers intercorrelations among items within an instrument
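Internal consistency is commonly summarized with Cronbach's alpha, which increases as the items within an instrument intercorrelate. A minimal sketch, using made-up Likert-style responses (rows are respondents, columns are items):

```python
# Minimal sketch of Cronbach's alpha, a common internal-consistency index.
# The response matrix is hypothetical illustration data.

def cronbach_alpha(rows):
    k = len(rows[0])                                   # number of items
    def var(xs):                                       # sample variance (n-1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([r[i] for r in rows]) for i in range(k)]
    total_var = var([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [
    [4, 5, 4],   # respondent 1's answers to items 1-3
    [2, 3, 3],
    [5, 5, 4],
    [3, 3, 2],
]
print(round(cronbach_alpha(responses), 3))
```

A conventional rule of thumb treats alpha of roughly 0.70 or higher as acceptable for an instrument, though the threshold depends on the use.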
Test-Retest Reliability
considers evidence of stability over time.
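Test-retest reliability is often quantified as the correlation between scores from two administrations of the same instrument to the same respondents. A minimal sketch using the Pearson correlation and made-up scores:

```python
# Minimal sketch: test-retest reliability as the Pearson correlation
# between two administrations of the same instrument (hypothetical scores).
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

time1 = [10, 14, 12, 18, 16]   # scores at first administration
time2 = [11, 13, 12, 19, 15]   # scores at retest
print(round(pearson_r(time1, time2), 3))
```

A correlation near 1.0 suggests the instrument yields stable scores over time; low values suggest the measure, or the trait itself, changed between administrations.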
Rater Reliability
considers differences among scorers of items and controls for variation due to error introduced by rater perceptions.
Attainment
focused on program objectives and goals, which serve as the standards for evaluation.
Decision-Making
based on four components designed to provide the user with the context, input, processes and products with which to make decisions.
Goal-Free
not based on goals; evaluator searches for all outcomes including unintended positive and negative side effects.
Naturalistic
focused on qualitative data and uses responsive information from participants in a program; most concerned with narrative explaining "why" behavior did or did not change.
Systems Analysis
based on efficiency that uses cost-benefits or cost-effectiveness analysis to quantify effects of a program.
Utilization-Focused
done for and with a specific population.
Evaluation Model
Attainment, Decision-making, Goal-free, Naturalistic, Systems analysis, Utilization-focused.
Evaluation Frameworks
developed to summarize and organize the essential elements of a program evaluation. Provide a platform to perform and monitor evaluations.
CDC Six-Step Framework
developed to help guide program evaluation. Steps in Evaluation Practice: Engage Stakeholders, Describe the program, Focus the evaluation design, Gather credible evidence, Justify conclusions, Ensure use and share lessons learned. Standards for Effective Evaluation: Utility, Feasibility, Propriety, Accuracy
Utility
Serve the information needs of intended users.
Feasibility
Be realistic, prudent, diplomatic, and frugal.
Propriety
Behave legally, ethically, and with due regard for the welfare of those involved and those affected.
Accuracy
Reveal and convey technically accurate information.
Experimental Designs
consist of some form of controlled trial
Randomized Controlled Trial
all clusters or participants in the experiment have an equal chance of being allocated to each group of study.
Quasi-Randomized Studies
allocate participation in a study based on some scheme, such as an assigned number -- odd or even.
Non-Randomized Studies
do not use random allocation of participation; groups or individuals are assigned arbitrarily. Also called quasi-experimental studies.
Descriptive Studies
Cross-sectional - describe the occurrence of disease and disability in terms of person, place, and time, using prevalence surveys, surveillance data, and other routinely collected data to describe the phenomena. DESCRIBES, MORE EXPLORATORY, PROFILES CHARACTERISTICS OF GROUP, FOCUSES ON WHAT, ASSUMES NO HYPOTHESIS, REQUIRES NO COMPARISON GROUP
Analytic Design
explain etiology and causal associations. Cohort or case-control. Aim to estimate the strength of a relationship between an exposure and an outcome. EXPLAINS, MORE EXPLANATORY, ANALYZES WHY A GROUP HAS CHARACTERISTICS, FOCUSES ON WHY, ASSUMES A HYPOTHESIS, REQUIRES A COMPARISON GROUP
Descriptive Analysis
exploratory in nature and designed to describe phenomena specific to a population using descriptive statistics such as raw numbers, percentages, and ratios.
Descriptive Statistics
describe what the data reveals. Provide simple summaries about the samples' measures.
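These simple summaries (central tendency and spread) can be computed with Python's standard statistics module; the scores below are made-up illustration values:

```python
# Minimal sketch of descriptive statistics for a sample,
# using Python's standard library (scores are hypothetical).
import statistics

scores = [72, 85, 85, 90, 68, 77, 85, 91, 60, 74]

print("n      =", len(scores))
print("mean   =", statistics.mean(scores))      # arithmetic average
print("median =", statistics.median(scores))    # middle value
print("mode   =", statistics.mode(scores))      # most frequent value
print("stdev  =", round(statistics.stdev(scores), 2))  # sample spread
```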
Continuous Data
have the potential for infinite values for variables.
Discrete Data
are limited to a specific number of values to represent variables.
Nominal Scores
cannot be ordered hierarchically but are mutually exclusive (male and female).
Ordinal Scores
do not have a common unit of measurement between them but are hierarchical.
Interval Scores
have common units of measurement between scores but no true zero.
Ratio Scores
represent data with common measurements between each score and a true zero.
Inferential Statistics
are used when the researcher or evaluator wishes to draw conclusions about a population from a sample. Common examples include significance tests and confidence intervals.
Probability Sample
a random sample, drawn when observations and measurements from the total population would be too costly, not feasible, or unnecessary. Each person in a population of interest has an equal likelihood of selection. NO BIAS; any variation is only a matter of chance. The larger the sample, the more representative it is considered.
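Equal-likelihood selection can be sketched as simple random sampling without replacement; the population names below are hypothetical placeholders:

```python
# Minimal sketch of a simple random (probability) sample: every member of
# the population has an equal chance of selection. Names are hypothetical.
import random

population = [f"person_{i}" for i in range(1, 101)]  # made-up population of 100

random.seed(42)                          # fixed seed so the draw is reproducible
sample = random.sample(population, 10)   # 10 members, without replacement
print(sample)
```

`random.sample` draws without replacement, so no one is selected twice; in practice the sampling frame must list the whole population for the equal-chance property to hold.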
Stratified Sample
divides a population into segments based on characteristics of importance for the research, such as gender, age, social class, education level, or religion.
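A proportionate stratified sample draws randomly from each stratum in proportion to its share of the population, so each segment is represented. A minimal sketch with hypothetical age-group strata:

```python
# Minimal sketch of proportionate stratified sampling.
# Strata and member names below are hypothetical illustration data.
import random

strata = {
    "18-34": [f"young_{i}" for i in range(50)],   # 50 members
    "35-54": [f"mid_{i}" for i in range(30)],     # 30 members
    "55+":   [f"older_{i}" for i in range(20)],   # 20 members
}

random.seed(7)
total = sum(len(members) for members in strata.values())
sample_size = 10
sample = []
for name, members in strata.items():
    n = round(sample_size * len(members) / total)  # proportional allocation
    sample.extend(random.sample(members, n))       # random draw within stratum

print(len(sample), sample)
```

With a 50/30/20 split and a sample of 10, the allocation is 5, 3, and 2 members per stratum, mirroring the population's composition.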
Non-probability Samples
not as representative and are less desirable than probability samples.
Quota Sampling
setting the proportion of strata within the sample.
Convenience Samples
accidental samples; however, they are not random. Samples of volunteers qualify.
Qualitative Approaches
observation/audit; participant observation; document study; interviews; and focus groups. Helps the evaluator or researcher become more experienced with the variables or phenomenon of interest.
Steps involved in qualitative data analysis
Data reduction
Data Display
Conclusion drawing and verification
Data Reduction
selecting, focusing, condensing, and transforming data. The process should be guided by thinking about which data best answers the evaluation questions.
Data Display
creating an organized, compressed way of arranging data. Helps facilitate identifying themes, patterns, and connections that help answer evaluation questions. Usually involves coding, or marking passages in text that have the same message or are connected in some way. An accompanying explanation of what the selected passages have in common is created.
Conclusion Drawing and Verification
the data is revisited multiple times to verify, test, or confirm the themes and patterns identified.
Examine Qualitative Data to Identify
patterns, recurring themes, similarities, and differences; ways in which patterns help answer evaluation questions; deviations from patterns and possible explanations for divergence; interesting or particularly insightful stories; specific language people use to describe phenomena; to what extent patterns are supported by past studies or other evaluations; and to what extent patterns suggest that additional data needs to be collected.
IRB
functions to protect human subjects involved in research. Referred to as an independent ethics committee or a committee that has been formally designated to approve, monitor and review biomedical and behavioral research involving humans.
HIPAA
"Privacy Rule" - establishes the conditions under which protected health information may be used for research or program evaluation. Investigators are permitted to use information for research with individual authorization, or in limited circumstances without individual authorization.
Five Elements that are critical for ensuring use of an evaluation
design, preparation, feedback, follow-up, and dissemination.
Confounding Variables
are extraneous variables outside the scope of the intervention that can impact the results. Variables that affected the results but were not accounted for in the study design.
Research Errors
sampling errors, lack of precision, and variability in measurement.
Systematic Errors
selection bias, instrumentation bias, and other internal threats to validity.
Dissemination
the process of communicating procedures, findings or the lessons learned from an evaluation to relevant audiences in a timely, unbiased, and consistent fashion. Goal is to achieve full disclosure and impartial reporting.
Detailed Documentation
First part = an introduction (front matter and the executive summary). Second part = literature review. Methodology section (the data analysis plan is often described within). Results section. Final portion = conclusions, recommendations, or a summary. MOST LIKELY READ BY STAKEHOLDERS
Policy analysis
the use of any evaluative research to improve or legitimate the practical implications of a policy-oriented program. Carried out when there is still a chance that the policy can be revised.
Health Impact Assessments (HIAs)
used to objectively evaluate the potential health effects of a project or policy before it is developed or implemented. Can provide recommendations to increase positive health outcomes and minimize adverse health outcomes.
Code of Ethics
framework of shared values of the profession that help guide the behaviors of a health education specialist.
Consultation
the process by which the knowledge of one person is used to help another make better decisions.
Informal Consulting
does not require a written agreement or formal contract. Consists of acting as a resource person responsible for organizing health education materials and responding to requests for health education information and literature/materials.
Formal Consulting
requires a contract or written agreement between two parties, the client and consultant. Hired for his/her expertise in a particular area for which the client needs assistance, advice, direction, etc. Follows the steps of diagnosis, recommendation, action, evaluation, and termination. Requires the health education specialist to provide technical expertise, current theory, and specialized knowledge.
Evidence-Based
refers to programs or strategies that have been evaluated and found to be effective.
Health Literacy
the extent to which individuals have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions.
Health Numeracy
the ability to understand the numbers that affect individuals' health care decisions and behaviors.
Primary Data Source
publications of descriptions of research studies or data written by the individual who participated in or conducted the studies.
Secondary Data Source
publications of research studies or data written by an individual who did NOT participate in those studies or data collection.
Tertiary Data Source
publications such as encyclopedias or other compendia that sum up secondary and primary sources. Includes reference tools such as pamphlets or fact sheets.
U.S. Census
offers quality data about the people and economy in the U.S. Primary source for population and health statistics
National Center for Health Statistics (NCHS)
a rich source of information about the health status of the population and monitors trends in health status and health care delivery.
World Health Organization
Located in Geneva, Switzerland. The most recognized international health organization, and provides a variety of health information and data on their website.
Voluntary Health Agencies
organizations that deal with health needs and may rely heavily on donations or volunteers to function.