Chapter 4: Evaluation and Research


1

Delimitation

are decisions made by an evaluator or researcher that ought to be mentioned because they help the evaluator identify the parameters and boundaries set for a study. Examples of delimitations include why some literature is not reviewed, why certain populations are not studied, and why certain methods are not used.

2

Evaluation

is a series of steps that evaluators use to assess a process or program to provide evidence and feedback about the program

3

Limitations

are phenomena the evaluator or researcher cannot control that place restrictions on methods and, ultimately, conclusions. Examples of possible limitations might be time, the nature of data collection, instruments, samples, and analysis.

4

Logic Model

takes a variety of forms but generally depicts aspects of a program such as inputs, outputs, and outcomes. It offers a scaled-down, somewhat linear, visual depiction of a program.

5

Research

is an organized process in which a researcher uses the scientific method to generate new knowledge

6

Reliability

refers to the consistency, dependability, and stability of the measurement process
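
One common way to quantify reliability is an internal-consistency statistic such as Cronbach's alpha. The sketch below is a rough illustration, not from the text; the function name and item scores are hypothetical.

# Hypothetical sketch: Cronbach's alpha as one indicator of internal-consistency reliability.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores (made-up data)."""
    k = items.shape[1]                          # number of items on the instrument
    item_vars = items.var(axis=0, ddof=1)       # variance of each item across respondents
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example with made-up responses (5 respondents, 4 items scored 1-5)
scores = np.array([
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [1, 2, 2, 1],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")  # values closer to 1.0 suggest more consistent items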

7

Validity

is the degree to which a test or assessment measures what it is intended to measure. Using a valid instrument increases the chances of measuring what was intended.

8

Variables

are operational forms of a construct. Researchers use variables to designate how the construct will be measured in designated scenarios

9

Unit of Analysis

is what or who is being studied or evaluated (the individual, group, organization, or program)

10

Formative Evaluation

is a process that evaluators or researchers use to check on an ongoing program, from the planning phase through the implementation phase.

11

Process Evaluation

is any combination of measures that occurs as a program is implemented to assure or improve the quality of performance or delivery

12

Summative Evaluation

is often associated with measures or judgments that enable the investigator to draw conclusions from impact and outcome evaluations

13

Impact Evaluation

focuses on the immediate and observable effects of a program that lead to the desired outcomes

14

Outcome Evaluation

focuses on the ultimate goal, product, or policy and is often measured in terms of health status, morbidity, and mortality

15

Inputs

are the resources, contributions, and other investments that go into a program

16

Activities

are the keystones of the program

17

Outputs

are the activities, services, and products that will reach the participants of a program as a result of carefully leveraging resources through skillful planning

18

Outcomes

are often stepwise and labeled short-term, intermediate, or long-term outcomes.

19

Short-term outcomes

sometimes described as impact, these are quantifiable changes in knowledge, skills, and access to resources that happen if planned activities are successfully carried out

20

Intermediate outcomes

are measured in terms of changes in behaviors related to disease or health status

21

Long-term outcomes

are measured in terms of fundamental changes in conditions leading to morbidity or mortality

22

Purpose Statement

usually a sentence or two written with specificity and detail. It helps evaluators focus and guide efforts involved with data collection and analysis, and it is used to guide the selection and/or creation of program goals.

23

Goal

usually long-term and represents a more global vision (e.g., to reduce morbidity or mortality)

24

Objectives

define measurable strategies used to attain progress toward a goal

25

Evaluation Questions

are precise questions that carefully align with the statement of program operations, intentions, and stakeholders. They help establish boundaries for the evaluation by stating which aspects of the program will be addressed.

26

Process Questions

help the evaluator understand phenomena such as internal and external forces that affect program activities.

27

Long-term evaluation questions

provide vital links between intervention activities, products, and services rendered, and changes in risk factors, morbidity or mortality

28

Well-developed evaluation questions

offer a guide for selecting appropriate data sources, which, in turn, helps to guide an effective analysis plan

29

Quantitative Methods

are focused on measuring things related to health education programs using numerical data to help describe, explain, or predict phenomena

30

Qualitative Methods

are descriptive, with the aim of helping the researcher/evaluator discover meaning or insight

31

Probability Sampling Techniques

are those methods in which each member of the priority population has a known chance, or probability, of being selected.

32

Simple random sampling

an inclusive list of the priority population is used to randomly (such as with a list of random numbers) select a certain number of potential participants from the list

33

Systematic random sampling

an inclusive list of the priority population is used, and starting with a random number, every nth potential participant is selected (such as every 14th participant)

34

Stratified random sampling

the sample is split into groups based on a variable of interest, and an equal number of potential participants from each group are selected randomly (such as in a simple random sample)

35

Cluster sampling

is when naturally occurring groups (such as schools) are selected instead of individuals

36

Multistage cluster sampling

in several steps, groups are selected using cluster sampling (e.g., in a state, counties are selected at random, then schools within each county are selected at random)

37

Stratified multistage cluster sampling

in several steps, a variable of interest is used to split the sample, and then groups are randomly selected from this sample (e.g., in a state, counties are selected at random, then an equal number of elementary schools and secondary schools are randomly selected in each county)
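
To make these probability sampling techniques concrete, here is a rough Python sketch (not from the text); the sampling frame, strata, and sample sizes are all hypothetical.

# Hypothetical sketch of three probability sampling techniques.
import random

frame = [f"person_{i}" for i in range(1, 201)]   # inclusive list of the priority population

# Simple random sampling: randomly pick n potential participants from the full list.
simple_sample = random.sample(frame, k=20)

# Systematic random sampling: start at a random position, then take every nth person.
n = 10
start = random.randrange(n)
systematic_sample = frame[start::n]

# Stratified random sampling: split the frame by a variable of interest,
# then randomly select an equal number of participants from each stratum.
strata = {"urban": frame[:120], "rural": frame[120:]}    # hypothetical strata
stratified_sample = [p for group in strata.values() for p in random.sample(group, k=10)]

print(len(simple_sample), len(systematic_sample), len(stratified_sample))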

38

Non-probability sampling

not all units from the priority population have an equal chance of being selected, and thus their representativeness of the population is unknown

39

Non-probability sampling technique

Convenience

selection of individuals or groups who are available

40

Non-probability sampling technique

Purposive

researcher makes judgments about who to include in the sample based on study needs

41

Non-probability sampling technique

Quota

selecting individuals who have a certain characteristic, up to a certain number (e.g., selecting 50 females from a worksite)

42

Non-probability sampling technique

Network Sampling (also called snowball sampling)

when respondents identify other potential participants who might have desired characteristics for the study

43

Validity

the degree to which a test or assessment measures what it is intended to measure

44

Reliability

refers to the consistency, dependability, and stability of the measurement process

45

Pilot Test

is used to gain insight into whether a data collection instrument consistently measures whatever it should measure.

46

Statement of purpose

is used to clearly and succinctly define the goal of the research project

47

Elements of a purpose statement include the following:

  • Research design (quantitative study) or method of inquiry (qualitative study)

  • Variables (quantitative study) or phenomena under investigation (qualitative study)

  • The priority population

  • Research setting (e.g., university, worksite)

48

Research Question

is an interrogative statement that reflects the central questions the research study is designed to answer

49

Hypotheses

quality research questions are developed in such a way that they can be translated into testable statements

50

Null hypothesis

is a hypothesis of skepticism, in which it is stated that there is no relationship between variables

51

Alternative hypothesis

is a hypothesis in which it is stated that there is a relationship between variables.

It may also be directional, for example, if the research team theorizes that a program may reduce or increase the quantity of a targeted behavior.

52

Quantitative research

inferential statistical tests are used to determine if differences or relationships exist between variables

53

Statistical Test

is a procedure that, when data are fed into it, is used to either reject or fail to reject a null hypothesis
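
As a rough illustration (not from the text), the sketch below feeds made-up program and comparison group scores into one common inferential test, an independent-samples t-test from SciPy, and uses the resulting p-value to reject or fail to reject the null hypothesis of no difference between groups.

# Hypothetical sketch: using an inferential statistical test on made-up data.
from scipy import stats

# Made-up post-test knowledge scores for a program group and a comparison group.
program_group = [78, 85, 90, 72, 88, 95, 81, 79]
comparison_group = [70, 75, 68, 80, 72, 77, 74, 69]

# Null hypothesis: there is no difference in mean scores between the two groups.
t_stat, p_value = stats.ttest_ind(program_group, comparison_group)

alpha = 0.05  # conventional significance level
if p_value < alpha:
    print(f"p = {p_value:.3f}: reject the null hypothesis")
else:
    print(f"p = {p_value:.3f}: fail to reject the null hypothesis")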

54

Fundamental concepts for human subjects protection

  • Respect for persons (protection of individual autonomy and for those who have diminished autonomy)

  • Beneficence (protecting people from harm and working toward enhancing well-being)

  • Justice (equals should be treated equally)

55

Informed Consent

designed to allow participants to choose what will or will not happen to them, and it is signed by participants to indicate their choice

56

Informed consent includes the following information:

  • nature and purpose of the program

  • any inherent risks or dangers associated with participation in the program

  • any possible discomfort that may be experienced from participation in the program

  • expected benefits of participation

  • alternative programs or procedures that would accomplish the same results

  • option of discontinuing participation at any time

57

Institutional Review Board (IRB)

function is to ensure physical and psychological protection of human subjects involved in research

It reviews, approves, and monitors biomedical and behavioral research involving humans

58

Logic Model

can be created and used as an evaluation tool to facilitate evaluation design decisions that will impact or influence the type of data and analysis available
