definition of nursing research
systematic inquiry about nursing practice that includes research, testing, and evaluation
designed to develop or contribute to generalizable knowledge
definition of evidence based practice (EBP)
characterized by best research evidence, clinical expertise, and patient preferences and values
EBP goals
lifelong approach to clinical practice
translates knowledge with the goal of improving practice
impact of nursing research VS. impact of EBP
nursing research: growing knowledge base of nursing
EBP: translate the knowledge into practice
nursing research impact
generates the knowledge base for our nursing practice
EBP impact
translate the knowledge base of that we generated through research into our practice
deductive reasoning
top → down approach
uses general premises to reach specific conclusions
used by most nurse researchers
ex: all patients on bed rest are at risk for pressure injuries; this patient is on bed rest, so this patient is at risk
inductive reasoning
bottom up approach
uses specific observations to form general conclusions
ex: several observed post-op patients who ambulated early recovered faster, so early ambulation may speed recovery in general
what is a PICO question?
for writing a good researchable question for EBP or nursing research
what does each letter in PICO stand for
P - population
I - intervention
C - comparison
O - outcome
how to identify POPULATION (P) in PICO
how would I describe this group of patients
ex: age, gender, geographic location
how to identify INTERVENTION (I) in PICO
which main intervention, management strategy, diagnostic test, etc. am I interested in?
how to identify COMPARE (C) in PICO
is there a control or alternative you would like to compare to the intervention?
how to identify OUTCOME (O) in PICO
what do I hope to measure, accomplish, improve, or affect?
what is sampling?
selecting a subset of the population to participate in the research study
why is sampling important?
for generalizability and external validity
we want our sample to be representative of the population
definition of target population
who we ultimately want our results to apply to
ex: All elementary school students in the nation
definition of accessible population
the subset of people from the target population who we could reasonably enroll in our study
ex: Elementary school students in Pittsburgh, PA
definition of sample
the individuals who meet the criteria and enroll and participate in our study
ex: A randomly selected group of 150 students from the accessible population
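The target → accessible → sample funnel above can be sketched as a quick simple-random-sampling example in Python (the student IDs and population size are hypothetical):

```python
import random

random.seed(7)  # for a reproducible draw

# Hypothetical accessible population: elementary students in Pittsburgh, PA
accessible_population = [f"student_{i}" for i in range(5000)]

# The study sample: a simple random sample of 150 students who enroll
sample = random.sample(accessible_population, 150)

print(len(sample))                      # 150
print(len(set(sample)) == len(sample))  # True: drawn without replacement
```

`random.sample` draws without replacement, so no student is selected twice.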
what affects sampling error?
sample size
heterogeneity/variability
what happens to the sampling error when the sample size increases (increases/decreases)
the sampling error decreases
what happens to the sampling error when variability increases (increases/decreases)
increases
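Both effects above can be demonstrated with a small simulation, assuming a hypothetical population with mean 50 and standard deviation 10: the spread of sample means (the standard error) shrinks as sample size grows.

```python
import random
import statistics

random.seed(1)  # reproducible

# Hypothetical population: mean ~50, standard deviation ~10
population = [random.gauss(50, 10) for _ in range(100_000)]

def spread_of_sample_means(n, trials=300):
    """Estimate sampling error: how much sample means vary
    across repeated samples of size n."""
    means = [statistics.mean(random.sample(population, n))
             for _ in range(trials)]
    return statistics.stdev(means)

small_n = spread_of_sample_means(25)
large_n = spread_of_sample_means(400)
print(f"spread with n=25:  {small_n:.2f}")   # roughly 10/sqrt(25)
print(f"spread with n=400: {large_n:.2f}")   # roughly 10/sqrt(400)
```

The larger sample gives a noticeably smaller spread, i.e. less sampling error.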
definition of sampling error
the difference between our sample statistic and our population parameter (we really want to know population parameter)
do we want the sampling error to be small or large
small, so the sample statistic stays close to the population parameter
point estimate
our single best guess of the unknown population parameter
confidence interval
the amount of uncertainty around the estimate
can be narrow or wide
do we want the confidence interval to be large or small
small (narrow), so the estimate is more precise
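A point estimate and its 95% confidence interval can be computed by hand; this sketch uses a hypothetical sample of 30 systolic blood pressures and the normal z-value 1.96 as an approximation:

```python
import math
import statistics

# Hypothetical sample: systolic blood pressure (mmHg) from 30 patients
sample = [128, 135, 121, 140, 132, 126, 138, 130, 124, 133,
          129, 136, 122, 131, 127, 139, 125, 134, 130, 128,
          137, 123, 132, 126, 141, 129, 135, 124, 131, 127]

point_estimate = statistics.mean(sample)
std_error = statistics.stdev(sample) / math.sqrt(len(sample))
margin = 1.96 * std_error  # z-value for an approximate 95% CI

print(f"point estimate: {point_estimate:.1f}")
print(f"95% CI: ({point_estimate - margin:.1f}, {point_estimate + margin:.1f})")
```

A bigger sample or less variable data shrinks `std_error`, which narrows the interval.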
what does it mean when two groups' confidence intervals overlap
there is no statistically significant difference between the two groups
the treatment and control are equivalent
one is not shown to be better than the other
what does it mean when the confidence intervals do NOT overlap
one group is higher/better than the other
one is superior
there is a statistically significant difference
p < 0.05
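The overlap check can be made concrete by computing 95% confidence intervals for two groups; the pain scores below are hypothetical, with the treatment group scoring clearly lower than control:

```python
import math
import statistics

def ci95(sample):
    """Approximate 95% confidence interval for a sample mean."""
    m = statistics.mean(sample)
    margin = 1.96 * statistics.stdev(sample) / math.sqrt(len(sample))
    return (m - margin, m + margin)

# Hypothetical 0-10 pain scores: treatment vs control
treatment = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3, 2, 4, 3, 2, 3]
control   = [6, 7, 5, 6, 7, 6, 5, 7, 6, 6, 7, 5, 6, 7, 6]

t_lo, t_hi = ci95(treatment)
c_lo, c_hi = ci95(control)

overlap = t_hi >= c_lo and c_hi >= t_lo
print(f"treatment CI: ({t_lo:.2f}, {t_hi:.2f})")
print(f"control CI:   ({c_lo:.2f}, {c_hi:.2f})")
print("CIs overlap" if overlap else "CIs do not overlap: a real difference is likely")
```

Here the intervals sit well apart, matching the "no overlap → statistically significant difference" rule of thumb.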
data collection methods
what types of data would you collect with each method?
definition of reliability
means CONSISTENCY
getting the same thing over and over again
without reliability, we would have no confidence in the data we collect
definition of validity
ACCURACY, TRUENESS
the extent to which the instruments used measure exactly the concept that you want them to measure
types of validity
CONTENT, CRITERION, CONSTRUCT
what is content (type of validity)
involves the degree to which the content of the test matches a content domain associated with the construct
what is criterion (type of validity)
the correlation between the test and a criterion variable taken as representative of the construct
what is construct (type of validity)
the extent to which your test or measure accurately assesses what it's supposed to
types of bias from structured observation and surveys
social desirability
recall bias
response bias
extreme response bias
acquiescence
what is a study design?
a guide for the research process
the blueprint
provides the structure to maintain control in the study
Quantitative research
uses numbers
qualitative research
everything BUT numbers
hierarchy of evidence
study designs ranked by internal validity — how strongly they support cause and effect
randomized controlled trials (RCTs) sit at the top as the strongest evidence
that the independent variable changes the dependent variable
definition of experimental design
the process of carrying out research in an objective and controlled fashion so that precision is maximized
specific conclusions can be drawn regarding a hypothesis
3 required properties of true experimental design
R.M.C.
Randomization
Manipulation
Control
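The randomization property above can be sketched in Python; the participant list is hypothetical, and shuffling lets chance alone decide group membership:

```python
import random

random.seed(3)  # reproducible allocation

# Hypothetical enrolled participants
participants = [f"participant_{i}" for i in range(20)]

# Randomization: chance alone assigns groups, which balances
# known AND unknown confounders between the groups on average
shuffled = participants[:]
random.shuffle(shuffled)
treatment_group = shuffled[:10]  # receives the manipulated intervention
control_group = shuffled[10:]    # receives usual care / placebo

print(len(treatment_group), len(control_group))  # 10 10
```

Manipulation and control are then layered on top: the researcher administers the intervention only to `treatment_group` and holds everything else constant.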
definition of blinding
the concealment of group allocation
keeps groups equivalent in everything but the independent variable
definition of allocation concealment
hides the sorting of trial participants into groups so that this knowledge cannot be exploited
definition of intervention fidelity
participants receive the intervention or instructions exactly as described in the study protocol
definition of intention to treat analysis
analyzing participants in the groups they were randomized to, regardless of whether they completed the study or adhered to the intervention
preserves the benefits of randomization
definition of independent variable
the variable the researcher changes or manipulates in an experiment
the cause factor you are testing to see if it affects something else
definition of dependent variable
the results or the effects you measure in the experiment
what you observe or count to see if it changes when the independent variable changes
definition of Quasi-experimental design
research approach that aims to establish a cause-and-effect relationship, but LACKS random assignment of participants to groups.
what makes quasi-experimental design different from an experimental design
DOES NOT have randomization
participants are placed in groups by any method EXCEPT randomization
might NOT have a control group
definition of confounding
a third variable that influences both the independent and dependent variables, distorting the apparent relationship between them
can be measured and analyzed
can also be unmeasured
definition of bias
any deviation from the truth during the process that can lead to FALSE INFORMATION
strengths of quasi-experimental designs
practical
less expensive
more generalizable
weaknesses of quasi-experimental design
not able to truly test the cause and effect
definition of observational designs
researchers observe and record data about a phenomenon or group without intervening or manipulating any variables, allowing them to study naturally occurring events or behaviors
JUST OBSERVING BEHAVIORS
observational vs. experimental/quasi
observational: NO active manipulation
experimental/quasi: active manipulation
definition of cohort study
start with individuals WITH the exposure of interest AND individuals WITHOUT it, then follow both groups forward to see who develops the outcome
good things about cohort study
calculate incidence, prognosis, natural history
give us temporality
we know the exposure
bad things about cohort study
expensive
time consuming
not good for rare outcomes
definition of case control study
individuals with the outcome of interest and individuals without the outcome of interest are identified
these 2 groups are studied retrospectively to compare the frequency of the exposure
good things about case control study
good fit for rare outcomes b/c they are based on outcomes
less expensive
bad things about case control study
limited to ONE outcome
recall bias
sampling bias
definition of cross-sectional study
a group of people is observed, or certain information is collected, at a single point in time or over a short period of time
good things about cross-sectional study
for prevalence
inexpensive
easy
bad things about cross-sectional study
no temporality
sampling bias
not good for rare outcomes
exposure relative to outcome
the timing of when the exposure is measured relative to when the outcome is measured
timeline terms (3)
prospective - look to the future
retrospective - looking back at time
simultaneous - in the moment
objective of the research definition
concise statement of the specific goals and aims of a research study
definition of internal validity
degree to which change in the dependent variable can be definitely attributed only to the independent variable and not to other variables
definition of external validity
generalizability of findings beyond the study's people and settings
what is a systematic review
studies of studies
uses clearly stated, systematic research methods designed to minimize bias
can be qualitative
what is a meta-analysis
a type of systematic review that statistically combines the quantitative results of multiple studies
how to interpret a forest plot
Each horizontal line on a forest plot represents an individual study with the result plotted as a box and the 95% confidence interval of the result displayed as the line
a study's result is statistically significant when its horizontal line does not cross the middle vertical line (the line of no effect)
3 levels of measurement
nominal
ordinal
interval
what is nominal level of measurement
categories (NO ranking)
ex: sex, blood type, eye color
what is ordinal level of measurement
categories AND ranked (ranking not equal intervals)
ex: scale → strongly agree, agree, disagree
what is interval level of measurement
ranked categories with equal, known intervals between values (ex: temperature in °F)
types of error
type 1
type 2
type 1 error means
false positive
reject a true null hypothesis
type 2 error means
false negative
fail to reject a false null hypothesis
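The type 1 error rate can be seen directly in a simulation: if the null hypothesis is actually true, a test at alpha = 0.05 should flag a "significant" result about 5% of the time. This sketch uses a simple two-sided z-test (a normal approximation, for illustration only) on hypothetical data:

```python
import math
import random
import statistics

random.seed(42)

def z_test_p(sample, mu0=0):
    """Two-sided z-test p-value for H0: population mean == mu0
    (normal approximation, illustration only)."""
    n = len(sample)
    z = (statistics.mean(sample) - mu0) / (statistics.stdev(sample) / math.sqrt(n))
    # two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# The null hypothesis is TRUE here (the true mean really is 0),
# so every "significant" result is a false positive (type 1 error)
trials = 2000
false_positives = sum(
    z_test_p([random.gauss(0, 1) for _ in range(50)]) < 0.05
    for _ in range(trials)
)
false_positive_rate = false_positives / trials
print(f"type 1 error rate: {false_positive_rate:.3f}")  # close to alpha = 0.05
```

A type 2 error is the mirror image: the true mean differs from `mu0`, but the test fails to reach p < 0.05.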
statistical significance VS. clinical significance
Statistical significance : indicates the likelihood that an observed effect isn't due to chance
clinical significance : assesses the practical importance and real-world impact of that effect on patient care
definition of emergent design
the initial plan is NOT tightly described; the design emerges as data are collected
definition of reflexivity
the researcher is an instrument, so the researcher needs to be aware of bias and experiences
definition of purposive sampling
not trying to be generalizable, BUT deliberately selecting individuals who can best answer the research question
definition of data saturation
how we get our sample size
keep sampling until no new information or themes emerge
definition of triangulation
using multiple sources to corroborate evidence to increase accuracy
definition of memoing
researchers record reflective notes about the data as they observe and analyze
definition of bracketing
the researchers setting aside their bias and interpretation
definition of coding
putting texts into themes and common ideas
definition of phenomenology
aimed at understanding the lived experience of individuals
definition of grounded theory
seeks to develop a theory grounded in data, focusing on social processes and interactions