Scientific Method
logical and systematic methods to explore research questions; a set of procedures to generate new knowledge
General Approaches to the scientific method: Empiricism vs Intuition & Skepticism vs Cynicism
Empiricism: having data to back up your claims
Skepticism: skeptics follow the data and don't adopt an opinion until the data speaks
Cynicism: cynics can't be convinced no matter how much data is presented
How do we think like a researcher?
1) Recognize the complexity of science
2) Have faith in measures, procedures, peers, and participants
3) Be wary of causal assertions and the difference between correlation and causation
Correlation
The degree to which two factors change with one another
Causation
The degree to which a change in one factor produces a change in another
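Correlation can be computed directly; here is a minimal Python sketch with made-up data (hours_studied and exam_scores are invented for illustration):

# Pearson's r: quantifies how strongly two factors change with one another.
hours_studied = [1, 2, 3, 4, 5]
exam_scores = [55, 60, 70, 72, 85]

n = len(hours_studied)
mean_x = sum(hours_studied) / n
mean_y = sum(exam_scores) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(hours_studied, exam_scores))
sd_x = sum((x - mean_x) ** 2 for x in hours_studied) ** 0.5
sd_y = sum((y - mean_y) ** 2 for y in exam_scores) ** 0.5
r = cov / (sd_x * sd_y)
print(f"r = {r:.2f}")  # ~0.98: a strong correlation, but by itself no proof of causation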
Hypotheses
- Tentative and testable explanation for an event or phenomenon
- Prediction about the relationship between variables and associated outcomes
Where do hypotheses come from?
Exceptions to the rules, how someone would act in a given situation, new solutions to old problems, alternative explanations for an event, elaborations on other theories
What is a concept?
A clearly defined operationalization of research constructs (things such as attitudes or feelings, events such as behaviors or interventions, or relationships such as correlations between variables).
e.g., the definition of sadness
What is observation in the scientific method?
The process of assessing and recording aspects of the phenomena under investigation
must be systematic and controlled
What are the different types of instruments and measurements in the scientific method? How do they differ?
1) Physical vs Psychological
Physical: we use physical measurements when the construct under investigation has observable physical attributes (heaviness, length, distance)
Psychological: when the construct is invisible (e.g., sadness, depression, or anxiety); such constructs are much harder to measure
2) Accuracy and Precision
Precision: how much random error there is; the less error, the more precise
Accuracy: the difference between what we are trying to measure and what the measuring tool is telling us
(e.g., multiple darts clustered together away from the bullseye are very precise but not accurate). We want our measures to demonstrate both accuracy and precision
3) Validity and Reliability
Validity: Truthfulness (is our measurement actually measuring what we are trying to assess?)
Reliability: replicability or consistency
4)Sensitivity and Specificity
Sensitivity: tells you when something is there (ability to designate a positive)
Specificity: tells you when something is not there (ability to designate a negative); a small worked example follows below
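A minimal sketch of how these are computed, using made-up counts from a hypothetical screening test:

# Sensitivity and specificity from a hypothetical 2x2 screening table (all counts invented).
true_positives = 45    # test says "present" and the condition is present
false_negatives = 5    # test says "absent" but the condition is present (a miss)
true_negatives = 90    # test says "absent" and the condition is absent
false_positives = 10   # test says "present" but the condition is absent (a false alarm)

sensitivity = true_positives / (true_positives + false_negatives)  # ability to designate a positive
specificity = true_negatives / (true_negatives + false_positives)  # ability to designate a negative
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")  # 0.90, 0.90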
Reporting observations vs reporting conclusions and inferences
reporting observations: telling the audience what our data says. fact-based (the methods, results, and stats)
reporting conclusions and inferences: the stories we tell based on our data (i.e., the discussion)
The discussion of a research paper would be an example of ______ and the methods, results, and stats would be an example of ____
Discussion is an example of conclusions and inferences, the methods, results, or stats is an example of observations
What are the goals of research?
1) Exploration
2) Description
3) Prediction
4) Explanation/Understanding
5) Action/Effecting Change
Goals of research: Exploration
Research designed to determine whether a phenomenon exists
(Looking at public schools to see the rate of bullying)
Goals of research: description
Research that examines a phenomenon to characterize it and differentiate it from other phenomena
(Seeing bullying in public school vs private)
Goals of research: prediction
Research designed to identify relationships between variables, allowing knowledge of one to facilitate estimations on the other
(Seeing if there is a relationship between the prevalence of bullying and type of schools)
Goals of research: explanation/ understanding
Research that examines cause and effect relationships between variables
(Testing whether the type of school causes differences in bullying levels)
Goals of research: action/effecting change
Research conducted to solve a social problem
(See if there are ways to stop bullying in schools)
Steps to getting started in research
1) Pick a topic
2) Develop hypotheses
3) Review the literature (determine what progress has already been made and build upon it, identify inconsistencies and limitations in previous work, search for practical applications of theoretical notions, be wary of existing research)
4) Question the importance of the research. Is it actually important or just interesting?
5) Study Design and Development
Research Ethics
Concern for balancing a researcher's right to study a phenomenon with participants' right to be protected from abuse
Whose responsibility is ethical research?
the researcher, the IRB, the institution where it's done, the participants, etc
Nuremberg Trials and significance
military trials held after WW2 to prosecute those who participated in the Holocaust
23 doctors were charged with crimes for performing medical experiments on concentration camp inmates without their consent. The doctors murdered, tortured, and were extremely brutal to the inmates
A lot of rules and regulations for ethical research stemmed from this.
Nuremberg Code (1947)
Set of ethical principles for human-based research. There had to be:
1) voluntary consent
2) benefits must outweigh any risks
3) participants must be able to withdraw at any time
What were some unethical research studies that took place even after the Nuremberg code?
Willowbrook
Harvard "Breaking Study"
Jewish Chronic Disease Hospital (1960s)
Milgram Obedience Study
Stanford Prison Experiment
Humphreys Sexual Behavior Study (1970)
Tuskegee Syphilis Study
Willowbrook study (1956-1970)
developmentally delayed children were deliberately infected with hepatitis to study potential treatments
-No consent
-Benefit does not outweigh the risk
-Couldn't withdraw
Harvard "Breaking" study
The government authorized this study to examine why soldiers who came back to the US after the Korean War appeared brainwashed. Researchers berated participants for a year to see if they would "break"
-deception (consent)
Jewish Chronic Disease Hospital (1960s)
Live cancer cells were injected into 22 senile patients
-consent, withdraw, benefits?
Milgram Obedience Studies
study of the phenomenon of obedience to an authority figure; examined the effects of punishment on learning (shock treatment for mistakes; 65% administered dangerously high shocks when ordered)
Stanford Prison Experiment
Philip Zimbardo randomly assigned college students to either be prisoners or guards and watched how they became the characters they were assigned; it lasted only 6 days
Humphreys Sexual Behavior Study (1970)
Wanted to study men who engaged in anonymous sexual encounters in public restrooms; tracked them down via their license plates and pretended to be a health service provider to get more info
Problem: deception; participants did not give consent or even know they were in a study
Tuskegee Syphilis Study (1932-1972)
399 African American sharecroppers who had syphilis were never told of their condition or offered treatment, even after effective treatments were developed; many died. The US government ordered this study before the Nuremberg Code, and it continued long after the Code was adopted; the government argued the Code did not apply since the study started before it
The National Research Act (1974)
Outlined the process by which research was to be conducted, since the Nuremberg Code was not being adhered to. Led to the Belmont Report and mandated that all federally funded research institutions have an IRB and an Institutional Animal Care and Use Committee (IACUC)
The Belmont Report
Called for these 3 things in research:
1) Respect for Persons: "participants" instead of "subjects"
2) Beneficence: minimize harm in research
3) Justice: be mindful of the power imbalance (the people being studied are usually minorities, students, etc)
Institutional Review Board (IRB): what is it, and who must be included?
Responsible for evaluating all human research to ensure the rights of participants are protected. Has to have:
- at least 5 members of varying backgrounds
- at least one non-scientist
- at least one community member
- at least one expert on any involved vulnerable population (these people are not sanctioned members of the IRB, so they can't vote but can give recommendations)
Why is the IRB so nitpicky?
Because each approval of research is a potential lawsuit: if the IRB grants approval, the researcher cannot be sued, but every single member of the IRB can be
In what situation would you not need IRB approval?
If your research is not federally funded (e.g., research done by private companies). Even then, unethical research can still lead to lawsuits; e.g., one school where someone did unethical research had every future study shot down, no more dissertations, etc.
How do you determine risks?
Evaluate the costs and benefits to both the individuals and society
Types of risks involved in research?
1) Physical and Psychological
- psychological risks of studies (trauma, stress, induced mood states)
2) Social and Legal
social risks: studying socially sensitive topics like race, political ideology, or minority groups
legal risks: private client information, academic performance, asking people to reveal illegal or criminal histories, study acts as a catalyst for illegal behavior
3) Risk of not conducting the research
What is minimal risk?
A level of risk that is no more than what one would expect to encounter in everyday life (e.g., if a survey causes eyestrain, that is no more than what participants would experience in everyday life)
Minimal risk varies with ______
the characteristics of the population (e.g., the risk for a geriatric population might not be the same as for a younger population)
How do you deal with risk in a study?
1) recognize that it's there
2) confidentiality/anonymity (different levels of confidentiality; confidentiality is different from anonymity because with confidentiality the researcher knows participants' identities but won't share them, while with anonymity not even the researcher knows)
3) reduce fear of nonparticipation: making sure participants know they can withdraw at any point
4) be upfront about what is and isn't being measured: tell participants as much and as early as you can
5) debriefing: telling participants everything you were doing after the study & giving participants one more chance to withdraw.
6) providing resources: provide resources to deal with the effects of the study (i.e., if you are studying depression and worried that your tests might trigger a depressive episode)
What is incomplete disclosure?
The researcher doesn't tell participants anything that is untrue; they just don't tell participants the whole story, withholding some parts in order to reduce bias
(Telling people you are looking at social attitudes when you are actually looking at racism)
In what case would you use deception in a study?
When telling the participants what's happening will change how they respond.
What is a risk to not telling participants what is and what isn't being studied?
The data might be compromised b/c participants are trying to figure out what you're studying the whole time
Dealing with Risk: Informed Consent/ Study Information Sheets
A declaration of what the research entails, including any risks involved in participating and any factors that may influence willingness to participate
Includes:
- purpose of study
-what they will be asked to do
-time requirements
-potential risks and benefits
-limits to confidentiality/anonymity
-resources for dealing w/ negative repercussions
-participation is voluntary, they can withdraw at any time
-contact information
-compliance manager
-justification of the remaining risks (risk/benefit ratio)
Dealing with Risk: Deception and Debriefing
Deception vs Incomplete Disclosure (deception is lying, incomplete disclosure is not lying but not giving the whole story)
Deception is appropriate if participants' knowledge of the study might change how they respond
Dangers of overuse: if deception becomes routine, participants come to expect it, undermining trust in researchers and the quality of future data
Debriefing
"taking care" of participants after study by:
informing participants of any deception, educating participants about the research, providing rationale for study and related research, providing resources for any potential negative consequence of participation
Examples of unethical reporting of research
1) Scientific Fraud: "fudging" the numbers to lean a certain way
2) Inaccurate Reporting: violates rules of scientific progress (i.e., presenting data in a way that is misleading, "cherry picking" findings)
3) Unethical assignment of publication credit, authorship order, and acknowledgments when someone helps with your research
4) Plagiarism: presenting the work, words, and ideas of others as your own
Ethical Research in 5 easy steps!
1) Find out all the facts: what will be involved, who are the participants
2) Identify any ethical concerns (vulnerable populations, state/federal guidelines)
3) Determine what is at stake (who are the stakeholders-- participants, society, university, etc)
4) Explore alternative methods: is there a way to pursue this with less risk? Is it ethical to not conduct the work?
5) Decide on a course of action: have you done your best to be ethical?
What is risky is often determined by the ______
population of study
Why are synonyms important for finding more relevant research?
Synonyms can retrieve more relevant results; use subject-specific language
What are some search tricks when looking for articles?
-Asterisk: Find all the various endings of a word (religio* would find religion, religiosity, religious)
-AND: Use between concepts to narrow your search
-Quotes: search for the exact phrase "Disruptive mood dysregulation disorder"
-Parentheses and ORs: group concepts and search for multiple synonyms at once, which expands the search
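Putting these together, a hypothetical search string (terms invented for illustration) might look like:

(bully* OR "peer victimization") AND (adolescen* OR teen*) AND "private school"

The asterisks catch word endings, OR within each set of parentheses expands a concept, AND narrows across concepts, and quotes lock in exact phrases.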
Primary vs Secondary Sources
Primary: author creates new knowledge through experiments, data collection, observation, etc (quantitative and qualitative studies)
Secondary: author analyzes other authors' research (literature reviews, systematic reviews/meta-analyses)
Examining for research integrity
- is rationale/novelty of the work explained?
-is the literature review complete / were certain articles excluded due to researcher bias?
-is the methodology sound? can it be replicated?
-are the results reasonable given methodology?
-are conclusions consistent with the results?
-does the reference list include seminal works?
for journals, see if it is peer reviewed or affiliated with a well-known professional society
Seminal studies
Foundational works that are important to include in your research; they are more likely to be older and cited in lots of articles by other authors
Difference between backward and forward citation chaining
Backward: Look at the citations that are listed in your source
Forward: If you find an article, see who else it is cited by and look at their work
When do you stop searching for more research?
When:
- adding new terms retrieves no new relevant results or decreases relevancy
- removing terms eliminates relevant results
- all known articles are retrieved in the search
- a review of your search strategy by librarians or peers does not identify significant revisions or new relevant material
Research Design
A plan to answer a research question
Blueprint of a study
Research Methods
A strategy used to implement that plan
Specifications used to "build" what is described in the blueprint
Designs where data is collected from the participant(s)
Surveys
Interviews
Field Research
Case Studies / Small-N Research
Designs where the data is not collected from study participant(s)
Literature Reviews
Meta-Analyses
Archival Studies
Indirect Observational Methods
Survey Research Designs
Cross-sectional
Longitudinal
Successive Independent
Cross-Sectional Design
One or more samples drawn from the population at the "same" time
PRO: Less time consuming, easier
CON: Can't measure change in the population; cohort effects (the cultural and sociological influences on a specific cohort may confound the results of the study)
(Studying the prevalence of depression in Americans)
Longitudinal Design
The same sample of participants is surveyed on multiple occasions
PRO:Can track change over time
CON: Respondent mortality (attrition), i.e., people leaving the study; expensive
(Studying child development from age 0-15)
Successive Independent Samples Design
Multiple samples drawn from the population over time
CON: Noncomparable samples (successive samples may differ in composition, so observed change may reflect the samples rather than real change in the population)
Questionnaires
Survey Responses- Assumption that people are willing to provide truthful and accurate responses
Social Desirability - answering in the ways you are expected to / want to be seen
Self-report scales
allows people to provide accounts and assessments of their own perspectives
Data that can be measured via surveys
• Demographic Variables -General characteristics of people who complete a survey
• Preferences and Attitudes - A favorable or unfavorable reaction toward something
• Knowledge - Objectively verifiable information
• Past Behaviors - Actions that have already been performed
Considerations for survey research
• Most survey research is done online
• Potential participants are invited to visit website and complete a questionnaire
Advantages of surveys
Cost effective
time efficient
Confidential
Disadvantages of Surveys
Response bias - only those with strong opinions will respond
Response rates - you cast a wide net but may not get many responses
Personal Interviews
Trained interviewers administer a questionnaire
Schedule, Focused interview, Nondirective interview
Schedule
A survey instrument that is essentially a structured interview
-Very specific set of questions, not very in depth
Focused Interview
Interviewer "molds" the data collection process around certain key questions with flexibility on any individual questions
-Open ended questions, better flow of conversation
Nondirective Interview
Interviewer encourages the respondent to discuss a general topic, but provides little direct guidance
-"talking points"
Advantages of Personal Interviews
-allows for flexibility
-can be very thorough
Disadvantages of Personal Interviews
-Costly
-Respondent Reactivity (Social desirability)
-Interviewer Bias (objectivity)
Field Research
• Collection of research methods that include direct observation of naturally occurring events
• Observational Research
• Events are witnessed or recorded as they unfold
• Emphasis is on natural events
Types of Field Research: Observation without Intervention
[Complete Observer]
Researcher does not participate in the event being observed, and the participants are unaware of the researcher's presence.
(Observing traffic flow of people in aquarium)
- Observation in a natural setting without any attempt by the researcher to intervene
Types of Field Research: Observation with Intervention [Observer-as-participant]
Researcher is known to the participants, but interacts with them as little as possible
-silent treatment
-military personnel
Types of Field Research: Observation with Intervention [Participant-as-observer]
Researcher is known to and engages with the participants as a neutral third party
- guy researching mafia gangs
Types of Field Research: Observation with Intervention [Complete Participant]
Researcher is fully integrated into the group, which may or may not be aware of the researcher's role
(Undercover boss)
Types of Field Research: Structured Observation
The researcher intervenes in order to cause an event or create a setting where events can be more easily recorded
-Dropping papers in the middle of Loyola campus
-looking at change blindness
Types of Field Research: Field Experiments
Researchers create and manipulate variable(s) in a natural setting to see their effect on behavior
- two example situations: (1) "the dean wants to see you ASAP" vs. (2) "the dean wants to see you whenever"
- had a confederate be hurt and watched to see who would stop to help
Types of Field Research: Structured Observation and Field Experiments
Both designs are used when something is difficult to study in a natural setting and often make use of confederates
Dangers of Observational Research
• Reactivity - Changes in behavior resulting from awareness of being observed (which can change what is being observed)
• Demand Characteristics - People try to determine what the researcher is looking for and behave consistently with it (people-pleasing)
• Methods of reducing reactivity
(Habituation and Desensitization)
• Observer Bias - Systematic errors in observation resulting from an observer's expectations (people see what they want to see, or code data in a way that supports their hypothesis)
• Expectancy Effects - What the observer expects a particular behavior to look like in a given situation
• Methods of reducing observer bias
(Blind and Double-Blind Studies) (Eliminate the observers altogether...)
Literature Reviews
• Critical analysis and integration of existing research
• Not just a series of short book reports
• Need to identify themes, outcomes, gaps, and trends in what has been published in order to synthesize something new
Meta-Analysis
• Statistical combination of the results of several different studies investigating the same thing
• Not the same thing as a literature review
• Common dependent measures are identified
• Magnitude of effects of interest are standardized into a common metric
• An aggregate is computed that reflects the overall effect across all studies
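A minimal sketch of that aggregation step, assuming simple fixed-effect (inverse-variance) weighting; all effect sizes and variances below are invented:

# Pool standardized effects across studies, weighting each by the inverse of its
# sampling variance so that more precise studies count more.
effects = [0.30, 0.55, 0.42]    # standardized effect sizes (e.g., Cohen's d), made up
variances = [0.04, 0.09, 0.02]  # sampling variance of each effect, made up

weights = [1 / v for v in variances]
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
print(f"pooled effect = {pooled:.2f}")  # ~0.40 across the three hypothetical studies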
Advantages of Literature Review and Meta-Analysis
• Data is already collected
• Higher statistical power
• Greater generalizability
• Helps control for variation between studies
• Helps identify publication biases
Disadvantages of Literature Review and Meta-Analysis
• Very time consuming
• Information overload
• Quality control of underlying studies
• Compatibility of dependent measures
• Inherently correlational
• File-Drawer Problem - research that never gets published
• Researcher bias - people study what they want to study
Archival Research
• Research based on existing information
(Data was generated before the research began)
(Participants not directly involved)
• Examples: existing databases (census data, school records, legal documents, media such as newspapers and television)
Archival Research: Content Analysis
Identifying the presence and prevalence of particular themes or events
• Consistency and objectivity of observations and inferences
• Sampling
- qualitative data based
Interpretation and coding of observable behaviors:
• Manifest Content - Information under investigation is literal and directly observable
("You should brush your teeth")
• Latent Content - Information under investigation is inferred from observable literal material
("Oral hygiene is important")
Archival Research: Existing Data Analysis
• Research methodology that involves the examination of information obtained in the context of previously conducted research
- quantitative data based
• Raw data
• Summary data
• Analytical findings
Archival research advantages
• Ease and (sometimes) cost of research
• No participant reactivity (the data already exist, so participants can't react to being studied)
• Natural treatment effects- trends over time
Archival Research disadvantages
• Selecting proper / unbiased units of analysis
• Making proper / unbiased observations
• Data quality
(Selective deposit - not everything gets recorded vs. Selective survival - not everything recorded survives, e.g., the archive got damaged)
("File Drawer Problem")
• Generalizability - may change depending on the population
(Ecological Fallacy - drawing conclusions about individuals from group-level data, applying findings more broadly than they hold)
(Reductionism - raw data reduced to summary data, with loss of information)
Indirect Observational design (field research)
• The researcher does not intervene in the situation in any way
• Behaviors and related attitudes are inferred from 'clues' of previous behaviors
Physical Traces - e.g., erosion of floor tiles showing foot-traffic patterns
(Use Traces - wear and tear on an object)
(Products - the things people make)
Variables
A factor that can take on different values
Independent variable
A factor that systematically varies; Studied to determine its impact on another variable
- the factor that is varied
Research design (methods) is often a matter of what kind of independent variable(s) are involved
Dependent Variable
Factor under investigation; Measured to evaluate impact of an independent variable
- changes because of the IV
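A minimal sketch of the IV/DV relationship with invented data (condition is the manipulated IV, score the measured DV):

# The IV is varied by the researcher; the DV is measured to evaluate the IV's impact.
conditions = [0, 0, 0, 1, 1, 1]    # IV: 0 = control, 1 = treatment (hypothetical)
scores = [60, 62, 58, 70, 73, 68]  # DV: one measurement per participant (hypothetical)

control = [s for c, s in zip(conditions, scores) if c == 0]
treatment = [s for c, s in zip(conditions, scores) if c == 1]
print(f"control mean = {sum(control) / len(control):.1f}")        # 60.0
print(f"treatment mean = {sum(treatment) / len(treatment):.1f}")  # 70.3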
Continuous variable
A variable that represents magnitude or position on a scale
- e.g., height in relation to eye-hand movement