Sociological Research: 2.1–2.3 Quick Notes

2.1 Approaches to Sociological Research

  • Sociology uses either the scientific method or an interpretive framework to study societies and social interactions.
  • Key goals:
    • Gather empirical evidence to reduce subjectivity and bias.
    • Use peer review to validate conclusions.
  • Core process: start with a question about how/why things happen; choose a design rooted in either a scientific approach or an interpretive framework.
  • The Scientific Method
    • Involves developing and testing theories based on empirical evidence.
    • Emphasizes systematic observation, objectivity, critical thinking, skepticism, and logical reasoning.
    • Typically described as having six steps:
    • Step 1: Ask a Question or Find a Research Topic
    • Step 2: Review the Literature/Existing Sources
    • Step 3: Formulate a Hypothesis
    • Step 4: Design and Conduct a Study
    • Step 5: Draw Conclusions
    • Step 6: Report Results
    • Outcomes provide explanations of behavior and insight into cultures, rituals, beliefs, and trends.
    • In sociological research, study questions focus on social characteristics and outcomes (e.g., well-being, cohesion, wealth, crime).
  • Interpretive Framework
    • Seeks to understand social worlds from the viewpoints of participants.
    • Descriptive or narrative findings, not strict hypothesis testing.
    • May involve direct observation and storytelling; methods can adapt during the study.
  • Reliability, Validity, and Operational Definitions
    • Reliability: likelihood that results can be replicated with the same methods/tools.
    • Validity: whether the study measures what it intends to measure.
    • Operational Definition: concrete, observable criteria for measuring a concept.
    • Operationalization ensures consistency and comparability across researchers.
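As a minimal sketch, operationalization can be thought of as turning an abstract concept into a concrete, checkable rule; the concept, threshold, and function name below are hypothetical, chosen only to illustrate the idea.

```python
# Sketch: operationalizing the abstract concept "socially active" as a
# concrete, countable criterion. The threshold (2 meetings/month) is a
# hypothetical choice a researcher would have to state and justify.
def is_socially_active(club_meetings_per_month: int) -> bool:
    """Operational definition: attends at least 2 club meetings per month."""
    return club_meetings_per_month >= 2

print(is_socially_active(3))  # True
print(is_socially_active(0))  # False
```

Stating the rule this explicitly is what lets different researchers measure the same concept consistently.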
  • Hypotheses and Variables
    • Hypothesis: educated guess about the relationship between phenomena.
    • Independent Variable (IV): cause; Dependent Variable (DV): effect.
    • Example: “If unemployment increases, then crime increases” (IV = unemployment; DV = crime rate).
    • Common relation form: IV influences DV; researchers identify both in advance.
  • Designing and Conducting a Study
    • Designs should maximize reliability and validity; plan to minimize bias.
    • Consider generalizability and limitations of data.
  • Drawing Conclusions and Reporting Results
    • Data analysis leads to conclusions about theory or policy.
    • Even when hypotheses are not supported, findings contribute to knowledge.
  • Interpretive Sociology vs. Scientific Method
    • Interpretive: depth, context, subjective experiences; less emphasis on generalizable results.
    • Scientific: seeks generalizable patterns and testable hypotheses.
  • Critical Sociology
    • Deconstructs how power, class, race, gender shape research and theory.
    • Emphasizes that research is not value-free; seeks liberation from inequality.

2.2 Research Methods

  • Four main kinds of research methods in sociology:
    • Surveys
    • Field research
    • Experiments
    • Secondary data analysis
  • Surveys
    • Collect data from responses to questions (questionnaires or interviews).
    • Can yield quantitative (numerical) data and qualitative (open-ended) data.
    • Important to use a representative population sample (random sampling).
    • Instruments: questionnaires or interview guides; anonymity can improve honesty.
    • Strengths: can reach large samples; captures what people say they think/feel.
    • Limitations: may not reflect actual behavior; response rates and question design matter.
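Random sampling, noted above as key to a representative survey, can be sketched in a few lines; the population list here is hypothetical.

```python
# Minimal sketch of simple random sampling: every member of the
# population has an equal chance of being selected.
import random

random.seed(42)  # fixed seed so the draw is reproducible
population = [f"resident_{i}" for i in range(1000)]  # hypothetical sampling frame
sample = random.sample(population, k=50)             # draw 50 without replacement

print(len(sample))       # 50
print(len(set(sample)))  # 50: no member selected twice
```

In practice the hard part is not the draw but building a complete sampling frame, since anyone missing from the list has zero chance of selection.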
  • Field Research
    • Data gathered in natural environments outside a lab or library.
    • Methods include:
    • Participant observation
    • Ethnography
    • Case study
    • Strengths: rich, real-life information; context and meaning.
    • Limitations: small samples; harder to infer causation; data are often qualitative.
    • Field research emphasizes observing how people think and behave in natural settings.
    • In practice, researchers balance overt vs. covert presence; caution about ethical concerns.
  • Ethnography and Institutional Ethnography
    • Ethnography: immersion in a social setting to understand everyday life and culture from insiders’ perspectives.
    • Institutional ethnography (Dorothy E. Smith): analyzes everyday relations, focusing on power structures and women's experiences; feminist-oriented approach.
  • Middletown: A Study in Modern American Culture (Lynds)
    • Classic ethnographic case showing how industrialization/urbanization reshaped a typical U.S. town (Muncie, Indiana).
    • Demonstrates real-world ethnography influencing public understanding of social change.
  • Case Study
    • In-depth analysis of a single event/person/group.
    • Pros: deep insight; cons: limited generalizability.
    • Useful for unique cases (e.g., feral children like Oxana Malaya).
  • Experiments
    • Test hypotheses about cause-and-effect relationships.
    • Two main types: lab-based experiments and natural/field experiments.
    • Lab experiments: high control; potential Hawthorne effect (participants alter behavior because they know they’re being studied).
    • Field experiments: conducted in real-world settings; more external validity but less control.
    • Design elements: experimental group vs control group; manipulation of IV; measurement of DV.
    • Example: Heussenstamm (1971) bumper sticker study on police stops; the study ended early as funds ran out and participants dropped out under mounting citations.
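The experimental-group/control-group design element above can be sketched as random assignment; the participant labels are placeholders.

```python
# Sketch: random assignment of participants to experimental vs. control
# groups, the core design element of an experiment.
import random

random.seed(7)  # fixed seed for a reproducible assignment
participants = [f"participant_{i}" for i in range(20)]  # placeholder names
random.shuffle(participants)

experimental_group = participants[:10]  # receives the treatment (IV manipulated)
control_group = participants[10:]       # no treatment; baseline for comparison

print(len(experimental_group), len(control_group))  # 10 10
```

Random assignment is what lets differences in the DV be attributed to the IV rather than to preexisting differences between the groups.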
  • Hawthorne Effect
    • Behavior changes due to awareness of being observed.
  • Secondary Data Analysis
    • Uses preexisting data (government stats, archival records, etc.).
    • Pros: cost-effective; allows re-interpretation; nonreactive research.
    • Cons: data may not fit the new research question; accessibility and accuracy concerns.
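One simple, nonreactive form of secondary analysis is content analysis: counting themes in preexisting text. A toy sketch follows; the "archival" passage is invented for illustration.

```python
# Toy content analysis: count keyword frequencies in preexisting text.
# The passage below is invented; a real study would use archival
# documents, news coverage, transcripts, etc.
from collections import Counter
import re

archival_text = """
The town council debated crime and unemployment. Residents linked
rising crime to unemployment, while others blamed crime on policing.
"""

words = re.findall(r"[a-z]+", archival_text.lower())
counts = Counter(words)

print(counts["crime"])         # 3
print(counts["unemployment"])  # 2
```

Because the text already exists, counting it does not influence anyone's behavior, which is what makes the method nonreactive.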
  • Strengths and Challenges by Method (summary ideas)
    • Survey: broad reach; limited behavior capture; relies on self-report.
    • Field: rich context; less generalizable; time-intensive.
    • Experiment: causal inference; ethical considerations; Hawthorne effect.
    • Secondary data: efficiency; potential misalignment with new questions; data quality concerns.

2.3 Ethical Concerns

  • Core principle: researchers must avoid harming participants; protect privacy and dignity; obtain informed consent; ensure confidentiality.
  • Value neutrality and objectivity
    • Max Weber argued for value neutrality in interpretation; many sociologists question complete objectivity but strive to minimize bias.
  • American Sociological Association (ASA) Code of Ethics
    • Maintain objectivity and integrity; respect privacy; protect participants from harm; preserve confidentiality; seek informed consent; acknowledge assistance; disclose funding.
  • Notable unethical studies (summaries)
    • The Tuskegee Syphilis Study (1932–1972): misled participants about diagnosis and treatment; denied penicillin when available.
    • Henrietta Lacks (1951): HeLa cells used without consent; contributed to medical advances.
    • Milgram Obedience Studies (1961): extreme stress and deception in obedience experiments.
    • Stanford Prison Experiment (1971): harm to participants; questions about ethical treatment and validity.
    • Laud Humphreys Tearoom Trade (1960s): covert observation; misrepresented identity; serious ethical concerns.
  • Nonreactive vs reactive research
    • Nonreactive (unobtrusive) data: does not involve direct contact or influence behavior (e.g., secondary data, content analysis).
  • Ethics in funding and publication
    • Researchers must disclose funding sources and avoid conflicts of interest.
  • Practical questions for ethics
    • What kinds of studies risk harming participants? What safeguards are necessary? How can researchers balance knowledge gains with participant protections?

Key Terms

  • accuracy – the extent to which a measurement reflects the true concept.
  • case study – in-depth analysis of a single event, situation, or individual.
  • code of ethics – ASA guidelines for ethical research and responsible scholarship.
  • content analysis – systematic analysis of secondary data to extract relevant information.
  • correlation – a relationship where two variables move together, not necessarily causation.
  • debunking – exposing falseness or flawed reasoning.
  • dependent variables – the outcome that is measured.
  • empirical evidence – data gathered through direct observation, measurement, or experimentation.
  • ethnography – immersive study of a social group from inside its environment.
  • experiment – the testing of a hypothesis under controlled conditions.
  • field research – data collection in a natural environment outside the lab.
  • Hawthorne effect – behavior changes due to awareness of being observed.
  • hypothesis – testable educated guess about relationships between variables.
  • independent variables – factors that cause change in the dependent variable.
  • interpretive framework – approach seeking in-depth understanding via participant observation.
  • interview – one-on-one conversation used to collect data.
  • literature review – survey of existing research on a topic.
  • nonreactive research – data collection that does not influence the subject’s behavior.
  • operational definitions – concrete criteria used to measure a concept.
  • participant observation – researchers immerse themselves in a group to observe.
  • population – entire group of interest in a study.
  • primary data – data collected firsthand by the researcher.
  • qualitative data – non-numerical, descriptive data.
  • quantitative data – numerical data analyzed statistically.
  • random sample – each member of the population has an equal chance of selection.
  • reliability – consistency of a measure across time and items.
  • samples – a subset of the population representing the whole.
  • scientific method – a structured approach to research involving question, literature, hypothesis, design, data, and conclusions.
  • secondary data analysis – interpreting data collected by others.
  • surveys – data collection from respondents via questions.
  • validity – whether a measure accurately reflects the concept.
  • value neutrality – impartiality in collecting and presenting data.