CRIM 220 - Lecture 2

Is-Ought distinction and objectivity in social science

  • Social science research focuses on what is (descriptive reality) rather than what ought to be (normative judgments). Reading ahead helps these concepts feel familiar rather than entirely new when they are introduced in lecture.
  • The is-ought distinction: scientific theory cannot settle debates about value or worth. It can address what is by observing and analyzing patterns, not decide what should be done.
  • Value-free inquiry is tied to treating observations as objective patterns, but subjectivity can influence interpretation. A key point: social regularities are probabilistic, and there can be exceptions to any given regularity.
  • Example of a value-free question with policy relevance: deterrence and the death penalty. If we study deterrence, we can ask whether the death penalty reduces crime, but the finding may still be controversial.
  • Death penalty research findings (summary from lecture): in states with a death penalty, people are not necessarily less likely to commit first-degree murder than in states without it. In other words, according to the bulk of evidence discussed, the death penalty is not a reliable deterrent.
  • Observations about groups and risk: certain demographic groups show higher probability of offending, even though exceptions exist for individuals within those groups.
  • Observation vs generalization:
    • Social regularities are probabilistic; a general pattern does not have to hold 100% of the time.
    • You can discuss probability and likelihood rather than certainties for individuals.
    • Example observation: males aged 15–24 are the highest-risk group for offending, but not every individual in that group will offend.
  • Probabilistic patterns vs exceptions: an exceptional case does not invalidate the pattern; it just means the pattern is not universal (the notation below makes this concrete).
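  • In notation (an illustrative add-on, not from the lecture): a probabilistic regularity claims a difference in likelihood, not a universal rule, so members of the highest-risk group who never offend are expected rather than contradictory:

    $$P(\text{offend} \mid \text{male},\ 15 \le \text{age} \le 24) \;>\; P(\text{offend} \mid \text{other groups}) \quad \text{and} \quad P(\text{offend} \mid \text{male},\ 15 \le \text{age} \le 24) \;<\; 1.$$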

Purposes of research in criminal justice

  • Research serves multiple purposes, often more than one within a single study.
  • Two main categories of research:
    • Pure (basic) research
    • Applied research

Pure (basic) research types

  • Exploration: gather data to explore issues that are not well understood yet; goal is to generate understanding of potential causes or mechanisms.
    • Example: exploring factors that might be responsible for a behavior or pattern observed in crime statistics.
  • Description: describe the scope of an issue or problem; often involves counting, documenting, or systematic observations.
    • Example questions: How many gang members are there in the Lower Mainland? How many people engage in gang-related activities without being gang members?
  • Explanation: aim to build causal explanations or mechanisms linking variables; such findings often have direct policy implications.

Applied research types

  • Evaluation: look at outcomes of a specific program or policy to see intended and unintended effects.
  • Policy analysis: predict outcomes of different policies when implemented; involves if-then reasoning about potential futures (e.g., what happens in five, ten, or twenty years if policy X is changed).
  • Applied research continually connects to real-world decisions and policy implementation.

The traditional model of science

  • Three major elements, in this order:
    1) Theory: a set of ideas or explanations about why something happens. Example: aggressive driving can be reduced by increased monitoring.
    2) Operationalization: specify the exact procedures, steps, variables, and measurement methods used to study the phenomenon.
    3) Observation: collect data and measure variables to test the theory.
  • Example workflow: Theory → Operationalization (define and measure aggressive driving) → Observation (collect data on driving behavior under monitoring); a formalized version appears after this list.
  • Important distinctions:
    • Theory: a general explanation or set of concepts and proposed relationships among them.
    • Hypothesis: a testable statement derived from a theory that can be supported or refuted by data.
    • Construct: an abstract concept (e.g., education) that needs to be operationalized into measurable terms.
    • Variable: a concrete representation of a construct (e.g., highest degree earned, years of schooling).
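  • Illustrative formalization of the aggressive-driving example (the specific measure is an assumption, not given in the lecture): let $A$ be an operationalized measure of aggressive driving, e.g., observed aggressive-driving incidents per road segment. The theory (monitoring reduces aggressive driving) then yields a testable hypothesis that observation can support or refute:

    $$H:\ \bar{A}_{\text{monitored}} \;<\; \bar{A}_{\text{unmonitored}},$$

    where $\bar{A}$ denotes the average value of the measure under monitored and unmonitored conditions.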

Constructs, variables, and hypotheses

  • Constructs are abstract concepts used in theories (e.g., education, aggression, social integration).
  • Variables are concrete representations of constructs that can be measured (e.g., number of years of schooling, presence of a license, self-reported aggression score).
  • Hypotheses are testable statements derived from theories; they specify expected relationships between variables.
  • The mapping from theory to hypothesis to data is a core part of scientific inquiry.

Unit of analysis and common pitfalls

  • Unit of analysis vs level of aggregation: statements about groups/regions do not automatically apply to individuals (ecological fallacy). Conversely, making broad inferences about groups from individual cases can be an exception fallacy.
  • Ecological fallacy: drawing conclusions about individuals from group-level data.
  • Exception fallacy: drawing broad conclusions about a group from a small number of exceptional cases.
  • These fallacies highlight the importance of aligning the unit of analysis with the question being asked (a hypothetical numeric illustration follows below).
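  • A self-contained sketch (Python; the trait and all numbers are invented for illustration, not data from the lecture) of how a group-level pattern can coexist with a very different individual-level pattern, which is exactly what the ecological fallacy overlooks:

    # Hypothetical individual records: (region, has_trait_x, offended)
    people = (
        [("A", True,  False)] * 6 + [("A", False, True)] * 4 +   # Region A: 10 people
        [("B", True,  False)] * 2 + [("B", False, True)] * 1 +   # Region B: 10 people
        [("B", False, False)] * 7
    )

    def share(rows, condition):
        """Fraction of rows for which condition(row) is true."""
        rows = list(rows)
        return sum(condition(r) for r in rows) / len(rows)

    # Group level: the region with the larger share of trait X also has the
    # higher crime rate (A: 60% trait X, 40% offended; B: 20% trait X, 10% offended).
    for region in ("A", "B"):
        rows = [p for p in people if p[0] == region]
        print(region, "trait-X share:", share(rows, lambda r: r[1]),
              "crime rate:", share(rows, lambda r: r[2]))

    # Individual level: in this invented data no trait-X holder offends at all,
    # so concluding from the group pattern that trait X makes individuals more
    # likely to offend would be an ecological fallacy.
    print("offending, trait-X holders:    ", share([p for p in people if p[1]], lambda r: r[2]))
    print("offending, non-trait-X holders:", share([p for p in people if not p[1]], lambda r: r[2]))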

Research design: timing and measures

  • Cohort studies and panel (longitudinal) studies involve two or more points in time.
  • Attrition: a major challenge in panel studies where participants drop out over time; can be due to deliberate withdrawal, unavailability, incarceration, or death.
  • Baseline vs post-test measures: collect data before the intervention (baseline) and again afterward, once the intervention (e.g., watching certain movies) has occurred, to assess its impact.
  • Example setup (illustrative): pretest measures of attitudes toward sexual behavior, exposure to different types of movies, followed by post-test measures six months later to assess changes.
  • Data processing and analysis: the lecturer notes that data analysis was not covered in depth in that class, with coverage planned for a later week.

Example application: media exposure and sexual behavior (conceptual study design)

  • Research question (conceptual): Does watching a particular type of movie influence attitudes toward sexual behavior or actual sexual behavior among junior high students?
  • Define a precise construct for sexual abstinence, i.e., decide how inclusive the definition of sexual activity is: do you include all forms of sexual behavior or only intercourse?
  • Define the population scope (e.g., junior high students in Canada or globally).
  • Pretest: assess baseline attitudes toward sexual behavior.
  • Intervention: exposure to specified movies.
  • Post-test (e.g., six months later): re-administer the same attitudes questions to detect changes.
  • Data analysis: examine whether the movie type is related to attitudes or behaviors; this would constitute evidence regarding the hypothesis.
  • Ground rules: this is an illustrative design; actual data analysis techniques would be covered later in the course (a rough sketch of the comparison logic follows below).
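  • A minimal sketch of the comparison logic (Python; variable names and scores are placeholders, and real analysis techniques come later in the course): compare the average pretest-to-post-test attitude change in the exposed group against the comparison group.

    # Hypothetical records: (exposed_to_movie_type, pretest_attitude, posttest_attitude)
    respondents = [
        (True,  3.0, 2.4),
        (True,  4.0, 3.3),
        (False, 3.5, 3.4),
        (False, 2.8, 2.9),
    ]

    def mean_change(rows):
        """Average post-test minus pretest attitude score."""
        changes = [post - pre for _, pre, post in rows]
        return sum(changes) / len(changes)

    exposed   = [r for r in respondents if r[0]]
    unexposed = [r for r in respondents if not r[0]]

    # If attitudes shift more in the exposed group than in the comparison group,
    # that is evidence bearing on the hypothesis (not, by itself, proof of causation).
    print("mean change, exposed:  ", mean_change(exposed))
    print("mean change, unexposed:", mean_change(unexposed))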

Deductive vs inductive (bottom-up) reasoning in research

  • Deductive reasoning (top-down): start with a theory, derive hypotheses, collect data to test them, and see whether data support the theory.
    • Typical flow: Theory → Hypotheses → Data → Support/Refute Theory.
  • Inductive reasoning (bottom-up): begin with observations, develop tentative hypotheses, and ultimately formulate general theories.
    • Typical flow: Observations → Tentative Hypotheses → General Theory.
  • In this course, both approaches are used, but the traditional model emphasizes theory-driven (deductive) testing of hypotheses.

Policy-relevant research and if-then logic

  • In criminology, theory guides basic research, and findings inform policy development.
  • Policy analysis often uses if-then statements: If policy X is implemented, then outcome Y is expected to occur.
    • Example: If a crime prevention program is implemented, then student crime rates will decrease.
  • A policy or program is often framed as a hypothesis about future outcomes that can be tested with data.

From theory to measurement: constructs, variables, and hypotheses revisited

  • Constructs (abstract concepts) become variables when operationalized for measurement.
  • Hypotheses are testable statements linking variables in a way that allows empirical testing.
  • Examples to distinguish:
    • Construct: education level.
    • Variable: highest degree earned (e.g., high school diploma, bachelor’s, master’s, etc.).
  • Practical notes for exams/papers: be sure you can identify the construct, the measurement (variable), and how you would test the hypothesis with data.

Common questions and quick references for exams

  • What is a theory? A systematic explanation for observed facts and laws related to a particular aspect of life; a set of concepts and proposed relationships among those concepts.
  • What is a hypothesis? A testable statement derived from a theory about the relationship between variables.
  • What is a construct vs a variable? A construct is an abstract idea; a variable is a measurable representation of that construct.
  • What is the ecological fallacy? Inferring individual-level conclusions from group-level data.
  • What is the exception fallacy? Inferring group-level conclusions from a few exceptional cases.
  • What are the two main categories of research? Pure (basic) research and applied research.
  • What are the two main forms of applied research? Evaluation and policy analysis.
  • What are the three elements of the traditional model of science? Theory, operationalization, observation.
  • What is attrition in panel studies? The loss of participants over time, which can bias results if not properly handled.
  • How do deductive and inductive reasoning differ? Deductive starts with theory and tests hypotheses; inductive starts with observations and builds theory.

Quick glossary (terms you should know)

  • Is-ought distinction: separation between descriptive statements about what is and normative judgments about what ought to be.

  • Objectivity: attempting to minimize bias in observations and interpretations.

  • Probabilistic regularities: patterns that describe likelihoods rather than certainties.

  • Construct: abstract concept used in theory.

  • Variable: observable, measurable representation of a construct.

  • Hypothesis: testable statement derived from theory.

  • Ecological fallacy: error in reasoning from group data to individual inferences.

  • Exception fallacy: drawing broad conclusions from a few exceptions.

  • Attrition: loss of participants over time in longitudinal studies.

  • Cohort study: a longitudinal study that follows a group defined by a shared characteristic (e.g., birth year) over time.

  • Panel study: a longitudinal study that follows the same individuals over time.

  • Evaluation: assessing the outcomes of a program or policy.

  • Policy analysis: predicting outcomes of policy changes and guiding decision-making.

  • Deductive reasoning: theory → hypotheses → data → conclusion.

  • Inductive reasoning: observations → hypotheses → theory.

  • Connections to real-world relevance: The material connects theory to policy implications in criminal justice (e.g., deterrence, death penalty) and emphasizes that research informs decision-makers while remaining grounded in empirical evidence.

Note: mathematical expressions are presented in LaTeX where numerical or algebraic elements were referenced. For example:

  • Highest-risk group: males with $15 \le \text{age} \le 24$.
  • If-then policy expectation: $P(\text{crime} \mid \text{policy}) < P(\text{crime} \mid \neg\text{policy})$.
  • Age and probability concepts are treated as probabilistic estimates rather than certainties in individual cases.