Policy Analysis: Evidence Notes

Research and Policy Analysis: What counts as policy analysis?

  • Policy analysis is a form of applied research focused on informing policy decisions

  • Key forms of research include:

    • Literature reviews: synthesize existing knowledge

    • Interviews and focus groups: gather in-depth qualitative insights

    • Public events and site visits: observe real-world contexts

    • Data sets: analyze quantitative information

    • Surveys: collect systematic responses

    • Case studies and “best practices”: examine concrete examples and transferable lessons

Searching for Past “Solutions” (Bardach’s approach)

  • Strategies to improve searches for prior literature:

    • Survey best practices from related policy areas to identify transferable ideas

    • Find, summarize, and explain how similar ideas could apply to your problem

    • Use analogies to bridge different contexts

    • Engage potential critics or policy opponents to anticipate objections and refine arguments

Assembling Evidence: Big Picture

  • Goals when assembling evidence:

    • Find, interpret, criticize, and synthesize existing evidence

    • Use existing evidence as a springboard to:

      • Briefly summarize what’s been tried before

      • Introduce new policy alternatives aimed at addressing the problem

      • Project likely outcomes of those alternatives against your criteria

What is meant by evidence-based?

  • Evidence-based policy relies on:

    • Scientific findings validated through previously conducted, documented research, not anecdotal evidence

    • Scientific evidence is collected to make informed decisions about a policy, program, or practice to address a social problem

Locating Relevant Sources: Bardach & Patashnik’s guiding ideas

  • Advice on where to look and how information flows:

    • People lead to people

    • People lead to documents

    • Documents lead to documents

    • Documents lead to people

    • Don’t be afraid to talk with knowledgeable people about the public problem you’re studying

Places to Start

  • APPAM: Association for Public Policy Analysis and Management

    • Premier professional organization for policy analysts, economists, sociologists, and other researchers whose work has policy implications

    • Website: https://www.appam.org/

  • Top journals: Journal of Policy Analysis and Management (JPAM)

    • Website: https://www.appam.org/publications/jpam/ or https://onlinelibrary.wiley.com/journal/15206688

  • Other sources:

    • Good field journals by policy area

    • Research reports by think-tanks and government agencies

JPAM Search Example: Food Insecurity and SNAP

  • JPAM search results example:

    • Food insecurity: 69 articles & chapters

    • SNAP: 115 articles & chapters

  • Reflective questions:

    • What other filters might be important to use?

    • What should you prioritize in your search?

Awareness of Biases in Searching for Evidence

  • Important bias to recognize: political bias

    • Be aware of how your own views might shape where/how you search for evidence on past policies or related problems

    • When brainstorming policy alternatives for your policy brief, strive for a range of options based on the evidence rather than personal opinion

Interpretation of Results: Typical positions

  • Centrists: favor selective government intervention and practical solutions; open to new issues; government as a check on excessive liberty

  • Libertarians: self-governance in personal and economic matters; government’s purpose is to protect people from coercion and violence; value individual responsibility and tolerate diversity

  • Left-Liberals: prefer self-government in personal matters; central decision-making on economics; government to serve the disadvantaged for fairness

  • Leftists: tolerate social diversity; seek economic equality

  • Right-conservatives: self-government on economic issues; prefer official standards in personal matters; want government to defend community morality

  • Authoritarians: support expert central planning to advance society and individuals; skeptical of full self-government

  • Left-authoritarians (socialists) vs. Right-authoritarians (fascists): different ends of the authoritarian spectrum

Causal/Explanatory Research: Core idea

  • Explanatory (causal) research aims to identify cause-and-effect relationships: x → y

  • Critical pieces of causality:

    • Temporal sequence: appropriate causal order of events

    • Non-zero correlation: two phenomena vary together

    • Nonspurious association: absence of alternative, plausible explanations

Example: Temporal sequence and causation

  • Example structure:

    • Independent variable: Policing measures (e.g., X)

    • Dependent variable: Arrests (e.g., Y)

    • Temporal sequencing: an increase in policing leads to an increase in arrests, illustrating a cause-effect chain

  • Visualization of a basic causal chain:

    • X → Y, with temporal order established

Correlation vs. Causation: Types

  • Zero correlation: no systematic relationship between variables

  • Positive correlation: as one variable increases, the other tends to increase

    • Example: policing intensity and number of arrests tend to rise together (positive trend)

  • Negative correlation: as one variable increases, the other tends to decrease

    • Example: as policing intensity rises, arrest rates may fall (e.g., due to deterrence or displacement; illustrative)
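The three correlation types above can be demonstrated with simulated data; the variables and coefficients below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Positive correlation: arrests rise with policing intensity (illustrative).
policing = rng.normal(size=n)
arrests_pos = 2.0 * policing + rng.normal(size=n)

# Negative correlation: arrests fall as policing rises (illustrative).
arrests_neg = -2.0 * policing + rng.normal(size=n)

# Zero correlation: a variable generated independently of policing.
unrelated = rng.normal(size=n)

r_pos = np.corrcoef(policing, arrests_pos)[0, 1]
r_neg = np.corrcoef(policing, arrests_neg)[0, 1]
r_zero = np.corrcoef(policing, unrelated)[0, 1]

print(f"positive: {r_pos:+.2f}, negative: {r_neg:+.2f}, near-zero: {r_zero:+.2f}")
```

The sign of the Pearson coefficient identifies the correlation type; a value near zero indicates no systematic linear relationship.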

Non-Spurious Association: Illusion vs. reality

  • Spurious example: Ice cream sales and drowning incidents may correlate due to a lurking factor (season) – not a causal link

  • Key idea: Probe for alternative explanations and confounders to establish a true causal link
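The ice cream/drowning example can be simulated to show how a lurking variable manufactures a correlation that disappears once the confounder is controlled for. All numbers here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Lurking variable: daily temperature (the "season" confounder).
temperature = rng.normal(25, 5, size=n)

# Both outcomes are driven by temperature, not by each other.
ice_cream_sales = 10 * temperature + rng.normal(0, 20, size=n)
drownings = 0.5 * temperature + rng.normal(0, 2, size=n)

# The raw correlation looks strong...
raw_r = np.corrcoef(ice_cream_sales, drownings)[0, 1]

# ...but vanishes after removing each variable's linear dependence
# on temperature and correlating the residuals.
def residualize(y, x):
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

partial_r = np.corrcoef(
    residualize(ice_cream_sales, temperature),
    residualize(drownings, temperature),
)[0, 1]

print(f"raw r = {raw_r:.2f}, r controlling for temperature = {partial_r:.2f}")
```

The residual (partial) correlation near zero is the signature of a spurious association: the observed relation is fully explained by the confounder.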

Purpose of Research: Explanatory vs Descriptive

  • Explanatory (causal) research vs Descriptive research

    • Descriptive: answers what, where, when, and how

    • Explanatory: answers why; important for policy because understanding causes informs effective interventions

Experimental Design: How to establish causality

  • Experimental group vs. control group

    • Experimental group: receives treatment (stimulus)

    • Control group: does not receive treatment

    • Outcome measure: compare dependent variable across groups

    • Quantitative comparison: E[Y|T=1] - E[Y|T=0] (difference-in-outcomes)

  • The experimental process aims to isolate the effect of the treatment on outcomes
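The difference-in-outcomes comparison above can be sketched with simulated data from a hypothetical randomized experiment; the effect size and noise level are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Random assignment: roughly half treated, half control.
treated = rng.integers(0, 2, size=n).astype(bool)

# Hypothetical outcome with a true treatment effect of +3 plus noise.
true_effect = 3.0
y = 10.0 + true_effect * treated + rng.normal(0, 2, size=n)

# Difference-in-outcomes estimator: E[Y|T=1] - E[Y|T=0]
ate_hat = y[treated].mean() - y[~treated].mean()

print(f"estimated treatment effect: {ate_hat:.2f} (true: {true_effect})")
```

Because assignment is random, the two groups are comparable on average, so the simple difference in group means recovers the treatment effect.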

Ethics and Experiments: Key considerations

  • Deception is common in some experiments but raises ethical concerns

  • Debriefing: inform participants afterward about the study’s true purpose and help restore them to their pre-study state

  • Potential harms: experiments can cause physical or psychological damage; must minimize risk

Ethical Safeguards in Experimental Research

  • Ethical checks for participants:

    • Informed consent: subject’s voluntary participation

    • Assessment of potential harm: physical/psychological trauma risks

    • Ability to restore baseline state post-participation

Validity Issues in Experimental Research

  • Internal validity: did the treatment cause the observed changes, or were there other factors?

    • Common threats: confounding variables, selection bias, measurement error

  • External validity: can results generalize to real-world settings beyond the study sample?

  • Validity (measurement validity): does the instrument measure what it intends to measure?

Natural Experiments: An alternative approach

  • Natural experiments use naturally occurring events to approximate random assignment

    • Treatment dictated by external forces or events outside participants’ control

    • Trade-off: can approach the internal validity of a randomized experiment, but bias remains if assignment is not truly random

  • Not a panacea: external validity concerns persist, and not all natural experiments are truly random
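Natural experiments like those discussed below are often analyzed with a difference-in-differences comparison: the change in a comparison group proxies the trend the treated group would have followed absent the event. The numbers here are invented for illustration.

```python
# Hypothetical difference-in-differences for a natural experiment:
# an external event "treats" one group; another group serves as a comparison.
treat_pre, treat_post = 20.0, 21.0   # treated group's mean outcome before/after
ctrl_pre, ctrl_post = 18.0, 20.5     # comparison group's mean outcome before/after

# Subtracting the comparison group's change removes the shared trend
# (this relies on the "parallel trends" assumption).
did = (treat_post - treat_pre) - (ctrl_post - ctrl_pre)
print(f"difference-in-differences estimate: {did:+.1f}")
```

Here the treated group improved by 1.0, but the comparison group improved by 2.5, so the event's estimated effect is negative despite the raw gain.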

Case Examples of Natural Experiments (Illustrative Discourse in Literature)

  • Natural experiments in the aftermath of disasters and trauma provide quasi-experimental settings to study policy impacts:

    • The September 11, 2001 attacks and alcohol consumption/distress among residents far from the epicenter

    • Katrina hurricane impacts on income, employment, and geographic mobility

    • Household finance after natural disasters: credit, debt, and mortgage behavior

    • The Mariel Boatlift (1980): Miami labor market response to a large immigrant influx

    • Mortality shocks and fertility responses after disasters (e.g., tsunamis)

    • Card’s (1990) Mariel Boatlift study and Card & Krueger’s (1994) minimum-wage study: classic natural-experiment analyses of labor markets and policy changes

Selected Readings and Contextual References (typical examples in the field)

  • Perrine et al. (American Economic Journal: Applied Economics, 2018) – The impact of a national trauma on alcohol consumption and distress

  • Deryugina, Kawano, and Levitt – The economic impact of Hurricane Katrina on victims; long-term income and relocation effects

  • Gallagher and Hartley – Household finance after natural disasters; debt dynamics and mortgage behavior

  • Card (1990) – The Mariel Boatlift and labor market adjustments

  • Card & Krueger (1994) – Minimum wages and employment in fast food (example of natural-experiment methodology in economics)

Are Natural Experiments the Cure-All? (Limitations)

  • Not truly random in many cases; potential for bias remains

  • External validity concerns: results from extreme or unusual events may not generalize to typical policy settings

  • Trade-off reality: internal validity (causal inference) vs. external validity (generalizability)

  • Final takeaway: natural experiments are valuable but not a universal solution; use judiciously and transparently discuss limitations


Connections to Foundational Principles and Real-World Relevance

  • Evidence-based policy relies on systematic gathering and critical appraisal of existing research, not anecdotes

  • Triangulation across multiple evidence sources (literature, data, expert input, case studies) strengthens policy recommendations

  • Ethical considerations are central to research design, especially when human subjects are involved

  • Understanding causal relationships helps design effective interventions rather than merely describing associations

  • Real-world relevance: the material uses concrete examples (disasters, policy changes, immigration shocks) to illustrate how evidence-based methods inform policy decisions


Practical Implications for Your Policy Brief

  • Plan your evidence search with Bardach’s approach: start with related fields, build analogies, and test ideas against potential critics

  • Clearly distinguish descriptive findings from causal inferences; label limitations and assumptions

  • Use natural experiments where appropriate to infer causal effects, but acknowledge external validity constraints

  • Present a range of policy alternatives backed by evidence, not opinion, and assess them against clear criteria (efficacy, equity, feasibility, cost)

  • Be mindful of biases in search and interpretation; document your search strategies and selection criteria


Notation and Quick Recap (LaTeX-friendly)

  • Causal relation: X \rightarrow Y

  • Difference-in-outcomes (treatment effect): E[Y|T=1] - E[Y|T=0]

  • Temporal sequence: ensure X\text{ occurs before }Y

  • Non-zero correlation: \rho_{XY} \neq 0

  • Nonspurious association: no confounding variable Z that explains the observed relation

  • Internal validity: focus on whether the observed effect is truly due to the treatment

  • External validity: applicability of results to other settings

  • Measurement validity: the instrument measures the intended concept
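
The recap above can be collected into a single LaTeX display; the \widehat{\tau} symbol for the estimated treatment effect is my notation, not the source’s.

```latex
% Three conditions for a causal claim X -> Y, plus the experimental estimator
\begin{align*}
  &\text{Temporal sequence:}       && X \text{ occurs before } Y \\
  &\text{Non-zero correlation:}    && \rho_{XY} \neq 0 \\
  &\text{Nonspurious association:} && \text{no confounder } Z \text{ with } Z \to X \text{ and } Z \to Y \\
  &\text{Treatment effect (RCT):}  && \widehat{\tau} = \mathbb{E}[Y \mid T = 1] - \mathbb{E}[Y \mid T = 0]
\end{align*}
```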