Chapter 04 - Data Collection Techniques and Research Designs

Chapter Overview

  • Focus on how psychologists apply the scientific method through data collection techniques and research designs.

Data Collection Techniques

  • Measurement of Variables

    • Measuring psychological variables is challenging because many constructs are abstract; it requires careful operational definitions.

    • Operational Definition:

      • Specifies how a variable is measured, making an abstract concept concrete and observable.

      • Examples include:

        • Motivation: specifying how a person’s drive or effort is gauged.

        • Racial Bias/Discrimination: measurable differences in behavior toward people of different races.
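
  • A minimal sketch of an operational definition in code (Python; the construct, the measure, and the data below are hypothetical, not from the chapter): the abstract construct "motivation" is operationalized as minutes of voluntary practice logged per week, yielding a concrete number that can be recorded and compared.

      # Hypothetical operational definition: "motivation" = total minutes of
      # voluntary (unassigned) practice logged per week.
      practice_log_minutes = [30, 0, 45, 20, 0, 60, 15]  # one entry per day (made-up data)

      def motivation_score(daily_minutes):
          """Operationalized motivation: sum of voluntary practice minutes in a week."""
          return sum(daily_minutes)

      print(motivation_score(practice_log_minutes))  # prints 170, a concrete measurement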

Addressing Challenges in Measurement

  • Importance of precise definitions to ensure reliable and valid results.

Validity and Reliability

  • Validity:

    • Assesses whether a study measures what it intends to measure.

    • Types of Validity:

      • Internal Validity:

        • The degree to which a study accurately tests its hypothesis and rules out alternative explanations.

      • External Validity:

        • The degree to which results generalize to real-world behaviors and settings outside the study.

  • Reliability:

    • Consistency of observations and measurements.

Naturalistic Observation

  • A technique that involves observing subjects in their natural environment without interference.

  • Operational Definition in Naturalistic Observation:

    • Develop a coding scheme based on how the variable is defined in context (a brief coding-scheme sketch follows this list).

  • Advantages:

    • Provides realistic insights into behavior.

  • Disadvantages:

    • Remaining unobtrusive is difficult; the observer’s presence can alter subjects’ behavior and reduce the accuracy of observations.
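
  • A minimal sketch of a coding scheme for naturalistic observation (Python; the behavior codes and observation records are invented for illustration): each predefined code names a concrete behavior, and the observer tallies how often each code occurs.

      from collections import Counter

      # Hypothetical coding scheme for "helping behavior" observed on a playground.
      CODING_SCHEME = {
          "S": "shares a toy or materials",
          "C": "comforts another child",
          "A": "offers physical assistance",
          "N": "no helping behavior in the interval",
      }

      # Codes recorded during 10 one-minute observation intervals (made-up data).
      observed_codes = ["N", "S", "N", "C", "S", "N", "A", "N", "S", "N"]

      tallies = Counter(observed_codes)
      for code, label in CODING_SCHEME.items():
          print(f"{code} ({label}): {tallies.get(code, 0)}")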

Surveys and Questionnaires

  • Purpose:

    • Measure moods, attitudes, and behaviors through self-reports.

  • Types of Questions:

    • Closed-Ended Questions:

      • Provide quantitative data and limited response options.

    • Example:

      • The PCL-5 survey for post-traumatic stress assessment, which uses fixed scale ratings (a scoring sketch follows this list).

    • Open-Ended Questions:

      • Allow for qualitative responses; require coding for analysis.

      • Example questions include those about emotions and personal heroes.

  • Potential Issues:

    • Data may not reflect actual behaviors due to social desirability bias and self-report inaccuracies.
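
  • A minimal sketch of scoring a closed-ended, scale-rated survey (Python; the responses are invented, and the simple summing shown here is a generic illustration rather than an official scoring protocol for any instrument).

      # Made-up responses to a 20-item checklist, each item rated on a
      # 0 ("not at all") to 4 ("extremely") scale, similar in format to the PCL-5.
      responses = [0, 1, 2, 1, 0, 3, 2, 1, 0, 0, 1, 2, 3, 1, 0, 1, 2, 0, 1, 1]

      assert len(responses) == 20 and all(0 <= r <= 4 for r in responses)

      total_score = sum(responses)            # higher totals = more self-reported symptoms
      mean_item_rating = total_score / len(responses)

      print(f"Total: {total_score}, mean item rating: {mean_item_rating:.2f}")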

Interviews

  • Types of Interviews:

    • Structured and semi-structured formats for gathering oral responses.

  • Advantages:

    • Flexibility and depth of data collection.

  • Disadvantages:

    • Time-consuming and potential for interviewer bias.

  • Focus Group Interviews:

    • Conducted in groups to gather diverse views more efficiently.

Systematic Observation

  • Definition:

    • Carefully observing and recording specific behaviors as they occur, typically with a structured coding scheme; the added structure can enhance internal validity.

Archival Data and Content Analysis

  • Archival Data:

    • Uses existing records, with no control over the original measures or context; can reveal trends over time (e.g., changes in song lyrics across decades).

  • Content Analysis:

    • Analyzes written or spoken material for patterns, such as language use or gender differences in communication, without direct interaction with participants.
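
  • A minimal content-analysis sketch (Python; the lyric excerpts and word categories are invented for illustration): words from predefined coding categories are counted in archival text, which is how trends such as shifts in song-lyric themes can be quantified.

      import re
      from collections import Counter

      # Hypothetical archival material: lyric excerpts from two decades (invented text).
      lyrics_1990s = "love you love her heart dance night love"
      lyrics_2010s = "money fame love night money night money"

      # Predefined coding categories (chosen purely for illustration).
      CATEGORIES = {
          "romance": {"love", "heart"},
          "materialism": {"money", "fame"},
      }

      def category_counts(text):
          words = Counter(re.findall(r"[a-z']+", text.lower()))
          return {cat: sum(words[w] for w in vocab) for cat, vocab in CATEGORIES.items()}

      print("1990s:", category_counts(lyrics_1990s))
      print("2010s:", category_counts(lyrics_2010s))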

Validity and Reliability of Surveys

  • Construct Validity:

    • The degree to which a survey actually measures the construct it is intended to measure.

  • Reliability Analysis:

    • Consistency is evaluated through test-retest reliability and internal consistency (e.g., Cronbach’s alpha).
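
  • A minimal sketch of both reliability checks (Python; the scores are invented, but the Pearson correlation and Cronbach’s alpha formulas are standard): test-retest reliability correlates the same respondents’ scores across two administrations, and internal consistency asks whether items on the same survey hang together.

      from statistics import mean, variance

      # Test-retest reliability: same respondents, two administrations (made-up scores).
      time1 = [22, 30, 15, 28, 19, 25]
      time2 = [24, 29, 14, 30, 18, 27]

      def pearson_r(x, y):
          mx, my = mean(x), mean(y)
          cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
          return cov / (sum((a - mx) ** 2 for a in x) ** 0.5 *
                        sum((b - my) ** 2 for b in y) ** 0.5)

      print(f"Test-retest r = {pearson_r(time1, time2):.2f}")

      # Internal consistency: Cronbach's alpha for 5 respondents x 4 items (made-up ratings).
      item_scores = [
          [3, 4, 3, 4],
          [2, 2, 3, 2],
          [4, 4, 4, 5],
          [1, 2, 1, 2],
          [3, 3, 4, 3],
      ]

      def cronbach_alpha(rows):
          k = len(rows[0])                                   # number of items
          item_vars = [variance([r[i] for r in rows]) for i in range(k)]
          total_var = variance([sum(r) for r in rows])
          return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

      print(f"Cronbach's alpha = {cronbach_alpha(item_scores):.2f}")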

Research Design Types

  1. Case Studies

    • In-depth exploration of rare behaviors with 1-3 participants; findings may not generalize.

  2. Correlational Studies

    • Assess the relationship between two or more measured variables; correlation does not imply causation (see the sketch after this list).

  3. Experiments

    • Investigate cause-effect relationships by manipulating independent variables and observing effects on dependent variables.

  4. Quasi-Experiments

    • Used when the IV cannot be manipulated; groups are based on existing characteristics rather than random assignment.
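
  • A minimal sketch contrasting designs 2 and 3 (Python; all data and variable names are invented): a correlational study measures two variables as they occur and reports an association, while an experiment manipulates the IV through random assignment and then compares the DV across conditions.

      import random
      from statistics import mean

      # Correlational study: measure two variables as they naturally occur (made-up data).
      hours_slept = [6, 7, 5, 8, 6, 7, 9, 5]
      exam_scores = [70, 78, 65, 85, 72, 80, 88, 60]

      def pearson_r(x, y):
          mx, my = mean(x), mean(y)
          cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
          return cov / (sum((a - mx) ** 2 for a in x) ** 0.5 *
                        sum((b - my) ** 2 for b in y) ** 0.5)

      print(f"Correlational design: r = {pearson_r(hours_slept, exam_scores):.2f} (no causal claim)")

      # Experiment: randomly assign participants to levels of the IV, then measure the DV.
      participants = list(range(20))
      random.shuffle(participants)
      treatment, control = participants[:10], participants[10:]
      print(f"Experimental design: {len(treatment)} treatment vs. {len(control)} control")
      # After the manipulation, compare mean DV scores between the two groups.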

Conclusion and Lab Practice

  • Upcoming lab practice will focus on identifying different data collection techniques and research designs.
