Way of working in scientific research
Theory, Data Collection, Data Accumulation
Theory
Logical explanation or prediction
Data Collection
Observation in a systematic way
Data Accumulation
Comparing what is logically expected with what is actually observed
Principle of Falsification (Karl Popper)
All knowledge is uncertain, explanations are true until they're refuted, science is self-correcting, helps avoid confirmation bias
Confirmation bias
Scientist may overlook contradictory facts to what they believe
Spurious relationships
Not every correlation reflects a real causal relationship
Cycle of Research
Problem, observation, inductive research, axiom, theory, hypothesis, deductive research, testing
Inductive research (bottom-up)
Used when little is known, researchers observe real-world patterns, then derive general theories from them, from specific observations to broader theories
Deductive research (top down)
Used when theories already exist, researchers formulate hypotheses based on existing models and test them with empirical data
Scientific Theory
Interconnected, coherent system of premises which aim to describe, explain or predict certain phenomena
3 elements of theory and variables
Assumptions, model, hypotheses
Assumptions
Basic ideas about the nature of mankind
Model
Variables and relations
Hypothesis
Testable prediction, predicts under which conditions the independent variable x will have a directed effect on the dependent variable Y
Two elements of problem definition
(Research) aim, (research) question
Hierarchy of Research Aims
Exploration, Descriptive, Explanatory, Testing, Diagnosing, Design or prescriptive research, Evaluation
Exploration
What happens in case x, how do people experience this event
Descriptive
What characteristics does phenomenon A have, how did this event develop over time
Explanatory
Why did this happen, under which conditions will event z occur more often
Testing
Did phenomenon x occur more often since date y
Diagnosing
Used to identify what went wrong or which factors contributed to success or failure
Design or prescriptive research
Used to develop a solution, proposal or improvement for a practical problem
Evaluation
Used to assess whether a policy, intervention or change has worked and how it should be judged
Research aim and main question
Main question must match the research aim, a descriptive aim needs a descriptive question, an explanatory aim needs an explanatory question
Common mistakes in main question
Doesn't fit the research aim, asks too many things at once, too vague or imprecise
Operationalisation
Transition from theory to empirical research
Process of Operationalisation
Definition, choosing indicators, determining the values of the indicators
Levels of measurement
Nominal, ordinal, ratio, interval
Nominal measurement
No ranking possible, nationality
Ordinal measurement
Values increase but not at equal intervals, Likert scale
Ratio Measurement
Values increase at equal intervals with a reference point, age
Interval measurement
Values increase at equal intervals without a reference point, temperature
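The four levels of measurement can be sketched in code; the data below are made-up illustrations, and the point is which statistic is meaningful at each level:

```python
# Hypothetical examples of the four measurement levels (illustrative sketch).
nominal  = ["Dutch", "German", "Dutch"]   # categories only: count, compare equality
ordinal  = [1, 3, 5]                      # Likert scores: ranking, but unequal intervals
interval = [20.0, 25.0, 30.0]             # temperature in C: equal intervals, no true zero
ratio    = [18, 36, 54]                   # age: equal intervals with a true zero point

# Meaningful statistics differ per level:
mode_nominal   = max(set(nominal), key=nominal.count)   # mode is fine for nominal data
median_ordinal = sorted(ordinal)[len(ordinal) // 2]     # median needs at least ordinal
mean_interval  = sum(interval) / len(interval)          # mean needs at least interval
age_ratio      = ratio[1] / ratio[0]                    # "twice as old" needs ratio level
```

Ratios like "twice as warm" are not meaningful for interval data (temperature in Celsius), because the zero point is arbitrary.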
Reliability
Consistency
Validity
Accuracy
Index
A sum of logically related measurements
Scale
Collection of items designed to measure a single construct while ensuring internal consistency
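The difference between an index and a scale can be sketched numerically: an index is a simple sum of related items, while a scale also requires checking internal consistency, commonly with Cronbach's alpha. The item scores below are invented for illustration:

```python
import statistics

# Made-up Likert scores: three items per respondent
respondents = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [3, 3, 2],
]

# Index: a sum of logically related measurements
index_scores = [sum(items) for items in respondents]

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(rows[0])
    items = list(zip(*rows))                               # transpose: one tuple per item
    item_vars = sum(statistics.variance(it) for it in items)
    total_var = statistics.variance([sum(r) for r in rows])
    return k / (k - 1) * (1 - item_vars / total_var)

alpha = cronbach_alpha(respondents)
```

A rule of thumb often used is that alpha above roughly 0.7 suggests acceptable internal consistency for a scale.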
Sampling
Selection of a sample n from the population N for which your results need to be valid, can be done purposefully or at random
Stratified sampling
Based on proportion of characteristics in the population
Stepwise sampling
Select in different steps
Quota sampling
Fixed numbers or percentages
Snowball Sampling
Recruitment by asking previous respondents for names of potential new candidates
Self-selection
Units decide for themselves to be included or not
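Stratified sampling, the first of the methods above, can be sketched as drawing from each stratum in proportion to its share of the population N. The strata names and sizes here are hypothetical:

```python
import random

# Hypothetical population of N = 1000 units in three strata
population = {
    "bachelor": list(range(600)),   # 60% of the population
    "master":   list(range(300)),   # 30%
    "phd":      list(range(100)),   # 10%
}

def stratified_sample(strata, n):
    """Draw a sample of size n with proportional allocation per stratum."""
    N = sum(len(units) for units in strata.values())
    sample = {}
    for name, units in strata.items():
        share = round(n * len(units) / N)        # stratum's proportional share of n
        sample[name] = random.sample(units, share)
    return sample

random.seed(0)
s = stratified_sample(population, 50)
```

Quota sampling would fix these per-stratum numbers in advance without random selection; snowball and self-selection give up the proportionality entirely.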
Pre-test post-test
2 measurements, before and after treatment
Cohort
One homogenous group is measured at different moments in time
Panel
Fixed group is measured at different intervals
Trend
Measurement at different times but with different groups at the time of measurement
Cross sectional
One measurement with groups from different ages to simulate time effects
8 elements of research design
Problem Definition, theory, operationalisation, choice of strategy, sampling framework, validity and reliability, processing and analysis of data, reporting findings
Strategy
Overall set-up or design
Technique
Analysis of data, collected with a method
Method
Collection of data, within a strategy
Characteristics of Experiments
Controlled setting, small number of variables, strong focus on causality, mostly quantitative data
Survey
Suited for large-scale studies, standardised measurements, mostly quantitative data, strong interest in statistical generalisation
Case Study
Focus on one case or a small number of cases in real world context, small number of units of analysis, large number of relevant variables
Desk Research
Uses existing data rather than collecting new empirical material, efficient and cost-effective
Biggest problem of desk research
Fitting existing data with the research question
Mixed method design
Combines qualitative and quantitative data collection and analysis within one broad study, provides breadth and depth on the same topic
Key elements of an experiment
Two groups (experimental and control group), randomisation, controlled environment, treatment, pre-test and post-test
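The key elements above can be sketched as a toy simulation: random assignment to an experimental and a control group, then comparing pre-test/post-test change between the groups. All scores are invented for illustration:

```python
import random

participants = list(range(10))
random.seed(42)
random.shuffle(participants)                       # randomisation
experimental, control = participants[:5], participants[5:]

pre  = {p: 50 for p in participants}               # pre-test: identical baseline scores
post = {p: 60 if p in experimental else 52         # post-test: treatment raises scores,
        for p in participants}                     # control drifts slightly

def mean_change(group):
    return sum(post[p] - pre[p] for p in group) / len(group)

# Treatment effect estimated as the difference in mean change between groups
effect = mean_change(experimental) - mean_change(control)
```

Comparing the *change* across randomised groups is what rules out alternative explanations such as maturation or events happening to everyone.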
Quasi experiment
When any of the key elements of an experiment are missing
Main benefits of classic experiment
Establishes causality, rules out other explanations, but limited number of variables
Experiments are best suited for
Hypothesis testing, theory development, limited number of variables
Types of experiment
Lab experiment, Artefactual experiment, survey experiment, field experiment, natural experiment
Lab experiments
Highly controlled setting, strong control yet feels artificial
Artefactual experiment
Lab-like but more realistic participants or setting
Survey experiment
Treatment built into a questionnaire
Field experiment
Real-world setting, less control but increased realism
Natural experiment
No direct creation of a treatment or experimental stimulus
Methodological consideration
Strong on internal validity, less on ecological validity
Risks methodological consideration
Experimenter effects, observer bias, subject bias (Hawthorne effect)
Observation
Technique often used within experimental and simulation settings
Options 1 observation
Hidden (invisible for units), participating (visible and interaction), open (visible, no interaction)
Options 2 observation
Unstructured (inductive), structured (schematic, deductive)
Advantages observations
Shows real behaviour, shows non-verbal behaviour
Risks observation
Selectivity (scheme), subjectivity (going native), interference/presence in the situation (validity issues)
Survey
Research strategy aimed at systematically collecting standardised information from a large number of units, usually via questionnaires
Characteristics of the survey strategy
Large-scale study, often with random sampling enabling generalisation to the population, standardised measurement, suited for opinions and attitudes
Questionnaire
Structured list of mostly closed questions presented in a particular order, with standardised answering options, can be written or oral
Development of Questionnaires
Generation of a pool of items, pilot testing, administering the questionnaire to respondents, data inspection and analysis, reporting results
Items
Questions or propositions based on theory and operationalisation of variables
Good items
Unambiguous, clear, not suggestive or leading, no jargon, no double negatives
Threats of validity in questionnaires
Social desirability, non response, answer tendencies
Case study
Study of one or more cases in real life
Methodological consideration of case study
Researcher has no control over the situation, low number of units of study, high number of variables
When to do a case study
Induction, as a pilot for a larger study, new phenomena, in situations with a low number of units, aim to describe and understand, as part of a mixed method design
Selection of cases
Preferably based on theoretical grounds and a combination of choices
Methodological considerations in case studies
Selection of cases is crucial, triangulation strengthens reliability, subjective and bias risks
Different ways of working Qualitative research
Ethnography, thick description, grounded theory, participatory action research, sometimes an emancipatory aim
Open (qualitative) interview
Only the first question is fixed
Semi-structured interview
Topic list with questions/topics to structure the interviews but all open questions
Structured interview
Like an oral questionnaire, often with closed questions
Focus groups
6-12 respondents, interviewed by the moderator, open or semi structured, group dynamics matter
Delphi Study
Experts as respondents, multiple rounds of semi-structured questions and reports, aimed towards consensus
Benefits of desk research
Efficient (reuse material and data), validity (unobtrusive research, no interference in empirical reality)
Risks of desk research
Systematic way of working is paramount, operationalisation problem
Methods for desk analysis
Content analysis, secondary analysis, meta analysis
Content analysis
Purpose: to reconstruct facts, opinions and discourses, can be done quantitatively or qualitatively
Secondary analysis
New statistical analysis of existing data, collected by other researchers or research institutes
Meta analysis
Collection of empirical data from previous research, enabling to study effects and questions on a larger scale than before
Data
Empirical material on which the research conclusions are based