Qualitative Data Analysis
It is the process of organizing, understanding, and interpreting non-numerical data (like interviews or written responses).
Qualitative Data Analysis
The goal is to find patterns, meanings, and themes from what participants said or did.
Interview transcripts
Focus group discussions
Observation notes
Open-ended survey answers
Journals or diaries
Social media posts or pictures
TYPES OF QUALITATIVE DATA
Content Analysis
is a systematic and objective approach to analyzing data by categorizing, coding, and quantifying specific words, themes, or concepts within a text.
Narrative Analysis
focuses on interpreting and understanding the stories and personal narratives shared by individuals.
Narrative Analysis
researchers analyze the structure, content, and meaning of these narratives to gain insights into how individuals make sense of their experiences, construct identities, and communicate their perspectives.
Discourse Analysis
examines the social, cultural, and power relations that shape language use in different contexts.
Discourse Analysis
It focuses on the ways in which language constructs and reflects social reality, identities, and ideologies.
Grounded Theory
is an approach to qualitative analysis that aims to develop theories and concepts grounded in data.
Grounded Theory
It involves iterative data collection and analysis to develop an inductive theory that emerges from the unstructured data itself.
Thematic Analysis
is a common qualitative research method that involves identifying and analyzing patterns or themes.
Thematic Analysis
It's a way to organize large volumes of text-based information into a coding framework of groups or themes.
1. Familiarization
2. Coding
3. Categorizing
4. Interpretation
STEPS IN QUALITATIVE DATA ANALYSIS
Coding
Example:
"I feel pressured because of my workload."
Codes: pressure, stress, workload
Categorizing
Example:
pressure + deadlines → "Academic demands"
anxiety + sleeplessness → "Emotional strain"
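The coding and categorizing steps above can be sketched in Python. The extra responses, codes, and category names below are invented for illustration and are not part of any real study.

```python
# Invented responses mapped to invented codes (coding step).
responses = {
    "I feel pressured because of my workload.": ["pressure", "stress", "workload"],
    "Deadlines keep piling up.": ["pressure", "deadlines"],
    "I can't sleep before exams.": ["anxiety", "sleeplessness"],
}

# Categories (themes) group related codes together (categorizing step).
categories = {
    "Academic demands": {"pressure", "deadlines", "workload"},
    "Emotional strain": {"anxiety", "sleeplessness", "stress"},
}

# Tag each response with every category that shares a code with it.
tagged = {
    text: sorted(name for name, members in categories.items()
                 if members & set(codes))
    for text, codes in responses.items()
}

for text, themes in tagged.items():
    print(text, "->", themes)
```

Grouping by shared codes like this is only a mechanical aid; deciding which codes belong together is still the researcher's interpretive work.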
Reminders
- Ensure Trustworthiness
- Reflexivity
RELIABILITY
Ability of the test to give CONSISTENT RESULTS
INCONSISTENT
NOT RELIABLE
FIRST TRY: 50 KG
SECOND TRY: 40 KG
THIRD TRY: 60 KG
NOT RELIABLE
IQ TEST:
FIRST ADMINISTRATION: AVERAGE
SECOND ADMINISTRATION: BELOW AVERAGE
THIRD ADMINISTRATION: ABOVE AVERAGE
Test-Retest Reliability
Alternate Form Reliability
Internal Consistency
Inter-Rater Reliability
METHODS IN OBTAINING RELIABILITY
-Split Half Reliability
-Cronbach's Alpha
-Kuder-Richardson Method (KR 20)
Internal Consistency
TEST-RETEST RELIABILITY
Used to evaluate the error associated with administering a test at two different times.
TEST-RETEST RELIABILITY
This type of analysis is of value only when we measure "TRAITS" or characteristics that do not change over time.
ALTERNATE-FORM RELIABILITY
Compares two equivalent forms of a test that measure the same attribute.
INTERNAL CONSISTENCY
Used when tests are administered once.
INTERNAL CONSISTENCY
This model of reliability measures the internal consistency of the test, which is the degree to which each test item measures the same construct.
INTERNAL CONSISTENCY
If all items on a test measure the same construct, then it has good internal consistency.
Split Half Reliability
obtained by correlating two pairs of scores obtained from equivalent halves of a single test administered once.
INTERNAL CONSISTENCY: CRONBACH'S ALPHA
Used in tests with no right or wrong answers
INTERNAL CONSISTENCY: CRONBACH'S ALPHA
Used in personality tests and multiple-scored items.
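Cronbach's alpha can be computed from a single administration using the formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The Likert-type responses below are invented for illustration.

```python
import statistics

# Invented Likert responses: rows = respondents, columns = 4 items.
scores = [
    [4, 5, 4, 4],
    [2, 3, 2, 3],
    [5, 5, 4, 5],
    [3, 2, 3, 3],
    [1, 2, 2, 1],
]

k = len(scores[0])                                  # number of items
item_vars = [statistics.pvariance(col) for col in zip(*scores)]
total_var = statistics.pvariance([sum(row) for row in scores])

alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```

When the items move together (respondents who score high on one item score high on the others), the total-score variance is large relative to the item variances, and alpha is high.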
INTERNAL CONSISTENCY: KUDER RICHARDSON 20 (KR20)
used for calculating the reliability of a test in which the items are dichotomous, scored 0 or 1, with varying levels of difficulty
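KR-20 follows the same pattern as alpha but for right/wrong items: KR20 = k/(k-1) * (1 - sum(p*q) / variance of total scores), where p is the proportion who got an item right and q = 1 - p. The answer data are invented.

```python
import statistics

# Invented dichotomous scores: rows = examinees, columns = 5 items.
scores = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 0],
]

k = len(scores[0])
p = [sum(col) / len(scores) for col in zip(*scores)]  # item difficulty
pq = sum(pi * (1 - pi) for pi in p)
total_var = statistics.pvariance([sum(row) for row in scores])

kr20 = k / (k - 1) * (1 - pq / total_var)
print(f"KR-20 = {kr20:.2f}")
```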
INTERRATER RELIABILITY
individuals are assessed independently by observers or raters who make use of rating scales agreed beforehand.
INTERRATER RELIABILITY
Sufficient training is needed.
Also called scorer reliability, judge reliability, or observer reliability.
KAPPA STATISTIC
INTERRATER RELIABILITY
KAPPA STATISTIC
METHOD FOR ASSESSING THE LEVEL OF AGREEMENT AMONG SEVERAL OBSERVERS
Cohen's kappa
used to assess agreement between two raters
Fleiss' kappa
used to assess agreement among three or more raters.
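Cohen's kappa for two raters can be computed as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance. The ratings below are invented.

```python
from collections import Counter

# Invented categorical ratings from two independent raters.
rater1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
rater2 = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]

n = len(rater1)
p_o = sum(a == b for a, b in zip(rater1, rater2)) / n  # observed agreement

c1, c2 = Counter(rater1), Counter(rater2)
labels = set(rater1) | set(rater2)
p_e = sum((c1[l] / n) * (c2[l] / n) for l in labels)   # chance agreement

kappa = (p_o - p_e) / (1 - p_e)
print(f"Cohen's kappa = {kappa:.2f}")
```

Kappa is preferred over raw percent agreement because it discounts the agreement that would occur even if both raters guessed at random.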
VALIDITY
Is the degree to which the measurement process measures the variable that it claims to measure.
VALIDITY
To predict thinking/behaviour
VALIDITY
Extent to which the test is taken seriously by test-takers
CONSTRUCT IDENTIFICATION PROCEDURES
Convergent Validity
Discriminant Validity
Is the test measuring what it claims to measure?
CRITERION PREDICTION PROCEDURE
Predictive Validity
Concurrent Validity
Can a test predict future thinking and behaviour?
CONTENT DESCRIPTION PROCEDURES
Content Validity
Face Validity
Will the participants take the test seriously? is the test presentable?
Convergent Validity
correlate test to another established test (related or same construct)
Convergent Validity
Step 1: Find established test that is strongly related to your test's construct
Step 2: Administer the two tests (established test and your test) to the sample
Step 3: Correlate
Convergent Validity
Possible Outcomes:
i. Correlation coefficient is strong and positive - VALID
ii. Correlation coefficient is negative and strong - INVALID
iii. Correlation coefficient is weak - INVALID
Discriminant Validity
correlate test to unrelated construct
Discriminant Validity
Step 1: Find another established unrelated test
Step 2: Administer both tests
Step 3: Correlate
Discriminant Validity
Possible Outcomes:
i. Correlation coefficient is strong and positive - INVALID
ii. Correlation coefficient is negative and strong - INVALID
iii. Correlation coefficient is weak - VALID
Predictive Validity
test the ability of the test to predict a future criterion.
Predictive Validity
Step 1: Administer the test.
Step 2: Get results
Step 3: Select an appropriate future criterion - a concrete numerical expression or observable indicator of the test construct
Example: Test construct - Intelligence
Criterion - School GPA, IQ score
Step 4: Wait for the future criterion
Step 5: Correlate
Concurrent Validity
testing the ability of the test to predict using an already existing criterion.
Concurrent Validity
Step 1: Administer the test
Step 2: Get results
Step 3: Select existing criterion
Step 4: Correlate
Content Validity
examines the appropriateness of a psychological test's items. Do the items belong in the test? Are all the items in the test supposed to be there?
Face Validity
presentation or physical appearance of the psychological test. Is this test presentable to the test takers?
Face Validity
i. The test looks like a test.
ii. Age appropriateness (kids vs adults)
-Design
-Language difficulty
iii. Free from grammatical errors