Thesis


29 Terms

1. Now that your study is completed, how did the actual implementation validate or challenge your original assumptions?

Our initial assumption—that students were aware of energy policies through AFYOP—was only partially validated. While awareness was moderate to high, actual behavioral adherence was lower, showing a gap between knowledge and practice. This confirmed that awareness alone isn't enough—nudges must also target ability and motivation, as the FBM predicts.

2. What unexpected challenges did you encounter when translating your theoretical framework into actual data collection?

One challenge was operationalizing abstract FBM variables into measurable survey indicators. Students also tended to overreport "green" behaviors due to social desirability bias. We mitigated this by cross-verifying awareness and adherence responses for consistency.

3. Why did you decide to keep the focus on AdDU instead of expanding to other universities after your proposal defense?

We wanted to maintain internal validity and control for institutional culture, since AdDU's sustainability policies are already structured and integrated into student life. Expanding outward risked introducing policy heterogeneity. Focusing on one campus allowed a more precise behavioral analysis.

4. How did your findings strengthen AdDU's sustainability mission or provide feedback to institutional policy?

Our results identified which nudges—particularly priming cues—were most behaviorally effective, allowing policy refinement without additional financial burden. The findings provide evidence-based feedback to Ecoteneo and the PPO for improving campaign placements and visibility. It reinforces that behavioral design matters as much as policy content.

5. Looking back, do you think nudging is enough for behavior change in a university setting?

Nudges are a strong starting point but not sufficient alone. Long-term change requires institutional reinforcement, consistent prompts, and social modeling. Nudges work best when embedded within a broader culture of environmental accountability.

6. How did you ensure the reliability and validity of your survey instrument?

We adapted indicators from validated FBM-based instruments and had them reviewed by behavioral and economics faculty. A pilot test helped refine ambiguous questions, and Cronbach's alpha confirmed internal consistency. This ensured both construct validity and statistical reliability.
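The internal-consistency check mentioned above can be reproduced in miniature. The Likert-scale responses below are invented for illustration; only the formula (Cronbach's alpha for a multi-item scale) is taken from the answer.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 6 students x 4 survey items
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [3, 4, 3, 3],
])
alpha = cronbach_alpha(scores)
```

Values of alpha above roughly 0.7 are conventionally read as acceptable internal consistency, which is the threshold a pilot test like the one described would typically check against.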

7. Why did you stick with a cross-sectional design instead of longitudinal tracking?

Given academic time constraints, longitudinal tracking was impractical. However, our model's inferential structure allowed predictive interpretation of behavioral patterns at one point in time. Future research could build on this with repeated measures to observe behavior over time.

8. Can you explain how the logit model captured behavioral change?

The logit model estimated the probability that a student exhibits energy-saving behavior (1) given their motivation, ability, and prompt scores. Coefficients showed the strength of influence of each variable. The model's predictive accuracy demonstrated that institutional prompts significantly increase behavioral likelihood.
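The mechanics of that probability estimate can be sketched with made-up numbers. The coefficients below are illustrative only, not the study's estimates; they simply show how a logit model turns motivation, ability, and prompt scores into a probability of energy-saving behavior.

```python
import math

# Hypothetical logit coefficients (for illustration, not the study's estimates)
INTERCEPT = -3.0
B_MOTIVATION, B_ABILITY, B_PROMPT = 0.4, 0.5, 0.9

def p_energy_saving(motivation: float, ability: float, prompt: float) -> float:
    """Predicted probability that a student exhibits energy-saving behavior (Y = 1)."""
    z = INTERCEPT + B_MOTIVATION * motivation + B_ABILITY * ability + B_PROMPT * prompt
    return 1 / (1 + math.exp(-z))  # logistic link maps z to (0, 1)

# Same moderate motivation and ability, but weak vs strong prompt exposure
low_prompt = p_energy_saving(3, 3, 1)
high_prompt = p_energy_saving(3, 3, 5)
```

With these illustrative values, raising prompt exposure from 1 to 5 lifts the predicted probability from roughly 0.65 to above 0.98, mirroring the claim that institutional prompts significantly increase behavioral likelihood.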

9. How did you control for possible biases in self-reported behavior?

We included awareness and perception variables as control factors and used consistency checks between sections. Patterns of contradiction (e.g., high awareness but zero adherence) helped flag overreporting. We also anonymized responses to reduce social desirability bias.

10. Why was stratified sampling still appropriate post-proposal?

It allowed proportional representation across AdDU clusters, preventing academic program bias. Maintaining the same sampling framework ensured continuity and comparability from the approved proposal to final execution. This preserved methodological rigor and balanced respondent diversity.
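Proportional allocation across strata can be sketched as follows. The cluster names beyond SEA and SAS, the enrolment figures, and the sample size of 400 are all hypothetical; the point is only the allocation rule (each stratum's sample share equals its population share).

```python
# Hypothetical cluster sizes (not AdDU's actual enrolment figures)
clusters = {"SEA": 1200, "SAS": 1500, "SBG": 900, "SON": 400}
total_sample = 400  # hypothetical overall sample size

population = sum(clusters.values())
# Proportional allocation: stratum sample ∝ stratum population
allocation = {name: round(total_sample * size / population)
              for name, size in clusters.items()}
```

In general, rounding can make the per-stratum counts miss the target total by one or two respondents, in which case a largest-remainder adjustment is applied before fieldwork.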

11. Which factor among motivation, ability, and prompt had the strongest effect?

Prompts had the highest predictive power, particularly environmental cues and visible reminders. This validates the Fogg Model's principle that even with moderate motivation, timely prompts can trigger behavior. Essentially, the environment "nudged" more effectively than self-reported values did.

12. Were there differences across schools or year levels?

Yes. Students from SEA and SAS showed higher adherence, possibly due to more exposure to sustainability content in coursework. Senior students also demonstrated slightly higher behavioral consistency, likely due to longer institutional immersion.

13. How did the results compare between social norm and priming nudges?

Priming nudges outperformed social norm nudges in effect size and perceived influence. Visual cues like posters and reminders were more consistently internalized than peer-based behavioral comparisons. This suggests that physical environment design has more staying power than social influence alone.

14. Did your regression model encounter any major statistical issues?

No significant multicollinearity or heteroskedasticity was detected. Model fit was within acceptable pseudo-R² bounds, and all key predictors were significant at the 0.05 level. This strengthens confidence in our results' robustness.

15. What behavioral trend surprised you the most?

Students rated environmental campaigns as "important" yet rarely followed through with daily actions. This mismatch underscored how awareness campaigns need direct prompts to translate intention into practice. It reinforced that motivation without simplicity or reminders leads to behavioral inertia.

16. How do your results inform actual policy improvement at AdDU?

They provide clear evidence that behaviorally informed cues—like sign placements and targeted campaigns—produce measurable behavioral outcomes. AdDU can integrate these findings by redesigning communication strategies and default settings in facilities. It's a policy-level justification for using behavioral insights in sustainability management.

17. If you were to redo the study, what would you change?

We'd add a longitudinal component to track whether behavior persists over time and incorporate experimental manipulations of nudge exposure. This would strengthen causal inference and minimize self-report bias. It would also allow us to capture adaptation and "nudge fatigue."

18. What broader implications do your results have beyond AdDU?

The findings can guide other universities or local government offices aiming to implement low-cost, behavior-based energy policies. They show that cultural framing and behavioral design can significantly enhance sustainability compliance. This aligns with SDG 7 and behavioral governance strategies.

19. What was the strongest validation of your hypothesis?

Our regression confirmed that nudges significantly influence the likelihood of energy-saving behavior, rejecting the null hypothesis. Both social norm and priming interventions had distinct but positive effects. It quantitatively verified that behavioral design can drive measurable environmental impact.

20. What's your main takeaway as researchers after completing this study?

Behavioral change isn't just about awareness—it's about design. Even small adjustments in context or messaging can shift collective habits. For us, it proved that sustainable development is as much about psychology and behavior as it is about economics and technology.

21. How did you ensure that your variables did not suffer from multicollinearity?

We ran a Variance Inflation Factor (VIF) test, and all values were below the threshold of 5. This confirmed that motivation, ability, and prompt were distinct constructs without strong interdependence. Hence, the regression coefficients remained interpretable and unbiased.
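A VIF check of the kind described can be sketched directly: regress each predictor on the others and compute VIF = 1 / (1 − R²). The data below are simulated stand-ins for the motivation, ability, and prompt scores, not the study's survey responses.

```python
import numpy as np

def vif(X: np.ndarray) -> np.ndarray:
    """Variance Inflation Factor for each column of X (n_obs, n_predictors)."""
    n, k = X.shape
    out = np.empty(k)
    for j in range(k):
        y = X[:, j]
        # Regress column j on the remaining columns (plus an intercept)
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        r2 = 1 - ((y - others @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
        out[j] = 1 / (1 - r2)
    return out

# Simulated predictors standing in for motivation, ability, and prompt scores
rng = np.random.default_rng(42)
X = rng.normal(size=(60, 3))
vifs = vif(X)
```

With independent simulated columns the VIFs come out near 1, well under the threshold of 5 cited in the answer; correlated predictors would push them upward.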

22. Did your results support the concept of “nudge fatigue”?

Slightly. Some respondents reported that repetitive visuals became less noticeable over time. It suggests that nudge effectiveness may plateau unless the cues are periodically refreshed or combined with motivational reinforcement.

23. What behavioral trend stood out the most in your findings?

Students tended to overrate their own adherence to sustainable practices. This self-enhancement bias, common in environmental behavior studies, revealed a perception gap between what students believe they do and what they actually practice. It highlights the need for more objective behavioral monitoring.

24. How did motivation specifically influence energy-saving actions?

Motivation alone was not a strong standalone predictor, but it significantly interacted with prompts. Highly motivated students responded more consistently to visible cues. This shows that motivation amplifies, rather than replaces, the effects of environmental design.

25. Did income or expenditure levels correlate with adherence?

Not significantly. Energy-saving behaviors like turning off lights or avoiding unnecessary elevator use require little to no cost, so they are not strongly tied to income. This implies that behavioral change in this context is more psychological than economic.

26. How did qualitative responses support your quantitative findings?

The qualitative data from open-ended survey items echoed our regression results—students frequently cited posters, reminders, and campaigns as the main influences on their actions. Comments also revealed that peer imitation and habit formation reinforced the nudges’ effects. This triangulation strengthens the internal validity of our findings.

27. How do you interpret the difference between policy-based and behavior-based efficacy ratings?

Behavior-based interventions (like visual cues) scored higher than policy memorandums alone. It suggests that passive policy communication is less impactful without contextual reinforcement. In short, visible action reminders outperform administrative directives in driving real behavior.

28. What did the test comparing policy measures and behavioral nudges show?

The t-test indicated a significant difference (p < 0.05), showing that behavioral nudges received higher mean efficacy scores than policy measures. This suggests that subtle, behavior-based interventions resonate more with students than top-down institutional rules.
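The comparison can be sketched as a two-sample t-test on efficacy ratings. The answer does not specify the test variant, so this sketch uses Welch's unequal-variance form, and the 5-point ratings below are invented for illustration.

```python
import math
import statistics as st

def welch_t(a: list, b: list) -> tuple:
    """Welch's two-sample t statistic and approximate degrees of freedom."""
    na, nb = len(a), len(b)
    va, vb = st.variance(a), st.variance(b)  # sample variances
    t = (st.mean(a) - st.mean(b)) / math.sqrt(va / na + vb / nb)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (va / na + vb / nb) ** 2 / (
        (va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical 5-point efficacy ratings (illustrative, not the study's data)
nudge_ratings = [4, 5, 4, 4, 5, 4, 3, 5, 4, 4]
policy_ratings = [3, 3, 4, 2, 3, 4, 3, 2, 3, 3]
t_stat, dof = welch_t(nudge_ratings, policy_ratings)
```

With these illustrative ratings the t statistic exceeds the approximate 0.05 critical value of about 2.1 at roughly 18 degrees of freedom, the same qualitative conclusion (p < 0.05, nudges rated higher) reported in the answer.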

29. What test did you use to compare the two types of nudges?

We used a One-way ANOVA to test for significant mean differences between social-norm and priming nudges. Both variables were continuous and derived from separate clusters of survey items, so ANOVA was ideal for comparing their perceived efficacy.
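The one-way ANOVA described can be sketched from first principles: the F statistic is the ratio of between-group to within-group variance. The efficacy scores below are invented for illustration, not the study's data.

```python
import statistics as st

def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: between-group vs. within-group variance."""
    all_vals = [x for g in groups for x in g]
    grand_mean = st.mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (st.mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - st.mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical perceived-efficacy scores for the two nudge types
priming_scores = [5, 4, 5, 4, 4, 5]
social_norm_scores = [3, 4, 3, 4, 3, 3]
f_stat = one_way_anova_f(priming_scores, social_norm_scores)
```

One design note: with exactly two groups, a one-way ANOVA is mathematically equivalent to an independent-samples t-test (F = t²), so the choice between them here is one of convention rather than statistical power.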
