SRDC Program Evaluation and Organizational Psychology — Comprehensive Notes

Overview

  • Podcast discussion in an organizational psychology context featuring Dr. Ray (Senior Research Associate at SRDC, the Social Research and Demonstration Corporation).

  • Focus: program evaluation within organizations, overlaps with organizational psychology, and career/educational pathways from a practitioner’s perspective.

  • Emphasis on translating course material to real-world client work and the value of independent, nonprofit evaluation input.

About SRDC and the Role of a Program Evaluator

  • SRDC is a nonprofit research and evaluation organization.

  • Distinctive position: independent voice on projects, which can provide credibility in fields where industry partners may be biased.

  • SRDC’s mandate is linked to social justice and equitable society outcomes.

  • Jennifer (Dr. Ray) joined SRDC in 2019; she previously worked as an evaluator at CAMH (Centre for Addiction and Mental Health) and was a PhD student in psychology at the University of Ottawa.

  • Work scope at SRDC includes homelessness and housing as a continuing focus, along with a range of topics beyond those areas.

  • Projects at SRDC range from short-term (as short as 3 months) to long-term (as long as 3 years). This mix supports collaboration with new teams on varied projects and emphasizes goal-oriented, team-based work.

How Program Evaluation Fits with Organizational Psychology

  • Program evaluators assess and improve programs within organizations, often evaluating structures, processes, and outcomes to help organizations meet their objectives.

  • Overlap with organizational psychology: focus on team dynamics, organizational processes, implementation fidelity, and practical outcomes that affect people in organizations.

  • Evaluations emphasize credible, evidence-based recommendations that clients can act on to improve programs and organizational functioning.

  • Value in understanding how research translates into practice, not just describing outcomes.

Pathways and Background: How Evaluation Skills Translate from Psychology Training

  • Core research skills (from experimental psychology): ask questions, collect data systematically, analyze data, and share findings.

  • Distinction: program evaluation is action-oriented with an audience seeking answers to specific questions, increasing motivation and relevance of the work.

  • Hands-on, practical experience enhances learning: community service learning, organizational placements, and internships.

  • Early exposure: Alliance to End Homelessness in Ottawa (community service learning) and internship at the Canadian Mental Health Association shaped the practical application of evaluation methods.

  • Supervisor influence: mentors with clinical psychology backgrounds promoted the value of program evaluation.

  • Experience at CAMH and SRDC allowed for a balance of research and evaluation work, with evaluation offering more collaborative, hands-on teamwork and stakeholder engagement.

Core Skills in Program Evaluation

  • Research-to-practice orientation: translating findings into actionable recommendations for clients.

  • Qualitative strengths: valued for adding depth and voice of stakeholders, especially in community research and participatory approaches.

  • Quantitative methods: used when appropriate; combine with qualitative approaches for a robust evidence base.

  • Stakeholder engagement: essential for buy-in, relevance, and credibility; aim for shared ownership and collaboration.

  • Independence and objectivity: an external evaluator can offer a confidential space in which stakeholders share candid insights.

  • Data legacy and dashboards: aim to leverage the data an organization already collects (legacy data) and help clients improve data collection, dashboards, and ongoing monitoring (a brief sketch follows this list).

  • Communication skills: translating findings into digestible, non-threatening formats; emphasize learning and action over blame.

  • Capacity building: focus on training and empowering organizations so they can continue evaluation and evidence-informed practice after the engagement ends.
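
The dashboard and data-legacy point above can be made concrete with a small computation. The following is a minimal sketch, assuming a hypothetical extract of program records with invented column names (intake_date, reached_target_group, housed_at_exit); it is not SRDC's tooling, just an illustration of leaving a client a repeatable indicator calculation.

```python
import pandas as pd

# Hypothetical legacy program records; in practice these would come from the
# organization's existing intake or case-management system.
records = pd.DataFrame({
    "intake_date": pd.to_datetime(
        ["2024-01-15", "2024-01-28", "2024-02-05", "2024-02-20", "2024-03-03"]
    ),
    "reached_target_group": [True, True, False, True, True],  # implementation indicator
    "housed_at_exit": [True, False, True, True, False],       # outcome indicator
})

# Roll raw records up into monthly indicators that the organization can keep
# refreshing after the evaluation ends (the "data legacy" idea).
monthly = (
    records
    .assign(month=records["intake_date"].dt.to_period("M"))
    .groupby("month")
    .agg(
        intakes=("reached_target_group", "size"),
        pct_target_group=("reached_target_group", "mean"),
        pct_housed_at_exit=("housed_at_exit", "mean"),
    )
)

print(monthly)
```

The specific indicators matter less than handing the organization a small, repeatable computation it can run on its own data after the engagement ends.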

Typical Evaluation Questions and Planning

  • Initial planning includes a site visit to understand the setting, program delivery, and context.

  • Identify all stakeholders: clients/participants, funders, frontline staff, managers, and other partners.

  • Compare what was intended with what is happening in practice and clarify what the client wants to learn.

  • Develop a small set of evaluation questions: typically 3 to 5 questions agreed upon by the group; they guide data collection and analysis (an illustrative planning structure follows this list).

  • Example of implementation vs outcomes focus:

    • Implementation questions: e.g., Is the program actually reaching the intended participants? Are program activities delivered as planned?

    • Outcomes questions: e.g., Are participants experiencing intended benefits? Are short-term changes translating into long-term impact?

  • Practical example in housing programs: assess whether participants facing high barriers (e.g., justice involvement) are being reached and supported effectively; measure intermediate outcomes like avoidance of new charges during program participation.

  • Concept of “story chapters”: select which aspects of the program story to tell, acknowledging that not all data can be covered.

  • Collaboration and consensus are crucial in setting questions and scope.
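
To make the 3-to-5-question structure concrete, below is a minimal sketch of an evaluation matrix linking each agreed question to its focus, indicators, and data sources. The questions, indicators, and sources shown are invented for illustration and are not drawn from a specific SRDC project.

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationQuestion:
    """One agreed-upon evaluation question and how it will be answered."""
    question: str
    focus: str                                        # "implementation" or "outcomes"
    indicators: list[str] = field(default_factory=list)
    data_sources: list[str] = field(default_factory=list)

# Illustrative three-question plan mixing implementation and outcomes foci.
evaluation_plan = [
    EvaluationQuestion(
        question="Is the program reaching participants facing high barriers?",
        focus="implementation",
        indicators=["share of intakes meeting eligibility criteria"],
        data_sources=["intake records", "staff interviews"],
    ),
    EvaluationQuestion(
        question="Are program activities delivered as planned?",
        focus="implementation",
        indicators=["sessions delivered vs. scheduled"],
        data_sources=["program logs", "site visit observations"],
    ),
    EvaluationQuestion(
        question="Are participants avoiding new charges while in the program?",
        focus="outcomes",
        indicators=["new charges per participant during enrolment"],
        data_sources=["administrative data", "participant interviews"],
    ),
]

for q in evaluation_plan:
    print(f"[{q.focus}] {q.question} -> sources: {', '.join(q.data_sources)}")
```

Keeping the set this small is what lets the group reach consensus on scope and keeps the data collection plan manageable.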

Methods: Data Collection and Right-Sizing Evidence

  • Mixed methods approach: use qualitative and quantitative tools to gather a comprehensive picture.

  • Qualitative data sources:

    • Interviews with clients, staff, and partners.

    • Site visits and observations of the intake and program delivery processes.

    • Group activities and focus groups when appropriate.

  • Quantitative data sources:

    • Existing program data, survey results, and other measurable indicators.

    • New data collection when needed to fill gaps or strengthen the evidence base.

  • Emphasis on independence and confidentiality to encourage honest feedback and richer data from participants.

  • “Legacy tools”: assess what data the organization already collects; recommend improvements and help build a durable data collection system (e.g., dashboards).

  • Iterative data collection: revisit data and refine questions as understanding deepens.

  • Data analysis and synthesis:

    • Integrate qualitative themes with quantitative findings to tell a coherent story (a brief synthesis sketch follows this list).

    • Focus on actionable insights and practical implications for program improvement.

  • Framing findings for action: present results in a nonthreatening way, emphasize learning opportunities, and outline concrete steps for improvement.

  • Sharing plan: determine what to share internally (organization) and externally (sector, funders) based on credibility and relevance.
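
A minimal sketch of the synthesis step referenced above: quantitative indicators and qualitative themes are lined up against the same evaluation questions so each chapter of the report can draw on both. The question keys, values, and theme labels below are invented for illustration.

```python
# Hypothetical quantitative indicators, keyed by evaluation question.
quant_findings = {
    "reach":    {"indicator": "share of intakes meeting eligibility", "value": 0.72},
    "outcomes": {"indicator": "participants with no new charges", "value": 0.85},
}

# Hypothetical qualitative themes from interviews, keyed the same way.
qual_findings = {
    "reach": ["referral partners were unsure of the eligibility criteria"],
    "outcomes": ["participants credited stable housing with keeping them out of court"],
}

# Triangulate: place both kinds of evidence side by side for each question so
# the written report can tell one coherent, action-oriented story per question.
for question in sorted(set(quant_findings) | set(qual_findings)):
    quant = quant_findings.get(question)
    themes = qual_findings.get(question, [])
    print(f"Question: {question}")
    if quant:
        print(f"  quantitative: {quant['indicator']} = {quant['value']:.0%}")
    for theme in themes:
        print(f"  qualitative theme: {theme}")
```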

The People Side: Stakeholders, Trust, and Collaboration

  • Building trust is essential and time-consuming; rushing collaboration can lead to resistance or misalignment.

  • Stakeholder alignment helps ensure that findings are credible and used for change.

  • Participatory and collaborative research practices help ensure buy-in and increase the likelihood of implementation of recommendations.

  • Consultant role includes understanding power dynamics, voice distribution, and frontline realities; ensure frontline staff are heard and represented in the evaluation process.

  • Trauma-informed practice considerations: staff wellbeing and boundaries influence the ability to deliver trauma-informed services to clients; burnout and mental health of staff are important evaluation foci because they affect program delivery.

  • Leadership connection: leaders should stay connected to frontline realities to maintain program fidelity and responsiveness.

Programs and Projects: Areas of Focus and Examples

  • Indigenous community skills training programs: upcoming site visits to observe intake, team dynamics, and graduation events; engage with community context and consider long-term impacts.

  • Transitional housing programs for justice-involved individuals: evaluate outcomes such as recidivism and housing stability while considering intermediate indicators like avoiding new charges.

  • Youth mentoring programs: another current evaluation focus.

  • Cross-cutting themes across projects: applicability to various settings, including homelessness, housing, youth, and community services.

  • Common program features evaluated:

    • Team dynamics, turnover, and organizational structure.

    • Training quality and staff readiness to deliver activities as planned.

    • Partnerships and referrals to external organizations, and how well these collaborations support client outcomes.

Organization and Practice: Link to Organizational Psychology Concepts

  • Organizational principles frequently arise in evaluations: team dynamics, turnover, absenteeism, leadership engagement, and alignment between stated values and frontline practices.

  • Drift between values and practice is a recurring theme: organizations may need to adjust practices or explicitly redefine the program to maintain alignment.

  • Trauma-informed practice is both a program-level and team-level concern; effective implementation requires a trauma-informed culture across staff and leadership.

  • The role of the outsider evaluator in surfacing questions and perspectives not readily visible to staff within the organization.

  • Recognizing that implementation challenges (e.g., staffing, leadership turnover, inter-organizational partnerships) are often root causes of suboptimal outcomes.

Practical Implications and Common Recommendations

  • Value-practice alignment: assess where there is drift between intended program principles and actual practice; decide whether to adapt practices or redefine the program to align with realities.

  • Focus on capacity building: emphasize learning and building organizational capacity so programs can continue effectively after the evaluation ends.

  • Address staff well-being: burnout and mental health of frontline workers influence program delivery; recommendations may include mitigations and supportive practices.

  • Trauma-informed leadership and operations: ensure leadership and teams practice trauma-informed approaches to enable better client outcomes.

  • Use a collaborative, trust-building approach: avoid an adversarial dynamic; foster a sense of shared ownership of the evaluation and its outcomes.

  • Deliver findings in accessible formats: present results in a way that is easy to understand, actionable, and framed as learning opportunities rather than criticisms.

  • Plan for legacy and ongoing use of data: help organizations implement dashboards and data collection practices that persist beyond the project.

Student Guidance: Skills to Build During Undergrad and Early Graduate Work

  • Prioritize hands-on experiences: seek real-world placements in organizations; community service learning; internships; volunteering in relevant settings (e.g., shelters).

  • Develop transferable skills:

    • Project management and work planning.

    • Group facilitation and leadership in group contexts.

    • Understanding human behavior within organizations and how plans translate into practice.

    • Adult learning principles and effective knowledge transfer for capacity building.

  • Emphasize collaborative learning and case-based challenges:

    • Participating in case competitions or evaluation-themed activities helps simulate real-world problem solving under time pressure.

    • Case competition example: a Canadian Evaluation Society event where teams produce an evaluation plan in a single day (about 5 hours).

  • Build comfort with both qualitative and quantitative methods; gain experience in interviewing, observation, and analysis; learn to triangulate data sources.

  • Seek opportunities to learn about organizational structures and roles; understand partnerships and networked services in community settings.

  • Balance theory with practice: use internships and service-learning to connect classroom concepts with frontline realities.

Personal and Professional Reflections on the Work

  • Meeting people, hearing diverse perspectives, and learning about different roles are valued aspects of the work.

  • Travel and work in diverse communities (e.g., Inuit communities) enrich perspective and improve future project work.

  • Outsider perspective can surface important questions and prompt new insights for project teams.

  • Long-term learning: cumulative experiences build a broader contextual understanding that strengthens future evaluations and organizational work.

Case Competitions and Lab Experiences (Mentions for Students)

  • Case competition described: Canadian Evaluation Society event for evaluation students.

  • Structure: teams with a coach; one day to create an evaluation plan from a confidential case scenario; multistep collaboration and time pressure; templates and tools help pace the process.

  • The experience is valuable for building practical evaluation skills beyond coursework; it is often used by instructors, and later in one's career, as a teaching and mentoring activity.

Final Takeaways

  • Program evaluation sits at the intersection of research, organizational psychology, and practical change management.

  • The most impactful evaluations blend rigorous data with stakeholder engagement, trust-building, and a focus on learning and action.

  • Skills to cultivate include project management, facilitation, qualitative methods, data-informed decision making, and capacity-building approaches.

  • Real-world exposure (placements, internships, volunteer work) is essential for developing the practical instincts needed in evaluation roles.

  • A thoughtful, collaborative approach with clear questions and a plan for data collection and reporting increases the likelihood that findings will be used to improve programs and organizational practices.

Notes on Scope and Next Steps

  • Jennifer highlighted ongoing and upcoming projects, including Indigenous community programs and housing initiatives, illustrating the breadth of program evaluation applications.

  • Students should consider pursuing hands-on opportunities and case-based learning to prepare for graduate studies or careers in program evaluation, organizational psychology, and related fields.

  • The discussion ends with appreciation for the learning exchange and notes that another podcast episode was planned.

Key Figures and Models Mentioned

  • Project duration at SRDC: roughly 3 months to 3 years.

  • Evaluation questions per engagement: typically 3 to 5.

  • Case competition length: about 5 hours in a single day.

  • Also mentioned: a 4-week timeframe and 3 site visits.

  • Logic model chain: Inputs → Activities → Outputs → Outcomes.

  • Program outcome dependency: Outcome = f(Inputs, Activities, Outputs).