APSACS SLO-Based Assessment Facilitation Booklet — Comprehensive Study Notes

Background & Objectives

  • APSACS (Army Public Schools & Colleges System) recognised long-standing issues with teacher-made tests that rewarded rote recall rather than higher cognition.
  • The booklet responds to Sustainable Development Goal 4 (SDG-4) by shifting the system to skill-based, SLO-driven assessment.
  • Core purposes
    • Standardise assessment quality across all campuses.
    • Provide actionable feedback that improves instruction, curriculum and learner support.
    • Empower teachers to:
      • Identify cognitive level with Bloom’s Taxonomy
      • Craft questions for each level
      • Weigh strengths/weaknesses of objective vs. subjective items
      • Embed frequent classroom assessments
      • Build fair rubrics & give timely feedback.

APSACS Assessment Framework

  • Assessment = third pillar of the instructional cycle (after Planning & Delivery).
  • Three-pronged function
    1. Monitor – track progress vs. learning objectives.
    2. Interpret – provide data that prompt instructional adjustments.
    3. Enhance – feed learners with constructive feedback for self-improvement.
  • Requires valid, reliable and actionable evidence of SLO mastery in authentic contexts.

Student Learning Outcomes (SLOs)

  • Clear statements of the knowledge (cognitive), skills (psychomotor) and attitudes (affective) learners will demonstrate after instruction.
  • Two-part syntax ≈ verb + content (e.g., “Analyse causes of evaporation”).

Bloom’s Taxonomy – Core Theory

  • Developed to create a common language for test design.
  • APSACS aligns all assessments and SLOs to Bloom’s hierarchy.

Cognitive Domain (Thinking)

  1. Remember – recall facts.
  2. Understand – explain, summarise.
  3. Apply – use in new situations.
  4. Analyse – break apart & examine relationships.
  5. Evaluate – judge value with criteria.
  6. Create – generate original product/idea.

Affective Domain (Feeling)

  • Levels: Receiving → Responding → Valuing → Organising → Characterising.
  • Assessed via observations, peer feedback, role-play, service learning, portfolios.

Psychomotor Domain (Doing)

  • Levels: Imitation → Manipulation → Precision → Articulation → Naturalisation.
  • Assessed via practicals, demonstrations & continuous performance logs.

Command Words (By Cognitive Level)

Knowledge

  • Define, list, state, mention …

Understanding

  • Classify, compare/contrast, describe, differentiate, discuss, elaborate, explain, identify, paraphrase …

Application & Higher Order Skills

  • Analyse, apply, assess, calculate, construct, create, deduce, evaluate, formulate, illustrate, infer, justify, predict, recommend, solve, synthesise, verify …

Blooming Applications (Digital Tools)

  • Formative tech suggestions
    • Google Forms (quizzes)
    • Jamboard (interactive boards)
    • Kahoot (gamified quiz)
    • Padlet (exit tickets)
    • Flipgrid (peer video feedback)
    • RubiStar (digital rubrics)
    • Vocaroo (audio responses)
    • Seesaw (e-portfolios)

Additional Verbs Across Levels

  • Remembering: recall, locate, outline, match …
  • Understanding: interpret, translate, summarise …
  • Applying: implement, demonstrate, operate …
  • Analysing: categorise, deconstruct, probe …
  • Evaluating: appraise, defend, critique …
  • Creating: devise, plan, modify …

Assessment Development

Qualities of a Good Test

  • Reliability – consistent scores.
  • Validity – measures intended SLO.
  • Objectivity – impartial marking.
  • Clarity & Coverage – unambiguous items spanning curriculum.
  • Practicality – feasible within time/resources.

Alignment Procedure

  1. Identify SLO’s cognitive level.
  2. Choose a verb that matches Bloom level.
  3. Write an item whose task mirrors that verb (SLO → Verb → Question).

Equitable Content Spread

  • Map each chapter/unit against marks & cognitive levels → build a matrix ensuring proportional representation.
  • Prioritise higher-order questions; discourage over-use of recall.
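The chapter-by-level matrix above can be kept as a small table and checked mechanically: row sums give per-unit marks, column sums show how heavily recall is weighted. A minimal sketch with illustrative numbers (the units, bands, and marks are hypothetical, not from the booklet):

```python
# Content-spread matrix: rows are units, columns are broad cognitive
# bands, cells are marks allotted. All figures are illustrative.
blueprint = {
    "Unit 1": {"knowledge": 2, "understanding": 4, "application+": 4},
    "Unit 2": {"knowledge": 1, "understanding": 3, "application+": 6},
    "Unit 3": {"knowledge": 2, "understanding": 3, "application+": 5},
}

# Paper total = sum of every cell in the matrix.
paper_total = sum(sum(row.values()) for row in blueprint.values())

# Share of marks spent on pure recall, to discourage over-use.
recall_marks = sum(row["knowledge"] for row in blueprint.values())
recall_share = recall_marks / paper_total

print(f"Total marks: {paper_total}")        # 30
print(f"Recall share: {recall_share:.0%}")  # 17%
```

Flagging papers whose recall share exceeds an agreed ceiling is one way to operationalise "discourage over-use of recall".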

Objective-Item DOs & DON’Ts

DO

  • Single correct answer, uniform stem length, simple language, SLO-based.

DON’T

  • Tricky wording, textbook verbatim, “all/none of the above”, grammatical clues.

Subjective-Item Principles

  • Require constructed response; allow multiple valid answers; weightage & rubric disclosed; target higher cognition.

Bad vs. Good Item Examples (condensed)

  • “What is a cell?” → mere recall.
  • Improved: “Explain how cells maintain human health.”

Rubric Development

Core Principles

  • Connect directly to learning objectives.
  • Observable language, student-friendly vocabulary.
  • Three clear performance levels recommended (Excellent / Average / Unsatisfactory).

5-Step Creation Process

  1. Decide what the rubric will assess.
  2. Select essential criteria.
  3. Decide scale (e.g., 3 levels).
  4. Describe work at each level (start with highest and lowest).
  5. Pilot, revise for clarity.

Pitfalls to Avoid

  • Over-complexity, vague terms, unrealistic expectations, uneven level gaps, off-target criteria, one-rubric-fits-all.

Benefits

  • Teachers: consistency, efficiency, targeted feedback.
  • Students: transparency, self-assessment, growth.
  • Parents: objective explanation of grades.

Sample Assessment Blueprints (Illustrative)

  • English Grade III comprehension (10 marks): MCQs, short answer, sentence completion, matching + rubric grid.
  • Math Grade V fractions: matching, ordering, word-problem with worked-marks rubric.
  • Science Grade IV animal structure: list, explain, classify + analytic rubric.
  • ICT Grade VI practical: open file, identify interface, data entry, save, edit with step rubric.

Independent Work (IW)

  • Definition: Classroom task performed individually, after guided practice, to evidence autonomous application.
  • Benefits: self-regulation, ownership, transfer, prep for higher studies.

Conduct Guidelines (All Grades)

  • Assign only after concept clarity.
  • Must differ from routine practice.
  • Teacher shares success criteria upfront.
  • No help from peers/teacher during IW.

Acceptable IW Task Types

  • Reading comprehension, creative writing, literature response, subject-specific application.

Rubrics

Class IV-V (5 marks)
  • Follows instructions (1), concept understanding (1), accuracy (1), effort (1), language (1).

Class VI-VIII (10 marks)
  • Format (2), concept application (2), relevance (2), creativity (2), language (2).
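Totalling marks against either rubric is simple arithmetic, but capping each criterion at its maximum avoids clerical over-awards. A minimal sketch for the Class VI-VIII rubric (the sample scores are hypothetical):

```python
# Class VI-VIII Independent Work rubric: 2 marks per criterion, 10 total.
RUBRIC_VI_VIII = {"format": 2, "concept application": 2,
                  "relevance": 2, "creativity": 2, "language": 2}

def total_score(awarded: dict[str, float], rubric: dict[str, int]) -> float:
    """Sum awarded marks, capping each criterion at its rubric maximum."""
    return sum(min(awarded.get(criterion, 0), cap)
               for criterion, cap in rubric.items())

# Hypothetical student scores against the five criteria.
sample = {"format": 2, "concept application": 1.5,
          "relevance": 2, "creativity": 1, "language": 2}
print(total_score(sample, RUBRIC_VI_VIII))  # 8.5
```

The same function covers the Class IV-V rubric by passing a five-criterion dict with 1-mark caps.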

Missed Assessment Protocol

  • On return, student completes alternate IW in class; scores recorded under “Missed Assessment”.

Checklist for Assessment Designers (for Section Heads / Subject Coordinators)

  • Alignment with SLOs ✔️
  • Correct verb & cognitive level ✔️
  • Error-free & bias-free language ✔️
  • Targets concept, not trivia ✔️
  • Age-appropriate & clear ✔️
  • Marking proportionate ✔️
  • Sub-parts guide required answer ✔️
  • Rubric rewards achievement vs. penalising error ✔️

References (Key)

  • National Assessment Framework – IBCC.
  • Command Words – AKU-EB.
  • FBISE SLO Exam Guidelines.
  • Bloom, Anderson & Krathwohl; Marzano & Kendall New Taxonomy.
  • International Baccalaureate MYP Command Terms.
  • Digital Bloom’s Taxonomy (Anderson, 2007 wiki).