Flashcards from the D293 Assessment and Learning Analytics Glossary
Ipsative Assessment
Measures student learning against past knowledge or performance. It can help motivate learners by showing the distance progressed, even if the learner's performance is not yet at the level of an expert or some classmates.
Formative Assessment
Used to determine learning as we instruct and adjust content complexity and instruction.
Summative Assessment
Used to determine learning at the end in comparison to the objectives.
Diagnostic Assessment
Designed to test a learner’s knowledge BEFORE they begin an activity or lesson; often referred to as pre-assessment.
Direct Assessment
Used to evaluate a learner's understanding of a concept, achievement of a learning objective, or completion of a goal through direct evaluation of the learner's work.
Indirect Assessment
Does not look at learners' actual work but instead uses information gathered from other sources, such as attendance or time on task.
Competency-based Assessment
Focuses on skills more than knowledge; often an example of authentic assessment centering on learners applying skills and knowledge.
Comprehensive Assessment
Provides various ways for the instructor to monitor a learner's academic achievement and progress; includes benchmark, formative, summative, and diagnostic assessments.
Authentic Assessment
Involves the application of knowledge and skills in real-world situations, scenarios, or problems, creating a student-centered learning experience.
Criterion-referenced Assessment
Measures student learning based on a concrete learning standard, objective, or outcome.
Norm-referenced Assessment
Uses assessment score results to create a comparative score of how learners did relative to the scores of other learners (e.g., grading on a curve).
Reflection-focused Assessment
Allows the learner to look back and reflect on the learning experiences, promoting self-regulation of learning.
Project-based Assessment
A concrete way to assess the learner other than a test, often involving creative outlets; an example is asking students to create a website.
Psychomotor Domain
Focuses on physical skills; includes perception, set, guided response, mechanism, complex overt response, adaptation, and origination.
Affective Domain
Includes five areas of emotional response, categorized as simple to complex ways of processing feelings and attitude: receiving, responding, valuing, organizing, and characterizing.
Cognitive Domain
Develops six areas of intellectual skills that build sequentially from simple to complex behaviors: remembering, understanding, applying, analyzing, evaluating, and creating.
Descriptive Data Analytics
Can tell us “what happened”; the most common type of analytics used in schools, drawing on data such as test scores, attendance records, and feedback surveys.
Diagnostic Data Analytics
Examines data or content to answer the question, “Why did it happen?”; characterized by techniques such as drill-down, data discovery, and data mining.
Predictive Data Analytics
Offers insights into “what is likely to happen”; uses the results of descriptive and diagnostic analytics to predict future trends.
Prescriptive Data Analytics
Analyzes data to determine “What should be done?”; relies not only on the quality of the analytics but also on their accuracy to ensure well-informed decisions.
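To make the contrast between these four types concrete, here is a minimal Python sketch assuming invented chapter-test scores and a hypothetical passing cutoff; it answers the descriptive and diagnostic questions, while a companion sketch after the goal examples near the end of this glossary covers the predictive and prescriptive ones.

```python
# Minimal sketch: descriptive vs. diagnostic analytics on hypothetical
# chapter-test scores. All names and numbers are invented for illustration.
scores = {
    "Chapter 1": [82, 74, 91, 65, 88, 70],
    "Chapter 2": [68, 71, 85, 59, 77, 62],
}
PASSING = 70  # hypothetical cutoff

# Descriptive: "What happened?" -- summarize the raw scores.
for chapter, vals in scores.items():
    mean = sum(vals) / len(vals)
    pass_rate = sum(v >= PASSING for v in vals) / len(vals)
    print(f"{chapter}: mean={mean:.1f}, pass rate={pass_rate:.0%}")

# Diagnostic: "Why did it happen?" -- drill down into the score gap between
# chapters (a simple data-discovery step).
gap = (sum(scores["Chapter 1"]) - sum(scores["Chapter 2"])) / len(scores["Chapter 1"])
print(f"Chapter 2 averaged {gap:.1f} points lower; review its content difficulty.")
```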
Quantitative Analysis
Based on objective, numerical data and statistics; often used in descriptive analytics to determine what happened.
Qualitative Analysis
Based on non-numerical information, such as observations, reflections, and interviews; often used in diagnostic analytics to determine why something happened.
Social Network Analysis
The study of patterns or trends in relationships among groups of learners or between learners and instructors; used in predictive analytics to predict future interactions.
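As a rough sketch of how such an analysis might start, the following Python snippet builds an interaction graph from an invented discussion-reply log (all names and edges are hypothetical) and computes degree centrality, one simple measure of which learners are hubs in the network.

```python
# Minimal sketch of social network analysis on a hypothetical reply log.
from collections import defaultdict

replies = [("Ana", "Ben"), ("Ben", "Ana"), ("Cai", "Ana"),
           ("Dee", "Ben"), ("Ana", "Dee"), ("Cai", "Ben")]

# Build an undirected interaction graph: who has exchanged replies with whom.
neighbors = defaultdict(set)
for a, b in replies:
    neighbors[a].add(b)
    neighbors[b].add(a)

# Degree centrality: learners with many distinct contacts are likely hubs,
# a simple signal predictive analytics might use to forecast future interaction.
n = len(neighbors)
for learner, contacts in sorted(neighbors.items(), key=lambda kv: -len(kv[1])):
    print(f"{learner}: degree={len(contacts)}, centrality={len(contacts)/(n-1):.2f}")
```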
Data for Research
Used to gather new data and test new theories, collecting more data than necessary.
Data for Accountability
Used to evaluate, rate, or rank performance, collecting all recent and relevant data available.
Data for Improvement
Used to observe student performance to answer questions about the effectiveness of instruction, collecting “just enough” data.
Task-level Feedback
Tells how well the learner has performed a specific task; effective when distinguishing correct from incorrect answers but not effective over time.
Process-level Feedback
Specific to the processes used for tasks; challenges the student to form a deeper understanding of the learning and encourages them to construct meaning on their own.
Regulatory-level Feedback
Addresses the way the learner examines and adjusts their actions toward their learning goal; encourages the learner to self-assess to a more critical degree.
Self-level Feedback
Focuses more on the person than on the work and is the least effective; typically builds a student up but gives them little information for improvement.
Actionable Feedback
Should help the individual understand what they need to do differently and how to do it; provides precise details about what was done well or what needs improvement.
Mastery-oriented Feedback (UDL)
Should “guide learners toward mastery rather than a fixed notion of performance or compliance; towards successful long-term habits and learning practices”.
Multiple Means of Representation (UDL)
Offering information in more than one format (e.g., text, audio, video, hands-on learning).
Multiple Means of Action and Expression (UDL)
Providing learners with more than one way to interact with the material and to show what they’ve learned (e.g., a pencil-and-paper test, an oral presentation, a group project).
Multiple Means of Engagement (UDL)
Offering multiple ways to motivate students (e.g., letting them make choices, giving them assignments that feel relevant).
Perceivable (POUR)
Means the user can identify content and interface elements by means of the senses; perceiving through sight, sound, touch, smell, or taste.
Operable (POUR)
Means that a user can successfully use controls, buttons, navigation, and other necessary interactive elements (e.g., clicking, tapping, swiping, using keyboard or voice commands).
Understandable (POUR)
Technology that is consistent in its presentation and format, predictable in its design and usage patterns, concise, multimodal, and appropriate to the audience in its voice and tone.
Robust (POUR)
IT that is standards-compliant and designed to function on all appropriate technologies, allowing users to choose the technology they use to interact with information.
Construct-validity Bias
Refers to whether a test accurately measures what it was designed to measure; scores may reflect language skills rather than the academic abilities being tested.
Content-validity Bias
Occurs when the content of a test is comparatively more difficult for one group of students than for others due to unequal opportunity to learn or unfair scoring.
Predictive-validity Bias
Refers to a test’s accuracy in predicting how well a certain student group will perform in the future; an unbiased test predicts future performance equally well for all groups.
Intrinsic Load
Refers to the complexity of what you are learning, including the amount of new information and how it all interacts; an essential part of the learning task over which we don’t have control.
Germane Load
Refers to the effort needed to use memory and intelligence to process information into schemas; how we process new information into long-term memory.
Extraneous Load
Results when a learning experience is unnecessarily difficult or confusing, using up cognitive resources that learners could direct at the learning task; a result of poor learning design.
Data-ownership and Control
Institutions should be aware of issues around third-party sharing, especially since sharing might include student data; involves the question of who owns the data.
Transparency
Institutional transparency might best begin by making clear to students and to other stakeholders the purpose of learning analytics; relates primarily to how student data is collected, analyzed and used.
Consent
Consent to collect student data should be sought at the point of registration, with transparency about its use and, potentially, a later option to withdraw consent.
Valid and Reliable Data
The institution needs to ensure that data collected and analyzed is accurate and representative of the issue being measured; the results should be transparent and clearly understood.
Audio Feedback
Feedback that can be recorded in many learning management systems and left directly in assignments for learners to review; needs to be recorded at an appropriate pace and feature clear pronunciation.
Discussion Feedback
Instructors can offer feedback to the class as a whole or to individual students; allows instructors to encourage students to engage more deeply with the material and with each other.
Email Feedback
Written feedback that can be delivered to students via email, allowing them to access this feedback outside of their time spent in the university’s system.
Peer Feedback
Can occur actively in discussions, through the content-sharing features of learning management systems, and when group projects are assigned; students evaluate their peers' work.
Screen-sharing Feedback
Instructors can offer synchronous or asynchronous feedback while sharing screens with students; the recording needs to be high quality.
Remember (Bloom's Taxonomy)
Retrieve relevant knowledge from long-term memory.
Understand (Bloom's Taxonomy)
Construct meaning from instructional messages, including oral, written and graphic communication.
Apply (Bloom's Taxonomy)
Carry out or use a procedure in a given situation.
Analyze (Bloom's Taxonomy)
Break material into foundational parts and determine how parts relate to one another and the overall structure or purpose.
Evaluate (Bloom's Taxonomy)
Make judgments based on criteria and standards.
Create (Bloom's Taxonomy)
Put elements together to form a coherent whole; reorganize into a new pattern or structure.
Criterion-referenced (Assessment Strategy)
Assessment to see if a learner has met predetermined milestones and requirements.
Ipsative (Assessment Strategy)
Assessment to see if a learner has improved based on previous knowledge.
Norm-referenced (Assessment Strategy)
Assessment to see how a learner's work compares to the average work completed by a similar group of learners.
Standards-based (Assessment Strategy)
Assessment to see if a learner can meet requirements and demonstrate mastery of knowledge based on a predetermined standard.
Traditional (Assessment Strategy)
Assessment to see if a learner can meet the requirements based on memorization of data and facts.
Descriptive (Purpose of Learning Analytics Type)
Used to inform; based on data gathered about what happened.
Diagnostic (Purpose of Learning Analytics Type)
Analyzes past information to find out why something happened.
Predictive (Purpose of Learning Analytics Type)
Uses data from the past to predict what might happen in the future.
Prescriptive (Purpose of Learning Analytics Type)
Offers recommendations based on possible outcomes and helps identify the best options.
Intrinsic Load
The difficulty and detail of the concept itself (cannot be changed).
Extraneous Load
The amount of processing imposed by the lesson design (can be decreased).
Germane Load
The interest generated by the design of the lesson (can be increased).
Descriptive Goal
Report the number of students who passed the Chapter 2 test compared to the Chapter 1 test.
Descriptive Goal
Share a month-by-month breakdown of this past year's sales.
Diagnostic Goal
Discover why more students passed the Chapter 1 test than the Chapter 2 test.
Diagnostic Goal
Identify the cause of this year's decrease in the number of graduating students.
Predictive Goal
Predict the number of students who will pass the Chapter 3 test.
Prescriptive Goal
Plan how to achieve an 80% passing rate on chapter tests.
Prescriptive Goal
Determine how to increase the number of graduating students for next year.
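As a companion to the descriptive and diagnostic sketch earlier in this glossary, here is a minimal Python illustration of the predictive and prescriptive goals above, assuming an invented class size and pass counts; the linear-trend model is deliberately naive.

```python
# Minimal sketch: predictive and prescriptive goals on hypothetical pass counts.
# The class size and per-chapter pass counts are invented for illustration.
class_size = 30
passed = {"Chapter 1": 26, "Chapter 2": 22}

# Predictive: naive linear extrapolation of the pass count to Chapter 3.
counts = list(passed.values())
predicted_ch3 = counts[-1] + (counts[-1] - counts[-2])
print(f"Predicted Chapter 3 passes: {predicted_ch3} of {class_size}")

# Prescriptive: compare the projection to the 80% passing-rate goal and
# recommend an intervention sized to close the gap.
target = 0.8 * class_size
if predicted_ch3 < target:
    shortfall = round(target - predicted_ch3)
    print(f"Projected {shortfall} students short of the 80% goal; "
          "plan targeted review sessions before the Chapter 3 test.")
```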