D293 Glossary Assessment and Learning Analytics


Flashcards from D293 Glossary Assessment and Learning Analytics


80 Terms

1

Ipsative Assessment

Measures student learning against past knowledge or performance. It can help motivate learners by showing the distance progressed, even if the learner's performance is not yet at the level of an expert or some classmates.

2

Formative Assessment

Used to monitor learning during instruction, allowing content complexity and teaching to be adjusted along the way.

3

Summative Assessment

Used to determine learning at the end of instruction by comparing results against the objectives.

4

Diagnostic Assessment

Designed to test a learner’s knowledge BEFORE they begin an activity or lesson; often referred to as pre-assessment.

5

Direct Assessment

Used to evaluate a learner's understanding of a concept, achievement of a learning objective, or completion of a goal through direct evaluation of the learner's work.

6

Indirect Assessment

Does not look at learners' actual work but instead uses information gathered from other sources, such as attendance or time on task.

7

Competency-based Assessment

Focuses on skills more than knowledge; often an example of authentic assessment centering on learners applying skills and knowledge.

8

Comprehensive Assessment

Provides various ways for the instructor to monitor a learner's academic achievement and progress; includes benchmark, formative, summative, and diagnostic assessments.

9

Authentic Assessment

Involves the application of knowledge and skills in real-world situations, scenarios, or problems, creating a student-centered learning experience.

10

Criterion-referenced Assessment

Measures student learning based on a concrete learning standard, objective, or outcome.

11

Norm-referenced Assessment

Uses assessment score results to create a comparative score of how learners did relative to the scores of other learners (e.g., grading on a curve).
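Grading on a curve can be sketched with standard scores (z-scores), which express each learner's result relative to the group rather than an absolute standard. This is a minimal illustration with made-up scores, not a method prescribed by the glossary:

```python
from statistics import mean, pstdev

def curve_scores(raw_scores):
    """Convert raw scores to z-scores: each result is measured
    against the group mean, the essence of norm-referencing."""
    mu = mean(raw_scores)
    sigma = pstdev(raw_scores)
    return [round((s - mu) / sigma, 2) for s in raw_scores]

# Hypothetical class scores (illustrative only).
print(curve_scores([62, 70, 75, 75, 88, 95]))
```

A learner with a z-score of 0 sat exactly at the class average; positive scores are above it, negative below, regardless of the raw number of points earned.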

12

Reflection-focused Assessment

Allows the learner to look back and reflect on the learning experiences, promoting self-regulation of learning.

13

Project-based Assessment

A concrete way to assess the learner other than a test, often involving creative outlets; an example is asking students to create a website.

14

Psychomotor Domain

Focuses on physical skills; includes perception, set, guided response, mechanism, complex overt response, adaptation, and origination.

15

Affective Domain

Includes five areas of emotional response, categorized as simple to complex ways of processing feelings and attitude: receiving, responding, valuing, organizing, and characterizing.

16

Cognitive Domain

Develops six areas of intellectual skills that build sequentially from simple to complex behaviors: remembering, understanding, applying, analyzing, evaluating, and creating.

17

Descriptive Data Analytics

Can tell us “what happened”; the most common type of data used in schools, such as test scores, attendance records, and feedback surveys.
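In practice, descriptive analytics amounts to summary statistics over data the school already has. A minimal sketch, using hypothetical test scores:

```python
from statistics import mean, median

# Hypothetical class test scores (illustrative data only).
scores = [55, 61, 72, 72, 80, 85, 91]

# Descriptive analytics answers "what happened?" with simple summaries.
summary = {
    "count": len(scores),
    "mean": round(mean(scores), 1),
    "median": median(scores),
    "pass_rate": sum(s >= 65 for s in scores) / len(scores),  # assumed 65% cutoff
}
print(summary)
```

Note that these numbers describe outcomes only; asking *why* the pass rate looks this way is the job of diagnostic analytics.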

18

Diagnostic Data Analytics

Examines data or content to answer the question, “Why did it happen?”; characterized by techniques such as drill-down, data discovery, and data mining.

19

Predictive Data Analytics

Offers insights into “what is likely to happen”; uses the results of descriptive and diagnostic analytics to predict future trends.

20

Prescriptive Data Analytics

Uses the results of the other analytics types to determine “What should be done?”; relies on both the quality and the accuracy of those analytics to support well-informed decisions.

21

Quantitative Analysis

Based on objective, numerical data and statistics; often used in descriptive analytics to determine what happened.

22

Qualitative Analysis

Based on non-numerical information, such as observations, reflections, and interviews; often used in diagnostic analytics to determine why something happened.

23

Social Network Analysis

The study of patterns or trends in relationships among groups of learners or between learners and instructors; used in predictive analytics to predict future interactions.
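One simple social-network measure is degree: how many interactions each learner takes part in. A minimal sketch over hypothetical discussion-board replies (names and data invented for illustration):

```python
from collections import Counter

# Hypothetical reply log: (replying_learner, learner_replied_to).
replies = [
    ("ana", "ben"), ("ben", "ana"), ("cam", "ana"),
    ("dev", "ana"), ("cam", "ben"),
]

# Degree: how many reply interactions each learner takes part in,
# counting both replies sent and replies received.
degree = Counter()
for sender, receiver in replies:
    degree[sender] += 1
    degree[receiver] += 1

# High-degree learners are likely hubs; low-degree learners may be isolated.
print(degree.most_common())
```

Patterns like a single learner dominating the degree counts, or a learner with degree near zero, are the kind of trend this analysis surfaces for predictive use.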

24

Data for Research

Used to gather new data and test new theories, collecting more data than necessary.

25

Data for Accountability

Used to evaluate, rate, or rank performance, collecting all recent and relevant data available.

26

Data for Improvement

Used to observe student performance to answer questions about the effectiveness of instruction, collecting “just enough” data.

27

Task-level Feedback

Tells how well the learner has performed a specific task; effective when distinguishing correct from incorrect answers but not effective over time.

28

Process-level Feedback

Specific to the processes used for tasks; challenges the student to form a deeper understanding of the learning and encourages them to construct meaning on their own.

29

Regulatory-level Feedback

Addresses the way that the learner examines and adjusts their actions toward their learning goal; encourages the learner to self-assess to a more critical degree.

30

Self-level Feedback

Focuses more on the person than on the work and is the least effective; typically builds a student up but gives them little information for improvement.

31

Actionable Feedback

Should help the individual understand what they need to do differently and how to do it; provides precise details about what was done well or what needs improvement.

32

Mastery-oriented Feedback (UDL)

Should “guide learners toward mastery rather than a fixed notion of performance or compliance; towards successful long-term habits and learning practices”.

33

Multiple Means of Representation (UDL)

Offering information in more than one format (e.g., text, audio, video, hands-on learning).

34

Multiple Means of Action and Expression (UDL)

Providing learners with more than one way to interact with the material and to show what they’ve learned (e.g., a pencil-and-paper test, an oral presentation, a group project).

35

Multiple Means of Engagement (UDL)

Looking for multiple ways to motivate students (e.g., letting students make choices, giving them assignments that feel relevant).

36

Perceivable (POUR)

Means the user can identify content and interface elements by means of the senses; perceiving through sight, sound, touch, smell, or taste.

37

Operable (POUR)

Means that a user can successfully use controls, buttons, navigation, and other necessary interactive elements (e.g., clicking, tapping, swiping, using keyboard or voice commands).

38

Understandable (POUR)

Technology that is consistent in its presentation and format, predictable in its design and usage patterns, concise, multimodal, and appropriate to the audience in its voice and tone.

39

Robust (POUR)

IT that is standards-compliant and designed to function on all appropriate technologies, allowing users to choose the technology they use to interact with information.

40

Construct-validity Bias

Refers to whether a test accurately measures what it was designed to measure; can be affected by language skills rather than academic abilities.

41

Content-validity Bias

Occurs when the content of a test is comparatively more difficult for one group of students than for others due to unequal opportunity to learn or unfair scoring.

42

Predictive-validity Bias

Refers to a test’s accuracy in predicting how well a certain student group will perform in the future; an unbiased test predicts future performance equally well for all groups.

43

Intrinsic Load

Refers to the complexity of what you are learning, including the amount of new information and how it all interacts; an essential part of the learning task over which we don’t have control.

44

Germane Load

Refers to the effort needed to use memory and intelligence to process information into schemas; how we process new information into long-term memory.

45

Extraneous Load

Results when a learning experience is unnecessarily difficult or confusing, using up cognitive resources that learners could direct at the learning task; a result of poor learning design.

46

Data-ownership and Control

Institutions should be aware of issues around third-party sharing, especially since sharing might include student data; involves the question of who owns the data.

47

Transparency

Institutional transparency might best begin by making clear to students and other stakeholders the purpose of learning analytics; relates primarily to how student data is collected, analyzed, and used.

48

Consent

Consent to collect student data should be sought at the point of registration, with transparency about its use and, ideally, a later option to withdraw consent.

49

Valid and Reliable Data

The institution needs to ensure that data collected and analyzed is accurate and representative of the issue being measured; the results should be transparent and clearly understood.

50

Audio Feedback

Feedback that can be recorded in many learning management systems and left directly in assignments for learners to review; needs to be recorded at an appropriate pace and feature clear pronunciation.

51

Discussion Feedback

Instructors can offer feedback to the class as a whole or to individual students; allows instructors to encourage students to engage more deeply with the material and with each other.

52

Email Feedback

Written feedback that can be delivered to students via email, allowing them to access this feedback outside of their time spent in the university’s system.

53

Peer-feedback

Can occur actively in discussions, through content sharing aspects of learning management systems, and when group projects are assigned; students evaluate their peers' work.

54

Screen-sharing Feedback

Instructors can offer synchronous or asynchronous feedback while sharing screens with students; the recording needs to be high quality.

55

Remember (Bloom's Taxonomy)

Retrieve relevant knowledge from long-term memory.

56

Understand (Bloom's Taxonomy)

Construct meaning from instructional messages, including oral, written and graphic communication.

57

Apply (Bloom's Taxonomy)

Carry out or use a procedure in a given situation.

58

Analyze (Bloom's Taxonomy)

Break material into foundational parts and determine how parts relate to one another and the overall structure or purpose.

59

Evaluate (Bloom's Taxonomy)

Make judgments based on criteria and standards.

60

Create (Bloom's Taxonomy)

Put elements together to form a coherent whole; reorganize into a new pattern or structure.

61

Criterion-referenced (Assessment Strategy)

Assessment to see if a learner has met predetermined milestones and requirements.

62

Ipsative (Assessment Strategy)

Assessment to see if a learner has improved based on previous knowledge.

63

Norm-referenced (Assessment Strategy)

Assessment to see how a learner's work compares to the average work completed by a similar group of learners.

64

Standards-based (Assessment Strategy)

Assessment to see if a learner can meet requirements and have a mastery of knowledge based on a predetermined standard.

65

Traditional (Assessment Strategy)

Assessment to see if a learner can meet the requirements based on memorization of data and facts.

66

Descriptive (Purpose of Learning Analytics Type)

Used to inform; based on data from gathered information.

67

Diagnostic (Purpose of Learning Analytics Type)

Analyzes past information to find out why something happened.

68

Predictive (Purpose of Learning Analytics Type)

Use data from the past to predict what might happen in the future.

69

Prescriptive (Purpose of Learning Analytics Type)

Offers recommendations based on possible outcomes and helps identify the best options.

70

Intrinsic Load

Difficulty and details of concept (cannot change).

71

Extraneous Load

Amount of processing imposed by lesson design (can decrease).

72

Germane Load

Interest generated by design of lesson (can increase).

73

Descriptive Goal

Report the number of students who passed the Chapter 2 test compared to the Chapter 1 test.

74

Descriptive Goal

Share a month-by-month breakdown of this past year's sales.

75

Diagnostic Goal

Discover why more students passed the Chapter 1 test than the Chapter 2 test.

76

Diagnostic Goal

Identify the cause of this year's decrease in the number of graduating students.

77

Predictive Goal

Predict the number of students who will pass the Chapter 3 test.

78

Prescriptive Goal

Plan how to achieve an 80% passing rate on chapter tests.

79

Prescriptive Goal

Determine how to increase the number of graduating students for next year.

80

Goal

Lessons and lectures.