Use AI as a support tool, not as a substitute for your own thinking, learning, and authorship.
What does this presentation cover?
Honesty, judgment, and accountability.
Why ethics and GenAI belong in a CS course
Technical skill matters — but so do judgment, integrity, privacy, and responsibility.
Honesty, Fairness, Privacy, Accountability
Honesty: Say what work is yours, what help you used, and what you actually understand.
Fairness: Do not gain an unfair advantage over classmates by outsourcing the learning.
Privacy: Protect personal data, course materials, proprietary code, and other sensitive information.
Accountability: You are responsible for every line, claim, citation, and submission with your name on it.
In practice, this means:
✓ Try the problem yourself before asking AI.
✓ Use AI to support learning, not replace it.
✓ Check outputs for errors and bias.
✓ Disclose or cite AI use when required.
✓ Ask if a use case feels unclear.
Ethical computing is not an extra topic; it is part of being a competent computer scientist.
What GenAI can help with
Allowed uses should make you a better learner, writer, programmer, and debugger — not replace your role.
✓ Brainstorming ideas or test cases before you start coding.
✓ Explaining an error message, algorithm, or concept in simpler language.
✓ Generating practice problems, flashcards, or study guides for review.
✓ Improving readability: naming, comments, formatting, or documentation.
✓ Comparing approaches after you have attempted the work yourself.
DO use AI to support understanding — and be ready to explain every part of your final work.
What GenAI must not do for you
If the tool replaces your authorship, hides misconduct, or creates an unfair advantage, it crosses the line.
! Submitting AI-generated code, prose, or answers as if they were entirely your own.
! Using AI during quizzes, exams, or restricted assessments unless explicitly permitted.
! Pasting in private data, another student's work, or proprietary / licensed course material.
! Fabricating citations, results, sources, experiments, or references.
! Hiding your AI use when the assignment or instructor requires disclosure.
Rule of thumb: if the AI is doing the learning, reasoning, or authorship for you, do not submit it.
Use this workflow every time you use AI:
Ask, Think, Check, Cite, Own It.
A simple decision process can keep your work ethical, accurate, and course-compliant.
Best practice: keep a short record of your AI use, including the prompts, what you accepted, and what you changed.
Traffic-light examples
Green
"Give me five extra test cases for my sorting program."
AI supports practice and debugging; you still design, code, and evaluate the solution.
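The kind of extra test cases such a prompt might produce can be sketched as follows; the function name `my_sort` and the specific cases are assumptions for illustration, not part of any actual assignment.

```python
# Hypothetical AI-suggested test cases for a sorting program.
# my_sort is a stand-in name for the student's own function;
# here it delegates to sorted() so the sketch is runnable.
def my_sort(xs):
    return sorted(xs)

cases = [
    ([], []),                    # empty input
    ([7], [7]),                  # single element
    ([3, 1, 2], [1, 2, 3]),      # general unsorted case
    ([2, 2, 1], [1, 2, 2]),      # duplicates
    ([-4, 0, -9], [-9, -4, 0]),  # negative numbers
]
for given, expected in cases:
    assert my_sort(given) == expected
```

The student still designs the algorithm and decides whether each suggested case is meaningful; the AI only widens the net of inputs being checked.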
Yellow
"Rewrite my explanation so it sounds more polished."
This may be okay, but only if the ideas are yours and the assignment allows editing help.
Red
"Solve the assignment and give me the final code to submit."
Not okay. The tool is replacing your authorship and the learning the task is meant to assess.
Red
"Here is another student's code. Improve it for me."
Not okay. This risks privacy, plagiarism, and misuse of someone else's work.
Bottom line for students
If a use is not clearly allowed, do not guess — ask first.
Your personal checklist before submitting:
✓ I understand every part of the work.
✓ I tested the output and corrected errors.
✓ I did not upload private or restricted material.
✓ I disclosed AI help if the course requires it.
✓ The final submission is genuinely mine.
Remember
- AI can be a useful tutor, editor, explainer, or brainstorming partner.
- It cannot become the real author of your coursework.
- When in doubt, ask your instructor before you submit.
What is one principle of the ACM Code of Ethics for Computer Scientists?
Contribute to society and human well-being.
What should computer scientists avoid when designing computing systems?
Harm.
What is expected of computer scientists in their professional work?
To be honest and trustworthy.
What should computer scientists respect according to the ACM Code of Ethics?
Privacy, intellectual property, and confidentiality.
What responsibility do computer scientists have regarding computing systems?
For the quality and impact of computing systems.
What can generative AI produce?
Incorrect or fabricated information.
What may AI-generated code include?
Non-existent libraries or incorrect APIs.
What can hallucinated code introduce?
Bugs or security vulnerabilities.
What must developers do with AI-generated outputs?
Review, test, and validate them.
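One lightweight validation step can be sketched as follows: before running an AI-generated snippet, confirm that every module it imports actually exists, since hallucinated libraries are a common failure mode. The module names here are illustrative assumptions, with "supermath" standing in for an invented library.

```python
import importlib.util

def modules_exist(names):
    """Map each module name to whether Python can actually find it."""
    return {name: importlib.util.find_spec(name) is not None
            for name in names}

# "json" ships with Python; "supermath" is the kind of plausible-sounding
# name a hallucinating model might invent.
report = modules_exist(["json", "supermath"])
```

Checking imports is only a first pass; a real review still means reading the code, running its tests, and verifying each call against the library's documented API.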
AI Risk Stack
Human oversight, privacy risks, security vulnerabilities, hallucinations, bias & fairness
AI Hallucination Flow
Student prompt → AI model generates code → incorrect/invented API → student does not verify → bug or vulnerability
Discussion: Ethical AI Use in Programming
• Should AI-generated code require disclosure in coursework?
• How can developers verify AI-generated solutions?
• What risks occur when students rely too heavily on AI?
• How should industry regulate AI-assisted software development?