OECD AILit — Creating with AI, Competency 4

Ethics in AI Creation

With Great Power Comes Responsibility

AI makes content creation easier than ever — but that power comes with ethical questions. When you use AI to create, who owns the output? Should you disclose AI involvement? Is it acceptable to submit AI-generated work as your own? These aren't just theoretical questions — they shape real decisions every day.

Ethics Judge

Read each scenario and decide: is this ethical, unethical, or does it depend on context?

Scenario 1/5
A student uses ChatGPT to generate an entire essay and submits it as their own work without telling the teacher.

Ethical Guidelines for AI Creation

Transparency — Disclose when AI was used in creating content, especially in professional or academic settings.

Attribution — Don’t claim AI-generated content as entirely your own original work.

Verification — Always fact-check AI outputs before publishing. You’re responsible for what you share.

Fair use — Be mindful that AI models were trained on others’ work. Respect original creators.

Purpose — Use AI to augment your abilities, not to deceive others.

When NOT to Use AI

Don't use AI for: academic submissions without permission, impersonating others, creating misleading content, generating harmful material, or bypassing systems designed to verify human authorship.

Check Your Understanding

1. Is it ethical to use AI to help brainstorm ideas for your own writing?
2. Should you disclose AI use in a professional report?
3. A student submits an AI-written essay as their own. This is:
4. What is the creator’s responsibility when publishing AI-generated content?

Answer all questions. You need 70% to pass.