A redesigned assessment for one of your existing assignments that tests genuine understanding (which AI can't fake), plus a student-facing AI use policy you can actually enforce.
Understanding the Detection Problem
Let's start with what's true: AI detection tools are unreliable. They regularly flag human writing as AI-generated and miss actual AI writing. Multiple studies and real-world reports confirm this. If your integrity strategy depends on "catching" students with a detector, you're building on sand.
The more productive question: what assessments actually require the skills you're trying to develop, and what assessments were already outsourceable (to tutors, Chegg, study groups) before AI existed?
High AI risk (easy to outsource):
→ Generic essays on broad topics
→ Summarize/explain assignments
→ Standard research papers
→ Fill-in-the-blank assessments
Low AI risk (requires authentic thinking):
→ Analyzing personal experiences
→ Responding to class-specific discussions
→ In-person presentations with Q&A
→ Iterative portfolio with revision history
→ Performance tasks with live demonstration
Redesigning Assessments for the AI Age
The most durable solution is assessments that can't be answered without genuine engagement. Use AI to help you redesign existing assessments.
I have this assessment:
[describe or paste your existing assessment]
Learning objectives it's supposed to test:
[list objectives]
Redesign this assessment so that:
1. It requires personal knowledge or class-specific context
2. It's harder to complete with AI alone
3. It still tests the same core learning objectives
4. It's practical for a class of [N] students
Give me 3 alternative versions and explain
why each is harder to outsource.
Student AI Policy: Clear, Enforceable, Honest
Students need clear guidance — not ambiguous blanket bans. A policy that says "no AI use ever" in a world where students use AI for everything else is both unenforceable and poor preparation for their careers.
The most effective policies define when and how AI use is appropriate, not just prohibit it.
Assignment Type 1: AI-Assisted
Students may use AI to brainstorm, draft, or edit.
Required: submit your AI prompts and the AI's original
responses alongside your final work. Explain what you changed and why.
Assignment Type 2: AI-Informed
Students may use AI for research and background.
Prohibited: using AI to write any submitted text.
Required: document AI sources like any other source.
Assignment Type 3: AI-Free
Completed without any AI assistance.
Typically: in-class work, final exams, personal narratives.
Use AI to draft this policy for your class level and subject area:
Draft a clear AI use policy for my class:
Grade: [grade]
Subject: [subject]
Age group: [age]
The policy should:
- Explain why we're addressing AI (not assuming it's bad)
- Define three tiers of AI use for different assignments
- Explain consequences for violations clearly
- Be written for students, not administrators
- Be under 400 words
What You Learned Today
- Why AI detection tools are unreliable and why 'catching' students is the wrong strategy
- How to identify which assessments are high vs. low AI risk
- How to redesign assessments to require authentic thinking that AI can't fake
- How to write a tiered AI use policy that's clear, enforceable, and honest with students
Go Further on Your Own
- Take your three most common assignment types. Use AI to redesign each one to reduce outsourceability without losing the learning objective
- Write a 'class AI conversation guide' — questions you can ask students to have a genuine discussion about when AI helps learning vs. hinders it
- Research your school or district's official AI policy. Compare it to what you just drafted. Where does it need to be updated?
Nice work. Keep going.
Day 4 is ready when you are.
Continue to Day 4
Want live instruction and hands-on projects? Join the AI bootcamp — 3 days, 5 cities.