Platform Quick Reference
| Platform | Who Can Create | Shareable? | Student Access | Best For |
|---|---|---|---|---|
| ChatGPT Custom GPTs | Plus/Team ($20/mo) | Yes (public link) | Free users can use | Distributable student tools |
| Claude Projects | Free or Pro | No (cannot share) | N/A | Personal teaching prep |
| Copilot Agents | Studio Lite (free) or Full | Tenant-dependent | Varies by institution | Privacy-sensitive, M365-integrated |
Free ChatGPT users can access and use Custom GPTs; they just can't create them. This makes GPTs viable for distributing to students.
The Universal Solution: Shareable Prompts
The most reliable way to give students access to a custom AI experience is a copy-paste prompt. This works with any free AI (ChatGPT, Claude, Copilot, Gemini) and requires no special accounts.
Create a well-crafted prompt document. Students paste it into any AI chat to activate the same behavior every time. Platform-agnostic, free-tier friendly, and you control the instructions.
Prompt Template Structure
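A skeleton we'd suggest (the bracketed pieces are placeholders to fill in for your course, and the section labels are illustrative, not required):

```text
ROLE: You are a patient [subject] tutor for [course name], a [level] course.

APPROACH: Use Socratic questioning: ask what the student has tried before
explaining. Break multi-step problems into smaller steps.

BOUNDARIES: Never write complete solutions to graded work. If asked,
work through a similar practice problem instead.

CONTEXT: Students have completed [prerequisites]. We use [textbook]
and [software]. Key terms: [definitions].

FORMAT: Keep responses short; ask one question at a time.
```

Students paste the whole thing as their first message, and the rest of the chat inherits the behavior.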
Distributing to Students
Use Google Docs, a PDF, or a page on your course site. Include clear instructions at the top.
"Copy everything below the line. Open any AI chat (ChatGPT, Claude, etc.). Paste and send. Start a new chat for each session."
Add the link to your course materials, assignment instructions, or announcements.
Platform How-To Guides
Create a GPT
Students on free ChatGPT accounts can use your GPT via the shared link. They can't create their own GPTs, but they can use yours.
Create a Project
Cannot be shared. Projects stay in your personal account. Best for your own teaching prep, grading assistance, or material development.
Student Workaround
Export your project's custom instructions as a shareable prompt document (see Universal Solution above).
Two Paths
Copilot Studio Lite (free, browser-based): Good for simple agents without complex data needs.
Full Copilot Studio: Dataverse integration, Power Automate, custom connectors. Requires IT involvement.
Create via Studio Lite
Sharing is tenant-restricted. You can only share with users in your Microsoft 365 tenant. Students on personal accounts or different tenants may not have access.
Writing Effective Instructions
Whether you're configuring a Custom GPT, Claude Project, or shareable prompt, the same principles apply:
1. Define the Role Clearly
Be specific about expertise, personality, and scope.
2. Specify Pedagogical Approach
Guide how the AI should teach.
- Socratic: Ask questions before providing explanations
- Scaffolding: Break complex tasks into smaller steps
- Metacognitive: Ask students to explain their thinking
- Growth mindset: Praise process, not just correctness
3. Set Clear Boundaries
Prevent the AI from doing students' work for them.
4. Add Domain Context
Include course-specific information:
- Textbook or material references
- Key terminology definitions
- Expected student knowledge level
- Preferred statistical software (R, SPSS, etc.)
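Filled in, the four principles above might combine into something like this (a hypothetical statistics-course example, not a prescription):

```text
You are a patient statistics tutor for an undergraduate course using R.
Teach Socratically: ask what the student has tried before explaining.
Scaffold analyses into steps: choose the test, check assumptions, run
the code, interpret the output.
Boundaries: do not write complete homework solutions; if asked, walk
through a parallel example with different data instead.
Context: students know descriptive statistics but are new to inference.
```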
For faculty interested in game-like learning experiences:
- XP/Leveling: Track progress across problems solved
- Titles: Earn names like "Statistics Apprentice" → "Data Detective"
- Hint economy: Students "spend" points for hints, encouraging independent attempts first
- Confidence ratings: Ask students to rate their confidence before revealing if they're correct
Note: These require more complex prompts and work best with Custom GPTs where you can maintain state within a conversation.
Privacy & FERPA
Never upload identifiable student information to consumer AI tools. Strip names, student IDs, and personal details before using AI for any grading or feedback assistance.
Platform Privacy Comparison
| Consideration | ChatGPT | Claude | Copilot (Institutional) |
|---|---|---|---|
| Data used for training? | Opt-out available | Opt-out available | No (enterprise) |
| Institutional DPA? | ChatGPT Team/Enterprise | Claude for Work | Yes, via M365 |
| Best for student data? | Avoid | Avoid | Preferred |
Safe Practices
- Use institutional Copilot for anything involving student work
- De-identify all content before using consumer AI tools
- Inform students when AI tools are part of course activities
- Check your institution's AI policy; many have specific guidance
Copilot Agents for Staff Productivity
Microsoft Copilot Agents can automate routine tasks, answer common questions, and streamline workflows. This section covers what staff can build and the two main paths to creating agents.
What Staff Can Build
- Meeting preparation: Pulls relevant documents, past meeting notes, and action items before meetings; summarizes context from emails and shared files.
- Policy Q&A: Answers questions about institutional policies, HR procedures, or department guidelines using uploaded policy documents.
- Onboarding guide: Walks new employees through forms, systems, and procedures; points to relevant resources and contacts.
- Department FAQ: Answers common questions about your department's services, reducing repetitive email inquiries.
Copilot Studio Lite (The Accessible Path)
Best for staff who want to create simple agents without developer support or complex setup. It's browser-based, no-code, and quick to deploy.
Getting Started
1. Go to copilotstudio.microsoft.com and sign in with your institutional Microsoft account.
2. Click "Create" and describe what you want your agent to do. The AI will generate a starting configuration.
3. Upload documents, link SharePoint sites, or connect to public websites your agent should reference.
4. Define specific question-answer pairs or conversation flows for common queries.
5. Use the built-in test chat, then publish to Teams, a website, or other channels.
Studio Lite Limitations
- No Dataverse integration (can't store/query structured data)
- Limited Power Automate connections
- Fewer customization options than full Studio
- Sharing limited to your M365 tenant
Full Copilot Studio (IT-Supported Projects)
For more complex agents that need to read/write data, integrate with other systems, or require advanced logic.
When to Escalate to IT
You Need Dataverse
Storing and querying structured records: employee directories, equipment inventories, request tracking.
You Need Automation
Power Automate flows that create tickets, send emails, update spreadsheets, or trigger other systems.
You Need External Data
Custom connectors to APIs, databases, or third-party services outside M365.
You Need Broader Sharing
Agents accessible to external users, multiple tenants, or embedded in public-facing sites.
Understanding Dataverse
Dataverse is Microsoft's database platform within Power Platform. Think of it as structured storage for your agent:
- Tables: Like spreadsheet tabs (Employees, Requests, Equipment, etc.)
- Columns: Fields within tables (Name, Status, Date, etc.)
- Relationships: Connect tables, e.g. link Requests to the Employees who submitted them
- Security: Control who can read/write which records
User asks "What's the status of my ticket?" → Agent queries Dataverse for tickets matching the user's email → Returns status and assigned technician → Optionally triggers Power Automate to send a notification.
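The query step in that flow is just a filtered lookup. As an illustration of the logic only (an in-memory Python stand-in with made-up records; a real agent queries Dataverse through Copilot Studio or Power Automate, not Python):

```python
# Hypothetical stand-in for a Dataverse "Requests" table.
TICKETS = [
    {"id": "REQ-001", "email": "pat@example.edu", "status": "In progress", "technician": "J. Rivera"},
    {"id": "REQ-002", "email": "sam@example.edu", "status": "Closed", "technician": "A. Chen"},
]

def ticket_status(user_email: str) -> list[str]:
    """Return a status line for every ticket matching the requesting user's email."""
    return [
        f"{t['id']}: {t['status']} (assigned to {t['technician']})"
        for t in TICKETS
        if t["email"] == user_email
    ]

print(ticket_status("pat@example.edu"))
# → ['REQ-001: In progress (assigned to J. Rivera)']
```

In Dataverse terms, the `email` match is the relationship lookup and the returned fields are columns on the Requests table.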
Working with IT
When requesting support for a full Copilot Studio project:
- Define the use case: What problem does this solve? Who uses it?
- Document data needs: What information does the agent need to access or store?
- Identify integrations: What other systems should it connect to?
- Clarify ownership: Who maintains the agent after it's built?
Sharing Agents in Your Organization
Tenant Basics
Your "tenant" is your organization's Microsoft 365 environment. By default, Copilot Agents can only be shared within your tenant.
Common Sharing Scenarios
- Same department: Usually works seamlessly
- Cross-department: Works if same tenant, but permissions may vary
- Partner institutions: Requires guest access configuration
- Students on personal accounts: Often doesn't work (different tenant)
- Public access: Possible but requires specific setup and approvals
If Users Can't Access Your Agent
- Account type: Are they signed in with an institutional account or a personal Microsoft account?
- Licensing: Some agent features require specific M365 licenses; check with IT.
- Publishing scope: In Copilot Studio, check who the agent is published to.
- Tenant settings: IT may need to enable cross-tenant sharing or guest access.
Details About Our Bots
This section describes how we actually use custom chatbots in our courses. These aren't prescriptions โ just ideas that might spark your own approaches.
Faculty often ask "but what do you actually put in these things?" Here's what we do. Steal liberally, adapt freely, ignore what doesn't fit your teaching style.
What We Upload (Knowledge Sources)
- Syllabus: Gives the bot awareness of prerequisites, grading policies, schedule, and course structure. Students can ask "when is the exam?" and get accurate answers.
- Lecture slides and notes: Define what's actually testable content vs. enrichment. The bot knows what we emphasized, not just what's in the textbook.
- Textbook details: We specify the edition and chapters covered. The bot treats textbook content as supplementary; it enriches but doesn't override our materials.
- Solution keys: For statistics, practice problem solutions with acceptable ranges. The bot can grade student work without requiring exact matches.
The Knowledge Hierarchy
We explicitly tell the bot how to prioritize sources.
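Our exact wording varies by course, but the hierarchy instruction looks roughly like this (paraphrased):

```text
When answering, prioritize sources in this order:
1. Lecture slides and instructor notes (authoritative for what's testable)
2. The course syllabus (schedule, policies, logistics)
3. The assigned textbook (supplementary; enriches but never overrides 1-2)
4. Your general knowledge (label it as "beyond the course materials")
If sources conflict, the instructor's materials win.
```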
How Our Bots Teach
Zone of Proximal Development
We don't ask students to rate their confidence constantly (they find it tedious). Instead, the bot infers difficulty level from behavioral signals.
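For example, instructions along these lines (paraphrased, not our verbatim prompt):

```text
Calibrate difficulty from behavior, not self-report:
- Repeated errors on the same step: drop down a level; add scaffolding.
- Fast, correct answers: raise difficulty; ask extension questions.
- Several hint requests in a row: pause and re-teach the concept.
```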
Efficiency Over Chattiness
Students report that bots can feel time-consuming, so we explicitly tell the bot to be concise.
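A paraphrase of the conciseness rules we include:

```text
Default to short answers: 2-4 sentences for factual questions.
During tutoring, ask one question at a time; don't lecture.
Expand only when the student asks for more depth.
```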
Academic Integrity Boundaries
The bot distinguishes between learning support and doing homework for students.
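The boundary section of the prompt reads roughly like this (paraphrased):

```text
You may: explain concepts, debug reasoning, generate practice problems,
and review work the student has already attempted.
You may not: write essay text, produce complete homework answers, or
solve graded problems from scratch. If asked, offer a parallel practice
problem and coach the student through it instead.
```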
Statistics Practice Bot (Example)
This bot helps students work through 50 practice problems (t-tests, ANOVA, correlation, regression, non-parametrics) using R's built-in datasets.
How It Works
- Problem presentation: Research question + dataset name + variables (students use `str(data)` to explore)
- Guided discovery: Asks diagnostic questions before confirming test choice
- Code debugging: Identifies errors without rewriting everything for them
- Flexible grading: Accepts values within rounding tolerance, doesn't require strict APA format
- Solutions on request: These are practice problems, so students can see complete solutions if stuck
Grading with Acceptable Ranges
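The idea is simple: the solution key stores a target value plus a tolerance, and any student answer inside that window counts as correct. A minimal sketch of the check (the numbers here are invented; the real tolerances live in the uploaded solution key):

```python
def within_tolerance(student_answer: float, solution: float, tol: float = 0.01) -> bool:
    """Accept a numeric answer if it falls within a rounding tolerance of the key."""
    return abs(student_answer - solution) <= tol

# Hypothetical t-test result: the solution key says t = 2.447.
print(within_tolerance(2.45, 2.447))   # accepted: student rounded to two decimals
print(within_tolerance(2.36, 2.447))   # rejected: outside the tolerance window
```

The bot applies the same logic in prose: "2.45 is close enough to 2.447; your rounding is fine."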
Course Content Bots (Example)
For our content courses (Neuroscience, Perception, etc.), the bot handles three types of interactions:
- Quick answers: "What's the difference between rods and cones?" → Concise, targeted answer from my materials.
- Tutoring: Student seems confused → Bot asks probing questions, breaks concepts down, connects to prior knowledge.
- Self-testing: "Quiz me on Chapter 5" → Generates practice questions at appropriate difficulty, gives metacognitive feedback.
Source Attribution
The bot tells students where information comes from:
- "From your lecture slides..." = testable, emphasized by me
- "The textbook also mentions..." = enrichment, may or may not be tested
- "In the broader field..." = general knowledge, definitely not tested
What We Don't Use Bots For
- Grading real student work: Privacy concerns; we don't upload identifiable student work to consumer AI tools.
- Student records: Names, IDs, and personal situations never go into these tools.
- Advising and support: Complex issues, emotional support, and nuanced advising are human work.
Learning Science Principles Baked In
These aren't just good ideas; they're explicitly in our bot instructions:
- Desirable difficulty: Practice questions are ~20% harder than assessment questions. Struggle improves learning.
- Interleaving: Quiz practice includes 10-15% questions from prior chapters, not just current material.
- Elaborative feedback: Feedback explains why answers are right/wrong, not just correctness. Guides thinking.
- Deep processing: Questions ask "why" and "how," not just "what." Connects to real-world examples students recognize.
For courses where it fits the vibe (like our intro courses), we sometimes add game-like elements. This isn't for everyone, but students who engage with it really engage.
Hint Economy
Students can request hints, but hints have a "cost" that encourages independent attempts first.
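The exact economy doesn't matter much; a sketch of the rule set we give the bot (point values are illustrative):

```text
Each problem starts with 3 hint points.
- Conceptual nudge ("what does this test assume?"): costs 1 point.
- Specific hint ("check the variances first"): costs 2 points.
- Worked step: costs all remaining points.
Always tell the student how many points remain before revealing a hint.
```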
Other Elements We've Tried
- XP/Leveling: Track cumulative problems solved across sessions
- Titles: "Statistics Apprentice" → "Data Detective" → "Analysis Wizard"
- Confidence bets: Students predict if they're right before seeing feedback
- Streak bonuses: Consecutive correct answers unlock fun facts or "Easter eggs"
Gamification adds complexity and can feel forced if it doesn't match your teaching style. Start simple. Add elements only if students respond well.
Adapt, Don't Copy
Everything here reflects our teaching styles, our courses, and our students. Your context is different. Take what resonates, ignore what doesn't, and build something that works for you.
The best bot is one that feels like a natural extension of how you already teach, not a replacement for your judgment or your relationship with students.
If you're at our institution and want to see these in action or talk through your own bot ideas, find us at AI Office Hours or just email. Happy to share prompt files directly.