Custom AI Chatbots for Higher Ed

A practical guide to creating AI assistants with ChatGPT, Claude, and Microsoft Copilot, with separate sections for faculty (educational chatbots) and staff (productivity agents), plus real-world examples.

โš–๏ธ Platform Quick Reference

Platform | Who Can Create | Shareable? | Student Access | Best For
ChatGPT Custom GPTs | Plus/Team ($20/mo) | ✓ Public link | ✓ Free users can use | Distributable student tools
Claude Projects | Free or Pro | ✗ Cannot share | N/A | Personal teaching prep
Copilot Agents | Studio Lite (free) or Full | ⚠ Tenant-dependent | Varies by institution | Privacy-sensitive, M365-integrated
Key Correction

Free ChatGPT users can access and use Custom GPTs — they just can't create them. This makes GPTs viable for distributing to students.

📋 The Universal Solution: Shareable Prompts

The most reliable way to give students access to a custom AI experience is a copy-paste prompt. This works with any free AI (ChatGPT, Claude, Copilot, Gemini) and requires no special accounts.

Recommended Approach

Create a well-crafted prompt document. Students paste it into any AI chat to activate the same behavior every time. Platform-agnostic, free-tier friendly, and you control the instructions.

Prompt Template Structure

# [Your Chatbot Name]

## Role
You are [role description]. You help students with [subject/task].

## Behavior
- Use Socratic questioning: ask guiding questions before giving answers
- Break complex problems into steps
- When students make errors, identify the type of error and ask them to reconsider
- Celebrate effort and progress, not just correct answers

## Boundaries
- Never provide complete solutions to homework problems
- If asked to "just give the answer," redirect to guided problem-solving
- Do not do the student's work for them

## Subject Knowledge
[Add course-specific context, terminology, or reference materials here]

## First Message
Begin by introducing yourself and asking what topic the student wants to work on.

Distributing to Students

Create your prompt document

Use Google Docs, a PDF, or a page on your course site. Include clear instructions at the top.

Add student instructions

"Copy everything below the line. Open any AI chat (ChatGPT, Claude, etc.). Paste and send. Start a new chat for each session."

Link from your LMS

Add the link to your course materials, assignment instructions, or announcements.

๐Ÿ› ๏ธ Platform How-To Guides

ChatGPT Custom GPTs
Plus Required to Create · Shareable · Best for Students

Create a GPT

Go to chatgpt.com → Explore GPTs → Create
Configure name, description, and instructions
Upload knowledge files (syllabi, guides, datasets)
Publish → "Anyone with a link" → Copy link

Students on free ChatGPT accounts can use your GPT via the shared link. They can't create their own GPTs, but they can use yours.

Claude Projects
Free Tier Available · Cannot Share

Create a Project

Go to claude.ai → Projects (sidebar) → New Project
Add custom instructions (persistent system prompt)
Upload knowledge files (100+ documents)
Use the project — but only you can access it

Cannot be shared. Projects stay in your personal account. Best for your own teaching prep, grading assistance, or material development.

Student Workaround

Export your project's custom instructions as a shareable prompt document (see Universal Solution above).

Microsoft Copilot Agents
Licensing Varies · Org Sharing

Two Paths

Copilot Studio Lite (free, browser-based): Good for simple agents without complex data needs.

Full Copilot Studio: Dataverse integration, Power Automate, custom connectors. Requires IT involvement.

Create via Studio Lite

Go to copilotstudio.microsoft.com
Sign in with your institutional Microsoft account
Create new agent → describe its purpose
Configure topics, knowledge sources, and behavior

Sharing is tenant-restricted. You can only share with users in your Microsoft 365 tenant. Students on personal accounts or different tenants may not have access.

โœ๏ธ Writing Effective Instructions

Whether you're configuring a Custom GPT, Claude Project, or shareable prompt, the same principles apply:

1. Define the Role Clearly

Be specific about expertise, personality, and scope.

Vague: "You help with statistics."

Better: "You are a statistics tutor specializing in behavioral science research methods. You help undergraduate students understand hypothesis testing, correlation, and regression using real-world psychology examples."

2. Specify Pedagogical Approach

Guide how the AI should teach.

  • Socratic: Ask questions before providing explanations
  • Scaffolding: Break complex tasks into smaller steps
  • Metacognitive: Ask students to explain their thinking
  • Growth mindset: Praise process, not just correctness

3. Set Clear Boundaries

Prevent the AI from doing students' work for them.

"Never provide complete solutions. When a student asks for an answer, first ask them to show their work or explain their current understanding. Guide them toward the answer through questions."

4. Add Domain Context

Include course-specific information:

  • Textbook or material references
  • Key terminology definitions
  • Expected student knowledge level
  • Preferred statistical software (R, SPSS, etc.)
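Filled in for the statistics-tutor example above, the domain-context section of a prompt might look like this (the course details below are invented illustrations, not a recommended syllabus):

```
## Subject Knowledge
- Course: introductory statistics for behavioral science
- Textbook: [title, edition, and chapters covered]
- Students already know: means, standard deviations, z-scores
- Not yet covered: factorial ANOVA, multiple regression
- Software: R (base functions preferred)
```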

Advanced: Gamification Elements (Optional)

For faculty interested in game-like learning experiences:

  • XP/Leveling: Track progress across problems solved
  • Titles: Earn names like "Statistics Apprentice" → "Data Detective"
  • Hint economy: Students "spend" points for hints, encouraging independent attempts first
  • Confidence ratings: Ask students to rate their confidence before revealing if they're correct

Note: These require more complex prompts and work best with Custom GPTs where you can maintain state within a conversation.

🔒 Privacy & FERPA

Core Principle

Never upload identifiable student information to consumer AI tools. Strip names, student IDs, and personal details before using AI for any grading or feedback assistance.

Platform Privacy Comparison

Consideration | ChatGPT | Claude | Copilot (Institutional)
Data used for training? | Opt-out available | Opt-out available | No (enterprise)
Institutional DPA? | ChatGPT Team/Enterprise | Claude for Work | Yes, via M365
Best for student data? | Avoid | Avoid | Preferred

Safe Practices

  • Use institutional Copilot for anything involving student work
  • De-identify all content before using consumer AI tools
  • Inform students when AI tools are part of course activities
  • Check your institution's AI policy — many have specific guidance
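A quick scripted pass can help with de-identification before anything is pasted into a consumer tool. A minimal Python sketch — the ID pattern and the roster-name approach are assumptions; adjust both to your institution's formats, and treat this as a first filter, not a guarantee:

```python
import re

def deidentify(text, known_names=()):
    """Redact common identifiers before sending text to an AI tool.

    known_names: names you supply (e.g., from your roster). Regex alone
    cannot reliably detect names, so this list is yours to maintain.
    """
    # Email addresses
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", text)
    # Student IDs -- assumed here to be 7-9 digit numbers; adjust the pattern
    text = re.sub(r"\b\d{7,9}\b", "[ID]", text)
    # Roster names supplied by the caller
    for name in known_names:
        text = re.sub(re.escape(name), "[NAME]", text, flags=re.IGNORECASE)
    return text

sample = "Jane Doe (ID 12345678, jdoe@uni.edu) wrote a strong intro."
print(deidentify(sample, known_names=["Jane Doe"]))
# → [NAME] (ID [ID], [EMAIL]) wrote a strong intro.
```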

๐Ÿข Copilot Agents for Staff Productivity

Microsoft Copilot Agents can automate routine tasks, answer common questions, and streamline workflows. This section covers what staff can build and the two main paths to creating agents.

What Staff Can Build

📅 Meeting Prep Assistant

Pulls relevant documents, past meeting notes, and action items before meetings. Summarizes context from emails and shared files.

📖 Policy Lookup Bot

Answers questions about institutional policies, HR procedures, or department guidelines using uploaded policy documents.

👋 Onboarding Guide

Walks new employees through forms, systems, and procedures. Points to relevant resources and contacts.

โ“ Department FAQ Handler

Answers common questions about your department's services, reducing repetitive email inquiries.

🎯 Copilot Studio Lite (The Accessible Path)

Who This Is For

Staff who want to create simple agents without developer support or complex setup. Browser-based, no-code, quick to deploy.

Getting Started

Access Copilot Studio

Go to copilotstudio.microsoft.com and sign in with your institutional Microsoft account.

Create New Agent

Click "Create" and describe what you want your agent to do. The AI will generate a starting configuration.

Add Knowledge Sources

Upload documents, link SharePoint sites, or connect to public websites your agent should reference.

Configure Topics

Define specific question-answer pairs or conversation flows for common queries.

Test & Publish

Use the built-in test chat, then publish to Teams, a website, or other channels.

Studio Lite Limitations

  • No Dataverse integration (can't store/query structured data)
  • Limited Power Automate connections
  • Fewer customization options than full Studio
  • Sharing limited to your M365 tenant

โš™๏ธ Full Copilot Studio (IT-Supported Projects)

For more complex agents that need to read/write data, integrate with other systems, or require advanced logic.

When to Escalate to IT

You Need Dataverse

Storing and querying structured records — employee directories, equipment inventories, request tracking.

You Need Automation

Power Automate flows that create tickets, send emails, update spreadsheets, or trigger other systems.

You Need External Data

Custom connectors to APIs, databases, or third-party services outside M365.

You Need Broader Sharing

Agents accessible to external users, multiple tenants, or embedded in public-facing sites.

Understanding Dataverse

Dataverse is Microsoft's database platform within Power Platform. Think of it as structured storage for your agent:

  • Tables: Like spreadsheet tabs — Employees, Requests, Equipment, etc.
  • Columns: Fields within tables — Name, Status, Date, etc.
  • Relationships: Connect tables — link Requests to Employees who submitted them
  • Security: Control who can read/write which records

Example: IT Help Desk Agent

User asks "What's the status of my ticket?" → Agent queries Dataverse for tickets matching user's email → Returns status and assigned technician → Optionally triggers Power Automate to send notification.
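The lookup step in that flow is ordinary filter-and-format logic. A toy Python model — the table, field names, and sample rows are invented for illustration; a real agent queries Dataverse through Copilot Studio, not a Python list:

```python
# Toy stand-in for a Dataverse "Tickets" table -- fields are invented examples.
TICKETS = [
    {"id": "T-101", "email": "kim@uni.edu", "status": "In progress", "tech": "R. Alvarez"},
    {"id": "T-102", "email": "kim@uni.edu", "status": "Resolved", "tech": "S. Okafor"},
    {"id": "T-103", "email": "lee@uni.edu", "status": "New", "tech": None},
]

def ticket_status(user_email):
    """Mimic the agent's query: filter tickets by the signed-in user's email."""
    rows = [t for t in TICKETS if t["email"] == user_email]
    if not rows:
        return "No tickets found for your account."
    return "; ".join(
        f"{t['id']}: {t['status']}"
        + (f" (assigned to {t['tech']})" if t["tech"] else " (unassigned)")
        for t in rows
    )

print(ticket_status("kim@uni.edu"))
```

The notification step would then hang off the same result, which is why it maps naturally onto a Power Automate flow.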

Working with IT

When requesting support for a full Copilot Studio project:

  • Define the use case: What problem does this solve? Who uses it?
  • Document data needs: What information does the agent need to access or store?
  • Identify integrations: What other systems should it connect to?
  • Clarify ownership: Who maintains the agent after it's built?

🔗 Sharing Agents in Your Organization

Tenant Basics

Your "tenant" is your organization's Microsoft 365 environment. By default, Copilot Agents can only be shared within your tenant.

Common Sharing Scenarios

  • Same department: Usually works seamlessly
  • Cross-department: Works if same tenant, but permissions may vary
  • Partner institutions: Requires guest access configuration
  • Students on personal accounts: Often doesn't work — different tenant
  • Public access: Possible but requires specific setup and approvals

If Users Can't Access Your Agent

Check their account type

Are they signed in with an institutional account or personal Microsoft account?

Verify licensing

Some agent features require specific M365 licenses. Check with IT.

Review sharing settings

In Copilot Studio, check who the agent is published to.

Request broader permissions

IT may need to enable cross-tenant sharing or guest access.

🦄 Details About Our Bots

🐝

Frau Prof. Awesome-Sauce v 2.0

a.k.a. The Bee Prof
🦄

Herr Prof. Dr. Awesome-Sauce

a.k.a. The Unicorn Prof

This section describes how we actually use custom chatbots in our courses. These aren't prescriptions — just ideas that might spark your own approaches.

Why Share This?

Faculty often ask "but what do you actually put in these things?" Here's what we do. Steal liberally, adapt freely, ignore what doesn't fit your teaching style.

๐Ÿ“ What We Upload (Knowledge Sources)

📋 Syllabus

Gives the bot awareness of prerequisites, grading policies, schedule, and course structure. Students can ask "when is the exam?" and get accurate answers.

📊 PowerPoint Decks

Defines what's actually testable content vs. enrichment. The bot knows what we emphasized, not just what's in the textbook.

📚 Textbook Info

We specify the edition and chapters covered. The bot treats textbook content as supplementary — it enriches but doesn't override our materials.

๐Ÿ“ Solutions Manuals

For statistics: practice problem solutions with acceptable ranges. The bot can grade student work without requiring exact matches.

The Knowledge Hierarchy

We explicitly tell the bot how to prioritize sources:

Prompt Snippet
When answering questions, use this priority:
1. My course materials (PPT, syllabus) = definitive, testable content
2. Textbook = supplementary enrichment, clearly labeled as such
3. General knowledge = only when materials are silent, with caveat

If something appears in my materials AND the textbook differently, my materials take precedence.

🎓 How Our Bots Teach

Zone of Proximal Development

We don't ask students to rate their confidence constantly (they find it tedious). Instead, the bot infers difficulty level from behavioral signals:

Prompt Snippet
When a student asks a content question:
1. First ask what they already understand
2. If stuck, break into smaller parts: "Let's start with..."
3. If still stuck, provide a worked example of a similar concept
4. Only after scaffolding fails, provide the direct explanation

Monitor for hesitation language ("I think...", "maybe...") as signals to increase support.
Monitor for easy correct answers as signals to increase challenge.

Efficiency Over Chattiness

Students report that bots can feel time-consuming. We explicitly tell the bot to be concise:

Prompt Snippet
Batch interactions. Ask 3-4 questions at once, give feedback together.

Kill these patterns:
- "Are you ready for the next question?"
- "Would you like me to explain further?"
- Restating the question in feedback
- Excessive praise per item

One-exchange test: Does this create NEW learning or am I being chatty?

Academic Integrity Boundaries

The bot distinguishes between learning support and doing homework:

Prompt Snippet
When a student asks about graded work:

"I can see this is from your [assignment]. I can't provide the answer, but I can help you build understanding to solve it yourself. What concept from your materials do you think applies?"

Then use Socratic questioning to scaffold — never solve directly.

📈 Statistics Practice Bot (Example)

This bot helps students work through 50 practice problems (t-tests, ANOVA, correlation, regression, non-parametrics) using R's built-in datasets.

How It Works

  • Problem presentation: Research question + dataset name + variables (students use str(data) to explore)
  • Guided discovery: Asks diagnostic questions before confirming test choice
  • Code debugging: Identifies errors without rewriting everything for them
  • Flexible grading: Accepts values within rounding tolerance, doesn't require strict APA format
  • Solutions on request: These are practice problems — students can see complete solutions if stuck

Grading with Acceptable Ranges

Prompt Snippet
When grading practice problem answers:
- Accept values within rounding tolerance (±0.05 for statistics, ±0.001 for p-values)
- Don't require strict APA format — key statistics + conclusion is sufficient
- If significant result, check for effect size; prompt if missing
- Provide specific feedback: "✓ t-statistic correct, ✗ missing Cohen's d"

Philosophy: These are PRACTICE problems. Learning is the goal, not catching students in minor formatting errors.

📚 Course Content Bots (Example)

For our content courses (Neuroscience, Perception, etc.), the bot handles three types of interactions:

⚡ Quick Questions

"What's the difference between rods and cones?" โ†’ Concise, targeted answer from my materials.

๐Ÿ” Guided Learning

Student seems confused → Bot asks probing questions, breaks concepts down, connects to prior knowledge.

๐Ÿ“ Quiz Practice

"Quiz me on Chapter 5" โ†’ Generates practice questions at appropriate difficulty, gives metacognitive feedback.

Source Attribution

The bot tells students where information comes from:

  • "From your lecture slides..." = testable, emphasized by me
  • "The textbook also mentions..." = enrichment, may or may not be tested
  • "In the broader field..." = general knowledge, definitely not tested

🚫 What We Don't Use Bots For

โŒ Grading Actual Assignments

Privacy concerns. We don't upload identifiable student work to consumer AI tools.

โŒ Anything with Student PII

Names, IDs, personal situations — none of that goes into these tools.

โŒ Replacing Office Hours

Complex issues, emotional support, nuanced advising — that's human work.

🧠 Learning Science Principles Baked In

These aren't just good ideas — they're explicitly in our bot instructions:

📊 Desirable Difficulty

Practice questions are ~20% harder than assessment questions. Struggle improves learning.

🔄 Spaced Retrieval

Quiz practice includes 10-15% questions from prior chapters, not just current material.

💭 Metacognitive Feedback

Feedback explains why answers are right/wrong, not just correctness. Guides thinking.

🔗 Elaborative Processing

Questions ask "why" and "how," not just "what." Connects to real-world examples students recognize.
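The spaced-retrieval mix above can be sketched as a simple sampler: roughly 10-15% of quiz items drawn from prior chapters, the rest from the current one. A minimal Python sketch — the question pools and the 15% share are illustrative placeholders:

```python
import random

def build_quiz(current_pool, prior_pool, n_items=10, prior_share=0.15, seed=None):
    """Draw a quiz that mixes in ~prior_share review questions from earlier chapters."""
    rng = random.Random(seed)
    n_prior = max(1, round(n_items * prior_share))  # always at least one review item
    quiz = rng.sample(prior_pool, n_prior) + rng.sample(current_pool, n_items - n_prior)
    rng.shuffle(quiz)  # interleave review and current questions
    return quiz

current = [f"ch5-q{i}" for i in range(20)]
prior = [f"ch{c}-q{i}" for c in (1, 2, 3, 4) for i in range(5)]
quiz = build_quiz(current, prior, n_items=10, seed=42)
print(quiz)
```

In a bot prompt this lives as an instruction ("about 1 in 8 questions should revisit an earlier chapter") rather than code, but the proportion logic is the same.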

🎮 Advanced: Gamification Elements

For courses where it fits the vibe (like our intro courses), we sometimes add game-like elements. This isn't for everyone, but students who engage with it really engage.

Hint Economy

Students can request hints, but hints have a "cost" — encouraging independent attempts first:

Prompt Snippet
OPTIONAL: Hint Economy
- Students can request hints, but hints "cost" points
- Encourages independent attempts before seeking help
- Track: "Hints used: 2/3 available for this problem"
- Reward streaks of correct answers without hints

The goal isn't punishment — it's making the decision to seek help a conscious choice rather than a reflex.
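In practice the bot tracks this informally within the conversation, but the bookkeeping itself is simple. A toy Python model of the hint economy — the point values and hint budget are arbitrary examples:

```python
class HintEconomy:
    """Toy model of hint-economy bookkeeping. Point values are arbitrary."""

    def __init__(self, hints_per_problem=3, hint_cost=5, solve_reward=10):
        self.points = 0
        self.streak = 0  # consecutive no-hint solves
        self.hints_per_problem = hints_per_problem
        self.hint_cost = hint_cost
        self.solve_reward = solve_reward
        self.hints_used = 0

    def new_problem(self):
        self.hints_used = 0

    def request_hint(self):
        if self.hints_used >= self.hints_per_problem:
            return "No hints left for this problem."
        self.hints_used += 1
        self.points -= self.hint_cost
        self.streak = 0  # taking a hint breaks the no-hint streak
        return f"Hints used: {self.hints_used}/{self.hints_per_problem} available for this problem"

    def solve(self):
        self.points += self.solve_reward
        if self.hints_used == 0:
            self.streak += 1  # reward independent solves

econ = HintEconomy()
econ.new_problem()
print(econ.request_hint())  # → Hints used: 1/3 available for this problem
econ.solve()
print(econ.points, econ.streak)  # → 5 0
```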

Other Elements We've Tried

  • XP/Leveling: Track cumulative problems solved across sessions
  • Titles: "Statistics Apprentice" → "Data Detective" → "Analysis Wizard"
  • Confidence bets: Students predict if they're right before seeing feedback
  • Streak bonuses: Consecutive correct answers unlock fun facts or "Easter eggs"

Caveat

Gamification adds complexity and can feel forced if it doesn't match your teaching style. Start simple. Add elements only if students respond well.

💡 Adapt, Don't Copy

Everything here reflects our teaching styles, our courses, and our students. Your context is different. Take what resonates, ignore what doesn't, and build something that works for you.

The best bot is one that feels like a natural extension of how you already teach — not a replacement for your judgment or your relationship with students.

Questions?

If you're at our institution and want to see these in action or talk through your own bot ideas, find us at AI Office Hours or just email. Happy to share prompt files directly.