AI in the Classroom: A Human-Centered Framework

Key insights
- AI should replace busy work, not educators. Teachers shouldn't spend 20 hours writing a lesson plan when AI can generate one in under two seconds.
- Two students failing high school while juggling family responsibilities saw their grades skyrocket within 60-90 days after learning to use custom AI tools.
- The achievement gap is the central question: will AI expand inequality or close it? Blake argues the answer depends entirely on who gets access and how it's taught.
In Brief
Dr. William L. Blake, a veteran educator and author of The AI School Leader, used to fear AI the same way many teachers do: as another thing added to an already impossible workload. That changed when two of his students, Dante Butler and Robert Green, showed him that AI could be a tool for opportunity, not just a threat. In this TEDx talk at Morgan State University, Blake presents a three-principle framework for educators who want to use AI to save time, strengthen relationships, and close the achievement gap. For related reading, see Alpha School: AI Tutors Teaching Kids 10x Faster, Why Banning AI in Classrooms Misses the Point, and From Estonia to India: Schools Bet Big on AI.
What is the Human-Centered Framework for AI in Education?
The Human-Centered Framework for AI in Education is a three-principle approach designed to help teachers and school leaders adopt AI without losing what makes education meaningful: human connection. Developed by Dr. Blake across 21 years in K-12 schools, the framework answers a question educators are quietly asking: will AI help my students, or will it make things worse for the ones who are already struggling?
The framework rests on three principles. AI should support educators, not replace them. AI should help build relationships rather than erode them. And AI should be used to level the playing field for students who have been left behind.
How it works
The turning point: Dante and Rob
Before laying out the framework, Blake describes the moment that changed his thinking. Two of his students, Dante Butler and Robert Green, were straight-A students in middle school. By high school, they were failing every class (3:29). When Blake asked what happened, their answer was the same: they were not just students. They were providers. At 14 and 15 years old, they were responsible for making sure siblings got to school, food was on the table, and rent was paid (4:07).
Blake sat them down and taught them about GPT, short for Generative Pre-trained Transformer: an AI model trained on large amounts of text that can generate human-like responses to questions and tasks (9:07). He showed them how to build a custom GPT configured specifically for their situation, and how to write prompts: the instructions you give an AI to get a useful result.
One of the first things they tried: asking the GPT to create a daily schedule based on their actual circumstances. The AI had a plan ready in 20 seconds (10:02). Within 60 to 90 days, their grades had skyrocketed (10:30).
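The talk does not show the exact prompt Dante and Rob used. As a sketch only, the core idea, a prompt built from a student's real circumstances rather than a generic request, can be captured in a small template function; the function name and fields below are hypothetical, not from the talk:

```python
def build_schedule_prompt(name, responsibilities, school_start, school_end):
    """Assemble a daily-schedule prompt from a student's actual circumstances."""
    duties = "\n".join(f"- {r}" for r in responsibilities)
    return (
        f"I'm a high school student named {name}. Outside of school hours "
        f"({school_start} to {school_end}) I am responsible for:\n"
        f"{duties}\n"
        "Create a realistic daily schedule that fits these responsibilities, "
        "protects time for homework, and includes enough sleep."
    )

# Illustrative values only.
prompt = build_schedule_prompt(
    "Dante",
    ["getting my siblings to school",
     "making sure food is on the table",
     "a part-time job to help with rent"],
    "8:00 AM", "3:00 PM",
)
print(prompt)
```

The point of the template is the one Blake makes throughout: the more of the student's actual context the prompt carries, the more usable the AI's answer is on the first try.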
Principle 1: Support, not replace
The first principle is that AI should replace busy work, not educators (11:00). Blake is direct about this: no teacher should spend 20 hours writing a lesson plan (11:32).
He gives a concrete example of what a well-constructed prompt looks like. A ninth-grade biology teacher in Washington, D.C. could write: "I'm a ninth grade biology teacher. Can you create a lesson for African-American students in DC aligned to the Next Generation Science Standards? Include an opening activity, one engaging exercise, and an exit ticket so I can check for understanding. Also give me a five-minute speech I can deliver for a 55-minute class, leaving 45 minutes for student engagement." (11:55)
That prompt produces a complete lesson plan in under two seconds (13:00). The teacher's job then is to bring it to life, to adapt it for the students in front of them. AI handles the scaffolding. The educator handles the relationship.
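Blake's example prompt follows a reusable pattern: role, audience, standards, required structure, and timing. As an illustration (the helper below is not from the talk, and its name and parameters are hypothetical), that pattern can be turned into a small template so a teacher can swap in their own details:

```python
def build_lesson_prompt(grade, subject, student_context, standards,
                        class_minutes, speech_minutes, engagement_minutes):
    """Assemble a lesson-plan prompt: role, audience, standards, structure, timing."""
    return (
        f"I'm a {grade} {subject} teacher. "
        f"Can you create a lesson for {student_context} aligned to {standards}? "
        "Include an opening activity, one engaging exercise, and an exit ticket "
        "so I can check for understanding. "
        f"Also give me a {speech_minutes}-minute speech I can deliver for a "
        f"{class_minutes}-minute class, leaving {engagement_minutes} minutes "
        "for student engagement."
    )

# Reproduces the structure of the D.C. biology example from the talk.
prompt = build_lesson_prompt(
    grade="ninth grade",
    subject="biology",
    student_context="African-American students in DC",
    standards="the Next Generation Science Standards",
    class_minutes=55,
    speech_minutes=5,
    engagement_minutes=45,
)
print(prompt)
```

Changing one argument, say the standards body or the student context, yields a prompt tailored to a different classroom without rewriting anything.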
Principle 2: Build relationships
The second principle is that AI should free up time for human connection, not crowd it out (13:15). Blake illustrates this with classroom observations, a core part of a school leader's job.
Traditionally, he would walk into a classroom with a laptop or iPad, typing notes while trying to pay attention to what was happening. The technology got in the way of actually watching the teacher and students. Now he uses AI-enabled glasses that record the observation automatically (13:47). When the observation is over, he asks the glasses to align everything to his evaluation framework. The formal observation report is ready before he leaves the room (14:27).
The same principle applies to post-observation meetings with teachers. Before AI, both parties would be staring at their own screens, reading notes, and there would be very little actual conversation. Now Blake records the conversation, stays fully present during it, and then asks the AI to summarize the discussion and identify three pieces of high-impact feedback aligned to the evaluation system. The feedback loop closes before the teacher walks out the door.
Principle 3: Level the playing field
The third principle addresses what Blake calls the central question: will AI expand the achievement gap or close it? (1:49) His answer depends entirely on whether schools make intentional choices about access.
He focuses on multilingual learners, students whose primary language is not English, as a concrete example. In many U.S. classrooms, multilingual learners are on track to become the majority, yet most teachers teach in only one language (15:45). A teacher can use AI to generate a lesson in a student's home language while weaving in English. The student learns new material in a familiar language while also building English fluency (16:10). No specialist required. No additional budget.
This is the equity argument at its clearest: AI can put tools in the hands of teachers who previously had no way to reach every student in the room.
Common misconceptions
"AI in schools is just cheating"
The framing of AI as cheating tends to stop the conversation before it starts. Blake acknowledges this is the most common reaction he encounters in professional development sessions, and he argues it reflects a fixed mindset that puts the brakes on innovation right when schools need it most (4:33). Using a calculator to solve math problems is not cheating. Using a dictionary while writing is not cheating. The question is not whether a tool was used, but whether the student learned something.
"AI will replace teachers"
Blake is clear: AI will not replace educators if educators choose not to let it. What AI can replace is the time spent on tasks that do not require human judgment: writing schedule templates, generating lesson frameworks, formatting observation reports. That time can go back into actually teaching, building relationships, and paying attention to students who need it. The risk of replacement is real only if schools hand the wheel entirely to technology and remove the human in the loop.
Practical implications
For classroom teachers
The most immediate takeaway is the lesson-planning prompt. A detailed, specific instruction takes about two minutes to write and produces a usable starting point in under two seconds. Include the grade level, subject, student context, curriculum standards, and what you want the lesson to include. The prompt is free. The savings in Sunday-night stress are real.
For school leaders and administrators
The observation workflow Blake describes shows how AI can eliminate the friction that gets in the way of meaningful feedback. If administrators spend less time formatting notes and more time having actual conversations with teachers, the quality of professional development improves. This requires school leaders to model AI use, not just mandate it.
For education policy and equity advocates
The achievement gap question has no automatic answer. AI can widen inequality if only well-resourced schools and students get access. It can close the gap if schools in under-resourced communities are given the tools and training to use it effectively. The framework itself is a starting point, but access and professional development are prerequisites.
Glossary
| Term | Definition |
|---|---|
| GPT (Generative Pre-trained Transformer) | An AI model trained on large amounts of text that can generate human-like responses. ChatGPT is the most widely known example. |
| Custom GPT | A personalized version of ChatGPT configured for specific tasks. You define its name, instructions, and what it should focus on. |
| Prompt | The instruction you give an AI. The more specific and contextual your prompt, the more useful the output. |
| Prompt engineering | The practice of writing effective prompts to get better results from AI. It is a skill that improves with practice. |
| Achievement gap | The disparity in academic performance between different student groups, often along racial and economic lines. |
| K-12 education | The U.S. school system from kindergarten through 12th grade, covering students roughly ages 5 to 18. |
| Multilingual learners | Students whose primary language is not the dominant classroom language. In the US, this often means students whose first language is not English. |
| Human-centered AI framework | An approach that keeps human needs and relationships at the center of AI adoption, rather than treating AI as a replacement for human judgment. |
Sources and resources
Want to go deeper? Watch the full video on YouTube.