Sixty-five percent of educators use AI to bridge resource gaps, even as platform fatigue and a lack of system integration threaten productivity, according to Jotform's EdTech Trends 2026 report.
Based on a survey of 50 K-12 and higher education professionals, the report reveals a resilient workforce looking for ways to combat the effects of significant budget cuts and burnout. The respondents were teachers, instructors, and professors split about equally between higher education and K-12.
While 56 percent of educators are “very concerned” over recent cuts to U.S. education infrastructure, 65 percent are now actively using AI. Of those using AI, nearly half (48 percent) use it for both student learning and administrative tasks, such as summarizing long documents and automating feedback.
“We conducted this survey to better understand the pain points educators have with technology,” says Lainie Johnson, director of enterprise marketing at Jotform. “We were surprised that our respondents like their tech tools so much. Because while the tools themselves are great, their inability to work together causes a problem.”
Key findings from the EdTech Trends 2026 report include:
The integration gap: Although 77 percent of educators say their current digital tools work well, 73 percent cite a “lack of integration between systems” as their primary difficulty. “The No. 1 thing I would like for my digital tools to do is to talk to each other,” one respondent noted. “I feel like often we have to jump from one platform to another just to get work done.”
Platform fatigue: Educators are managing an average of eight different digital tools, with 50 percent reporting they are overwhelmed by “too many platforms.”
The burden of manual tasks: Despite the many digital tools they use, educators spend an average of seven hours per week on manual tasks.
AI for productivity: Fifty-eight percent of respondents use AI most frequently as a productivity tool for research, brainstorming, and writing.
Data security and ethics: Ethical implications and data security are the top concerns for educators when implementing AI.
eSchool Media staff cover education technology in all its aspects–from legislation and litigation, to best practices, to lessons learned and new products. First published in March of 1998 as a monthly print and digital newspaper, eSchool Media provides the news and information necessary to help K-20 decision-makers successfully use technology and innovation to transform schools and colleges and achieve their educational goals.
AI has crossed a threshold. In 2026, it is no longer a pilot category or a differentiator you add on. It is part of the operating fabric of education, embedded in how learning experiences are created, how learners practice, how educators respond, and how outcomes are measured. That reality changes the product design standard.
The strategic question is not, “Do we have AI embedded in the learning product design or delivery?” It is, “Can we prove AI is improving outcomes reliably, safely, and at scale?”
That proof now matters to everyone. Education leaders face accountability pressure. Institutions balance outcomes and budgets. Publishers must defend program impact. CTE providers are tasked with career enablement that is real, not implied. This is the shift from hype to efficacy. Efficacy is not a slogan. It is a product discipline.
What the 2026 efficacy imperative actually means
Efficacy is the chain that connects intent to impact: mastery, progression, completion, and readiness. In CTE and career pathways, readiness includes demonstrated performance in authentic tasks such as troubleshooting, communication, procedural accuracy, decision-making, and safe execution, not just quiz scores.
The product design takeaway is simple. Treat efficacy as a first-class product requirement. That means clear success criteria, instrumentation, governance, and a continuous improvement loop. If you cannot answer what improved, for whom, and under what conditions, your AI strategy is not a strategy. It is a list of features.
Below is practical guidance you can apply immediately.
1. Start with outcomes, then design the AI
A common mistake is shipping capabilities in search of purpose. Chat interfaces, content generation, personalization, and automated feedback can all be useful. Utility is not efficacy.
Guidance: Anchor your AI roadmap in a measurable outcome statement, then work backward.
Define the outcome you want to improve (mastery, progression, completion, readiness).
Define the measurable indicators that represent that outcome (signals and thresholds).
Design the AI intervention that can credibly move those indicators.
Instrument the experience so you can attribute lift to the intervention.
Iterate based on evidence, not excitement.
Takeaways for leaders: If your roadmap is organized as “features shipped,” you will struggle to prove impact. A mature roadmap reads as “outcomes moved” with clarity on measurement, scope, and tradeoffs.
2. Make CTE and career enablement measurable and defensible
Career enablement is the clearest test of value in education. Learners want capability, educators want rigor with scalability, and employers want confidence that credentials represent real performance.
CTE makes this pressure visible. It is also where AI can either elevate programs or undermine trust if it inflates claims without evidence.
Guidance: Focus AI on the moments that shape readiness.
Competency-based progression must be operational, not aspirational. Competencies should be explicit, observable, and assessable. Outcomes are not “covered.” They are verified.
Applied practice must be the center. Scenarios, simulations, troubleshooting, role plays, and procedural accuracy are where readiness is built.
Assessment credibility must be protected. Blueprint alignment, difficulty control, and human oversight are non-negotiable in high-stakes workflows.
Takeaways for leaders: A defensible career enablement claim is simple. Learners show measurable improvement on authentic tasks aligned to explicit competencies with consistent evaluation. If your program cannot demonstrate that, it is vulnerable, regardless of how polished the AI appears.
3. Treat platform decisions as product strategy decisions
Many AI initiatives fail because the underlying platform cannot support consistency, governance, or measurement.
If AI is treated as a set of features, you can ship quickly and move on. If AI is a commitment to efficacy, your platform must standardize how AI is used, govern variability, and measure outcomes consistently.
Guidance: Build a platform posture around three capabilities.
Standardize the AI patterns that matter. Define reusable primitives such as coaching, hinting, targeted practice, rubric-based feedback, retrieval, summarization, and escalation to humans. Without standardization, quality varies, and outcomes cannot be compared.
Govern variability without slowing delivery. Put model and prompt versioning, policy constraints, content boundaries, confidence thresholds, and required human decision points in the platform layer.
Measure once and learn everywhere. Instrumentation should be consistent across experiences so you can compare cohorts, programs, and interventions without rebuilding analytics each time.
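As a rough sketch of what putting governance “in the platform layer” can mean in practice, the record below pins model and prompt versions and encodes a confidence threshold with a human-review gate. All names and values here are illustrative assumptions, not a real product schema:

```python
# Illustrative sketch of a platform-layer policy for one reusable AI pattern.
# Field names and values are assumptions for the purpose of the example.
from dataclasses import dataclass

@dataclass(frozen=True)
class AIPatternPolicy:
    pattern: str                  # e.g. "rubric_feedback", "hinting"
    model_version: str            # pinned model release, for reproducibility
    prompt_version: str           # versioned prompt template
    confidence_threshold: float   # below this, route the output to a human
    human_review_required: bool   # hard gate for high-stakes workflows

def needs_human(policy: AIPatternPolicy, confidence: float) -> bool:
    """Escalate when the pattern is high-stakes or the model is unsure."""
    return policy.human_review_required or confidence < policy.confidence_threshold

policy = AIPatternPolicy(
    pattern="rubric_feedback",
    model_version="2026-01",
    prompt_version="v3",
    confidence_threshold=0.8,
    human_review_required=False,
)
print(needs_human(policy, 0.65))  # True: below threshold, escalate
print(needs_human(policy, 0.92))  # False: deliver automatically
```

Because the version fields are pinned in one place, cohorts run against the same model and prompt can be compared, which is what makes “measure once and learn everywhere” possible.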
Takeaways for leaders: Platform is no longer plumbing. In 2026, the platform is the mechanism that makes efficacy scalable and repeatable. If your platform cannot standardize, govern, and measure, your AI strategy will remain fragmented and hard to defend.
4. Build tech-assisted measurement into the daily operating loop
Efficacy cannot be a quarterly research exercise. It must be continuous, lightweight, and embedded without turning educators into data clerks.
Guidance: Use a measurement architecture that supports decision-making.
Define a small learning event vocabulary you can trust. Examples include attempt, error type, hint usage, misconception flag, scenario completion, rubric criterion met, accommodation applied, and escalation triggered. Keep it small and consistent.
Use rubric-aligned evaluation for applied work. Rubrics are the bridge between learning intent and measurable performance. AI can assist by pre-scoring against criteria, highlighting evidence, flagging uncertainty, and routing edge cases to human review.
Link micro signals to macro outcomes. Tie practice behavior to mastery, progression, completion, assessment performance, and readiness indicators so you can prioritize investments and retire weak interventions.
Enable safe experimentation. Use controlled rollouts, cohort selection, thresholds, and guardrails so teams can test responsibly and learn quickly without breaking trust.
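The event vocabulary above can be kept “small and consistent” by making it an explicit, closed set in code. The sketch below uses the event names from the text; the record fields, such as an intervention identifier for attribution, are illustrative assumptions:

```python
# Illustrative sketch: a small, fixed learning-event vocabulary.
# Event names mirror the article's examples; record fields are assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class LearningEvent(Enum):
    ATTEMPT = "attempt"
    ERROR_TYPE = "error_type"
    HINT_USAGE = "hint_usage"
    MISCONCEPTION_FLAG = "misconception_flag"
    SCENARIO_COMPLETION = "scenario_completion"
    RUBRIC_CRITERION_MET = "rubric_criterion_met"
    ACCOMMODATION_APPLIED = "accommodation_applied"
    ESCALATION_TRIGGERED = "escalation_triggered"

@dataclass
class EventRecord:
    event: LearningEvent
    learner_id: str
    intervention_id: str  # which AI intervention was active, for attribution
    detail: dict = field(default_factory=dict)
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

rec = EventRecord(
    event=LearningEvent.RUBRIC_CRITERION_MET,
    learner_id="learner-42",
    intervention_id="targeted-practice-v2",
    detail={"criterion": "procedural_accuracy"},
)
print(rec.event.value)  # "rubric_criterion_met"
```

Tagging every record with the active intervention is what later lets lift be attributed to a specific intervention rather than to overall usage.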
Takeaways for leaders: If you cannot attribute improvement to a specific intervention and measure it continuously, you will drift into reporting usage rather than proving impact. Usage is not efficacy.
5. Treat accessibility as part of efficacy, not compliance overhead
An AI system that works for only some learners is not effective. Accessibility is now a condition of efficacy and a driver of scale.
Guidance: Bake accessibility into AI-supported experiences.
Ensure structure and semantics, keyboard support, captions, audio description, and high-quality alt text.
Validate compatibility with assistive technologies.
Measure efficacy across learner groups rather than averaging into a single headline.
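A toy example of why disaggregation matters: with invented improvement scores for two placeholder learner groups, the overall average looks healthy while one group sees almost no gain:

```python
# Illustrative sketch only: group labels and scores are invented to show
# how a single headline average can hide an accessibility gap.
from statistics import mean
from collections import defaultdict

improvements = [
    ("screen_reader_users", 1.0),
    ("screen_reader_users", 0.5),
    ("no_assistive_tech", 6.0),
    ("no_assistive_tech", 7.0),
]

overall = mean(v for _, v in improvements)
print(overall)  # 3.625: the headline number looks fine

by_group = defaultdict(list)
for group, v in improvements:
    by_group[group].append(v)

for group, vals in sorted(by_group.items()):
    print(group, mean(vals))
# no_assistive_tech 6.5
# screen_reader_users 0.75  <- the gap the average concealed
```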
Takeaways for leaders: Inclusive design expands who benefits from AI-supported practice and feedback. It improves outcomes while reducing risk. Accessibility should be part of your efficacy evidence, not a separate track.
The 2026 Product Design and Strategy checklist
If you want AI to remain credible in your product and program strategy, use these questions as your executive filter:
Can we show measurable improvement in mastery, progression, completion, and readiness that is attributable to AI interventions, not just usage?
Are our CTE and career enablement claims traceable to explicit competencies and authentic performance tasks?
Is AI governed with clear boundaries, human oversight, and consistent quality controls?
Do we have platform level patterns that standardize experiences, reduce variance, and instrument outcomes?
Is measurement continuous and tech-assisted, built for learning loops rather than retrospective reporting?
Do we measure efficacy across learner groups to ensure accessibility and equity in impact?
Rishi Raj Gera, Magic Edtech
Rishi Raj Gera is Chief Solutions Officer at Magic Edtech. Rishi brings over two decades of experience in designing digital learning systems that sit at the intersection of accessibility, personalization, and emerging technology. His work is driven by a consistent focus on building educational systems that adapt to individual learner needs while maintaining ethical boundaries and equity in design. Rishi continues to advocate for learning environments that are as human-aware as they are data-smart, especially in a time when technology is shaping how students engage with knowledge and one another.
It’s truly incredible how much new technology has made its way into the classroom. Where teaching once relied primarily on whiteboards and textbooks, you can now find tablets, smart screens, AI assistants, and a trove of learning apps designed to foster inquiry and maximize student growth.
While these new tools are certainly helpful, the flood of options means that educators can struggle to discern truly useful resources from one-time gimmicks. As a result, some of the best tools for sparking curiosity, creativity, and critical thinking often go overlooked.
Personally, I believe 3D printing is one such tool that doesn’t get nearly enough consideration for the way it transforms a classroom.
3D printing is the process of making a physical object from a three-dimensional digital model, typically by laying down many thin layers of material using a specialized printer. Using 3D printing, a teacher could make a model of a fossil to share with students, trophies for inter-class competitions, or even supplies for construction activities.
At first glance, this might not seem all that revolutionary. However, 3D printing offers three distinct educational advantages that have the potential to transform K–12 learning:
It develops success skills: 3D printing encourages students to build a variety of success skills that prepare them for challenges outside the classroom. For starters, its inclusion creates opportunities for students to practice communication, collaboration, and other social-emotional skills. The process of moving from an idea to a physical, printed prototype fosters creativity, while every print–regardless of its success–builds perseverance and problem-solving confidence. This is the type of hands-on, inquiry-based learning that students remember.
It creates cross-curricular connections: 3D printing is intrinsically cross-curricular. Professional scientists, engineers, and technicians often use 3D printing to create product models or build prototypes for testing their hypotheses. This process involves documentation, symbolism, color theory, understanding of narrative, and countless other disciplines. It doesn’t take much imagination to see how these could also be beneficial to classroom learning. Students can observe for themselves how subjects connect, while teachers transform abstract concepts into tangible points of understanding.
It’s aligned with engineering and NGSS: 3D printing aligns perfectly with the Next Generation Science Standards (NGSS). By focusing on the engineering design process (define, imagine, plan, create, improve), students learn to think and act like real scientists to overcome obstacles. This approach also emphasizes iteration and evidence-based conclusions. What better way to facilitate student engagement, hands-on inquiry, and creative expression?
3D printing might not be the flashiest educational tool, but its potential is undeniable. This flexible resource can give students something tangible to work with while sparking wonder and pushing them to explore new horizons.
So, take a moment to familiarize yourself with the technology. Maybe try running a few experiments of your own. When used with purpose, 3D printing transforms from a common classroom tool into a launchpad for student discovery.
Jon Oosterman, Van Andel Institute for Education
Jon Oosterman is a Learning Specialist at Van Andel Institute for Education, a Michigan-based education nonprofit dedicated to creating classrooms where curiosity, creativity, and critical thinking thrive.
Many years ago, around 2010, I attended a professional development program in Houston called Literacy Through Photography. At the time, I was searching for practical ways to strengthen comprehension, discussion, and reading fluency, particularly for students who found traditional print-based tasks challenging. As part of the program, artists visited my classroom and shared their work with students. Much of that work was abstract. There were no obvious answers and no single “correct” interpretation.
Instead, students were invited to look closely, talk together, and explain what they noticed.
What struck me was how quickly students, including those who struggled with traditional reading tasks, began to engage. They learned to slow down, describe what they saw, make inferences, and justify their thinking. They weren’t just looking at images; they were reading them. And in doing so, they were rehearsing many of the same strategies we expect when reading written texts.
At the time, this felt innovative. But it also felt deeply intuitive.
Fast forward to today.
Students are surrounded by images and videos, from photographs and diagrams to memes, screenshots, and, increasingly, AI-generated visuals. These images appear everywhere: in learning materials, on social media, and inside the tools students use daily. Many look polished, realistic, and authoritative.
At the same time, AI has made faking easier than ever.
As educators and school leaders, we now face urgent questions around misinformation, academic integrity, and critical thinking. The issue is no longer just whether students can use AI tools, but whether they can interpret, evaluate, and question what they see.
This is where visual literacy becomes a frontline defence.
Teaching students to read images critically, to see them as constructed texts rather than neutral data, strengthens the same skills we rely on for strong reading comprehension: inference, evidence-based reasoning, and metacognitive awareness.
From photography to AI: A conversation grounded in practice
Recently, I found myself returning to those early classroom experiences through ongoing professional dialogue with a former college lecturer and professional photographer, as we explored what it really means to read images in the age of AI.
A conversation that grew out of practice
Nesreen: When I shared the draft with you, you immediately focused on the language, whether I was treating images as data or as signs. Is this important?
Photographer: Yes, because signs belong to reading. Data is output. Signs are meaning. When we talk about reading media texts, we’re talking about how meaning is constructed, not just what information appears.
Nesreen: That distinction feels crucial right now. Students are surrounded by images and videos, but they’re rarely taught to read them with the same care as written texts.
Photographer: Exactly. Once students understand that photographs and AI images are made up of signs, color, framing, scale, and viewpoint, they stop treating images as neutral or factual.
Nesreen: You also asked whether the lesson would lean more towards evaluative assessment or summarizing. That made me realize the reflection mattered just as much as the image itself.
Photographer: Reflection is key. When students explain why a composition works, or what they would change next time, they’re already engaging in higher-level reading skills.
Nesreen: And whether students are analyzing a photograph, generating an AI image, or reading a paragraph, they’re practicing the same habits: slowing down, noticing, justifying, and revising their thinking.
Photographer: And once they see that connection, reading becomes less about the right answer and more about understanding how meaning is made.
Reading images is reading
One common misconception is that visual literacy sits outside “real” literacy. In practice, the opposite is true.
When students read images carefully, they:
identify what matters most
follow structure and sequence
infer meaning from clues
justify interpretations with evidence
revise first impressions
These are the habits of skilled readers.
For emerging readers, multilingual learners, and students who struggle with print, images lower the barrier to participation, without lowering the cognitive demand. Thinking comes first. Language follows.
From composition to comprehension: Mapping image reading to reading strategies
Photography offers a practical way to name what students are already doing intuitively. When teachers explicitly teach compositional elements, familiar reading strategies become visible and transferable.
| What students notice in an image | What they are doing cognitively | Reading strategy practiced |
| --- | --- | --- |
| Where the eye goes first | Deciding importance | Identifying main ideas |
| How the eye moves | Tracking structure | Understanding sequence |
| What is included or excluded | Considering intention | Analyzing author’s choices |
| Foreground and background | Sorting information | Main vs. supporting details |
| Light and shadow | Interpreting mood | Making inferences |
| Symbols and colour | Reading beyond the literal | Figurative language |
| Scale and angle | Judging power | Perspective and viewpoint |
| Repetition or pattern | Spotting themes | Theme identification |
| Contextual clues | Using surrounding detail | Context clues |
| Ambiguity | Holding multiple meanings | Critical reading |
| Evidence from the image | Justifying interpretation | Evidence-based responses |
Once students recognise these moves, teachers can say explicitly:
“You’re doing the same thing you do when you read a paragraph.”
That moment of transfer is powerful.
Making AI image generation teachable (and safe)
In my classroom work pack, students use Perchance AI to generate images. I chose this tool deliberately: It is accessible, age-appropriate, and allows students to iterate, refining prompts based on compositional choices rather than chasing novelty.
Students don’t just generate an image once. They plan, revise, and evaluate.
This shifts AI use away from shortcut behavior and toward intentional design and reflection, supporting academic integrity rather than undermining it.
The progression of a prompt: From surface to depth (WAGOLL)
One of the most effective elements of the work pack is a WAGOLL (What A Good One Looks Like) progression, which shows students how thinking improves with precision.
Simple: A photorealistic image of a dog sitting in a park.
Secure: A photorealistic image of a dog positioned using the rule of thirds, warm colour palette, soft natural lighting, blurred background.
Greater Depth: A photorealistic image of a dog positioned using the rule of thirds, framed by tree branches, low-angle view, strong contrast, sharp focus on the subject, blurred background.
Students can see and explain how photographic language turns an image from output into meaningful signs. That explanation is where literacy lives.
When classroom talk begins to change
Over time, classroom conversations shift.
Instead of “I like it” or “It looks real,” students begin to say:
“The creator wants us to notice…”
“This detail suggests…”
“At first I thought…, but now I think…”
These are reading sentences.
Because images feel accessible, more students participate. The classroom becomes slower, quieter, and more thoughtful–exactly the conditions we want for deep comprehension.
Visual literacy as a bridge, not an add-on
Visual literacy is not an extra subject competing for time. It is a bridge, especially in the age of AI.
By teaching students how to read images, schools strengthen:
reading comprehension
inference and evaluation
evidence-based reasoning
metacognitive awareness
Most importantly, students learn that literacy is not about rushing to answers, but about noticing, questioning, and constructing meaning.
In a world saturated with AI-generated images, teaching students how to read visually is no longer optional.
It is literacy.
Author’s note: This article grew out of classroom practice and professional dialogue with a former college lecturer and professional photographer. Their contribution informed the discussion of visual composition, semiotics, and reflective image-reading, without any involvement in publication or authorship.
Nesreen El-Baz, Bloomsbury Education Author & School Governor
Nesreen El-Baz is an ESL educator with over 20 years of experience and a certified bilingual teacher. Currently based in the UK, she holds a Master’s degree in Curriculum and Instruction from Houston Christian University and specializes in developing innovative strategies for English Learners and bilingual education.
For the last two years, conversations about AI in education have tended to fall into two camps: excitement about efficiency or fear of replacement. Teachers worry they’ll lose authenticity. Leaders worry about academic integrity. And across the country, schools are trying to make sense of a technology that feels both promising and overwhelming.
But there’s a quieter, more human-centered opportunity emerging–one that rarely makes the headlines: AI can actually strengthen empathy and improve the quality of our interactions with students and staff.
Not by automating relationships, but by helping us become more reflective, intentional, and attuned to the people we serve.
As a middle school assistant principal and a higher education instructor, I’ve found that AI is most valuable not as a productivity tool, but as a perspective-taking tool. When used thoughtfully, it supports the emotional labor of teaching and leadership–the part of our work that cannot be automated.
From efficiency to empathy
Schools do not thrive because we write faster emails or generate quicker lesson plans. They thrive because students feel known. Teachers feel supported. Families feel included.
AI can assist with the operational tasks, but the real potential lies in the way it can help us:
Reflect on tone before hitting “send” on a difficult email
Understand how a message may land for someone under stress
Role-play sensitive conversations with students or staff
Anticipate barriers that multilingual families might face
Rehearse a restorative response rather than reacting in the moment
These are human actions–ones that require situational awareness and empathy. AI can’t perform them for us, but it can help us practice and prepare for them.
A middle school use case: Preparing for the hard conversations
Middle school is an emotional ecosystem. Students are forming identity, navigating social pressures, and learning how to advocate for themselves. Staff are juggling instructional demands while building trust with young adolescents whose needs shift by the week.
Some days, the work feels like equal parts counselor, coach, and crisis navigator.
One of the ways I’ve leveraged AI is by simulating difficult conversations before they happen. For example:
A student is anxious about returning to class after an incident
A teacher feels unsupported and frustrated
A family is confused about a schedule change or intervention plan
By giving the AI a brief description and asking it to take on the perspective of the other person, I can rehearse responses that center calm, clarity, and compassion.
This has made me more intentional in real interactions–I’m less reactive, more prepared, and more attuned to the emotions beneath the surface.
Empathy improves when we get to “practice” it.
Supporting newcomers and multilingual learners
Schools like mine welcome dozens of newcomers each year, many with interrupted formal education. They bring extraordinary resilience–and significant emotional and linguistic needs.
AI tools can support staff in ways that deepen connection, not diminish it:
Drafting bilingual communication with a softer, more culturally responsive tone
Helping teachers anticipate trauma triggers based on student histories
Rewriting classroom expectations in family-friendly language
Generating gentle scripts for welcoming a student experiencing culture shock
The technology is not a substitute for bilingual staff or cultural competence. But it can serve as a bridge–helping educators reach families and students with more warmth, clarity, and accuracy.
When language becomes more accessible, relationships strengthen.
AI as a mirror for leadership
One unexpected benefit of AI is that it acts as a mirror. When I ask it to review the clarity of a communication, or identify potential ambiguities, it often highlights blind spots:
“This sentence may sound punitive.”
“This may be interpreted as dismissing the student’s perspective.”
“Consider acknowledging the parent’s concern earlier in the message.”
These are the kinds of insights reflective leaders try to surface–but in the rush of a school day, they are easy to miss.
AI doesn’t remove responsibility; it enhances accountability. It helps us lead with more emotional intelligence, not less.
What this looks like in teacher practice
For teachers, AI can support empathy in similarly grounded ways:
1. Building more inclusive lessons
Teachers can ask AI to scan a lesson for hidden barriers–assumptions about background knowledge, vocabulary loads, or unclear steps that could frustrate students.
2. Rewriting directions for struggling learners
A slight shift in wording can make all the difference for a student with anxiety or processing challenges.
3. Anticipating misconceptions before they happen
AI can run through multiple “student responses” so teachers can see where confusion might arise.
4. Practicing restorative language
Teachers can try out scripts for responding to behavioral issues in ways that preserve dignity and connection.
These aren’t shortcuts. They’re tools that elevate the craft.
Human connection is the point
The heart of education is human. AI doesn’t change that–in fact, it makes it more obvious.
When we reduce the cognitive load of planning, we free up space for attunement. When we rehearse hard conversations, we show up with more steadiness. When we write in more inclusive language, more families feel seen. When we reflect on our tone, we build trust.
The goal isn’t to create AI-enhanced classrooms. It’s to create relationship-centered classrooms where AI quietly supports the skills that matter most: empathy, clarity, and connection.
Schools don’t need more automation.
They need more humanity–and AI, used wisely, can help us get there.
Timothy Montalvo, Iona University & the College of Westchester
Timothy Montalvo is a middle school educator and leader passionate about leveraging technology to enhance student learning. He serves as Assistant Principal at Fox Lane Middle School in Westchester, NY, and teaches education courses as an adjunct professor at Iona University and the College of Westchester. Montalvo focuses on preparing students to be informed, active citizens in a digital world and shares insights on Twitter/X @MrMontalvoEDU or on BlueSky @montalvoedu.bsky.social.
The rapid rise of generative AI has turned classrooms into a real-time experiment in technology use. Students are using AI to complete assignments, while teachers are leveraging it to design lessons, streamline grading, and manage administrative tasks.
According to new national survey data from RAND, AI use among both students and educators has grown sharply–by more than 15 percentage points in just the past one to two years. Yet, training and policy have not kept pace. Schools and districts are still developing professional development, student guidance, and clear usage policies to manage this shift.
As a result, educators, students, and parents are navigating both opportunities and concerns. Students worry about being falsely accused of cheating, and many families fear that increased reliance on AI could undermine students’ critical thinking skills.
Key findings:
During the 2024-2025 school year, AI saw rapid growth.
AI use in schools surged during the 2024-2025 academic year. By 2025, more than half of students (54 percent) and core subject teachers (53 percent) were using AI for schoolwork or instruction–up more than 15 points from just a year or two earlier. High school students were the most frequent users, and AI adoption among teachers climbed steadily from elementary to high school.
While students and parents express significant concern about the potential downsides of AI, school district leaders are far less worried.
Sixty-one percent of parents, 48 percent of middle school students, and 55 percent of high school students believe that increased use of AI could harm students’ critical-thinking skills, compared with just 22 percent of district leaders. Additionally, half of students said they worry about being falsely accused of using AI to cheat.
Training and policy development have not kept pace with AI use in schools.
By spring 2025, only 35 percent of district leaders said their schools provide students with training on how to use AI. Meanwhile, more than 80 percent of students reported that their teachers had not explicitly taught them how to use AI for schoolwork. Policy guidance also remains limited–just 45 percent of principals said their schools or districts have policies on AI use, and only 34 percent of teachers reported policies specifically addressing academic integrity and AI.
The report offers recommendations around AI use and guidance:
As AI technology continues to evolve, trusted sources–particularly state education agencies–should provide consistent, regularly updated guidance on effective AI policies and training. This guidance should help educators and students understand how to use AI as a complement to learning, not a replacement for it.
District and school leaders should clearly define what constitutes responsible AI use versus academic dishonesty and communicate these expectations to both teachers and students. In the near term, educators and students urgently need clarity on what qualifies as cheating with AI.
Elementary schools should also be included in this effort. Nearly half of elementary teachers are already experimenting with AI, and these early years are when students build foundational skills and habits. Providing age-appropriate, coherent instruction about AI at this stage can reduce misuse and confusion as students progress through school and as AI capabilities expand.
Ultimately, district leaders should develop comprehensive AI policies and training programs that equip teachers and students to use AI productively and ethically across grade levels.
Laura Ascione is the Editorial Director at eSchool Media. She is a graduate of the University of Maryland’s prestigious Philip Merrill College of Journalism.
If you’ve attended a professional show or musical recently, chances are you’ve seen virtual set design in action. This approach to stage production has gained so much traction it’s now a staple in the industry. After gaining momentum in professional theater, it has made its way into collegiate performing arts programs and is now emerging in K-12 productions as well.
Virtual set design offers a modern alternative to traditional physical stage sets, using technology and software to create immersive backdrops and environments. This approach unlocks endless creative possibilities for schools while also providing practical advantages.
Here, I’ll delve into three key benefits: increasing student engagement and participation, improving efficiency and flexibility in productions, and expanding educational opportunities.
Increasing student engagement and participation
Incorporating virtual set design into productions gets students excited about learning new skills while enhancing the storytelling of a show. When I first joined Churchill High School in Livonia, Michigan, as the performing arts manager, the first show we did was Shrek the Musical, and I knew it would require an elaborate set. While students usually work together to paint the various backdrops that bring the show to life, I wanted to introduce them to collaborating on virtual set design.
We set up Epson projectors on the fly rail and used them to project images as the show’s backdrops. Positioned at a short angle, the projectors avoided any shadowing on stage. To create a seamless image with both projectors, we utilized edge-blending and projection mapping techniques using just a Mac® laptop and QLab software. Throughout the performance, the projectors transformed the stage with a dozen dynamic backdrops, shifting from a swamp to a castle to a dungeon.
Students were amazed by the technology and very excited to learn how to integrate it into the set design process. Their enthusiasm created a real buzz around the production, and the community’s feedback on the final results was overwhelmingly positive.
Improving efficiency and flexibility
During Shrek the Musical, there were immediate benefits that made it so much easier to put together a show. To start, we saved money by eliminating the need to build multiple physical sets. While we were cutting costs on lumber and materials, we were also solving design challenges and expanding what was possible on stage.
This approach also saved us valuable time. Preparing the sets in the weeks leading up to the show was faster, and transitions during performances became seamless. Instead of moving bulky scenery between scenes or acts, the stage crew simply switched out projected images, making the process much more efficient.
We saw even more advantages in our spring production of She Kills Monsters. Some battle scenes called for 20 or 30 actors to be on stage at once, which would have been difficult to manage with a traditional set. By using virtual production, we broke the stage up with different panels spaced apart and projected designs, creating more space for performers. We were able to save physical space, as well as create a design that helped with stage blocking and made it easier for students to find their spots.
Since using virtual sets, our productions have become smoother, more efficient, and more creative.
Expanding educational opportunities
Beyond the practical benefits, virtual set design also creates valuable learning opportunities for students. Students involved in productions gain exposure to industry-level technology and learn about careers in the arts, audio, and video technology fields. Introducing students to these opportunities before graduating high school can really help prepare them for future success.
Additionally, in our school’s technical theater courses, students are learning lessons on virtual design and gaining hands-on experience. As they explore potential career paths, they are developing collaboration and other transferable skills that directly connect to college and career readiness.
Looking ahead with virtual set design
Whether students are interested in graphic design, sound engineering, or visual technology, virtual production brings countless opportunities for them to explore. It allows them to experiment with tools and concepts that connect directly to potential college majors or future careers.
For schools, incorporating virtual production into high school theater offers more than just impressive shows. It provides a cost-effective, flexible, and innovative approach to storytelling. It is a powerful tool that benefits productions, enriches student learning, and prepares the next generation of artists and innovators.
Jared Cole, Churchill High School, Livonia Public Schools
When I first started experimenting with AI in my classroom, I saw the same thing repeatedly from students. They treated it like Google. Ask a question, get an answer, move on. It didn’t take long to realize that if my students only engage with AI this way, they miss the bigger opportunity to use AI as a partner in thinking. AI isn’t a magic answer machine. It’s a tool for creativity and problem-solving. The challenge for us as educators is to rethink how we prepare students for the world they’re entering and to use AI with curiosity and fidelity.
Moving from curiosity to fluency
In my district, I wear two hats: history teacher and instructional coach. That combination gives me the space to test ideas in the classroom and support colleagues as they try new tools. What I’ve learned is that AI fluency requires far more than knowing how to log into a platform. Students need to learn how to question outputs, verify information and use results as a springboard for deeper inquiry.
I often remind them, “You never trust your source. You always verify and compare.” If students accept every AI response at face value, they’re not building the critical habits they’ll need in college or in the workforce.
To make this concrete, I teach my students the RISEN framework: Role, Instructions, Steps, Examples, Narrowing. It helps them craft better prompts and think about the kind of response they want. Instead of typing “explain photosynthesis,” they might ask, “Act as a biologist explaining photosynthesis to a tenth grader. Use three steps with an analogy, then provide a short quiz at the end.” Suddenly, the interaction becomes purposeful, structured and reflective of real learning.
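For teachers who want students to see the RISEN pieces explicitly, the components can even be assembled into a prompt step by step. Here is a minimal sketch; the framework comes from the article, but the helper function, its name, and the example values are my own illustration, not part of any tool:

```python
def build_risen_prompt(role, instructions, steps, examples, narrowing):
    """Assemble a prompt from the five RISEN components.

    RISEN = Role, Instructions, Steps, Examples, Narrowing.
    (This helper is illustrative only, not from any library.)
    """
    return "\n".join([
        f"Role: {role}",
        f"Instructions: {instructions}",
        f"Steps: {steps}",
        f"Examples: {examples}",
        f"Narrowing: {narrowing}",
    ])

# Hypothetical values mirroring the photosynthesis example above.
prompt = build_risen_prompt(
    role="Act as a biologist explaining photosynthesis to a tenth grader.",
    instructions="Explain the process clearly, then check for understanding.",
    steps="Use three steps with an analogy.",
    examples="Compare the chloroplast to a solar panel.",
    narrowing="End with a short three-question quiz.",
)
print(prompt)
```

Laying the prompt out this way makes it easy for students to spot which component is missing when an AI response disappoints them.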
AI as a catalyst for equity and personalization
Growing up, I was lucky. My mom was college educated and sat with me to go over almost every paper I wrote. She gave me feedback that helped to sharpen my writing and build my confidence. Many of my students don’t have that luxury. For these learners, AI can be the academic coach they might not otherwise have.
That doesn’t mean AI replaces human connection. Nothing can. But it can provide feedback, ask guiding questions, and provide examples that give students a sounding board and thought partner. It’s one more way to move closer to providing personalized support for learners based on need.
Of course, equity cuts both ways. If only some students have access to AI or if we use it without considering its bias, we risk widening the very gaps we hope to close. That’s why it’s our job as educators to model ethical and critical use, not just the mechanics.
Shifting how we assess learning
One of the biggest shifts I’ve made is rethinking how I assess students. If I only grade the final product, I’m essentially inviting them to use AI as a shortcut. Instead, I focus on the process: How did they engage with the tool? How did they verify and cross-reference results? How did they revise their work based on what they learned? What framework guided their inquiry? In this way, AI becomes part of their learning journey rather than just an endpoint.
I’ve asked students to run the same question through multiple AI platforms and then compare the outputs. What were the differences? Which response feels most accurate or useful? What assumptions might be at play? These conversations push students to defend their thinking and use AI critically, not passively.
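The comparison step itself can be made concrete with nothing more than Python's standard library. As a rough sketch (the two response strings are invented stand-ins for answers students would paste in from different platforms), difflib can quantify how much two outputs overlap before the class discusses why they differ:

```python
import difflib

# Invented stand-ins for answers collected from two different AI platforms.
answer_a = "Photosynthesis converts sunlight, water, and CO2 into glucose and oxygen."
answer_b = "Plants use sunlight, water, and CO2 to make glucose, releasing oxygen."

# Ratio in [0, 1]: 1.0 means identical text; lower means more divergence.
similarity = difflib.SequenceMatcher(None, answer_a, answer_b).ratio()
print(f"Overlap: {similarity:.0%}")

# Show word-level differences -- fodder for class discussion.
for token in difflib.unified_diff(answer_a.split(), answer_b.split(), lineterm=""):
    print(token)
```

A low overlap score is not a verdict on which platform is "right"; it is simply a cue that students need to investigate and defend a position.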
Navigating privacy and policy
Another responsibility we carry as educators is protecting our students. Data privacy is a serious concern. In my school, we use a “walled garden” version of AI so that student data doesn’t get used for training. Even with those safeguards in place, I remind colleagues never to enter identifiable student information into a tool.
Policies will continue to evolve, but for day-to-day activities and planning, teachers need to model caution and responsibility. Students are taking our lead.
Professional growth for a changing profession
The truth of the matter is most of us have not been professionally trained to do this. My teacher preparation program certainly did not include modules on prompt engineering or data ethics. That means professional development in this space is a must.
I’ve grown the most in my AI fluency by working alongside other educators who are experimenting, sharing stories, and comparing notes. AI is moving fast. No one has all the answers. But we can build confidence together by trying, reflecting, and adjusting through shared experience and lessons learned. That’s exactly what we’re doing in the Lead for Learners network. It’s a space where educators from across the country connect, learn and support one another in navigating change.
For educators who feel hesitant, I’d say this: You don’t need to be an expert to start. Pick one tool, test it in one lesson, and talk openly with your students about what you’re learning. They’ll respect your honesty and join you in the process.
Preparing students for what’s next
AI is not going away. Whether we’re ready or not, it’s going to shape how our students live and work. That gives us a responsibility not just to keep pace with technology but to prepare young people for what’s ahead. The latest futures forecast reminds us that imagining possibilities is just as important as responding to immediate shifts.
We need to understand both how AI is already reshaping education delivery and how new waves of change will keep arriving as tools grow more sophisticated and widespread.
I want my students to leave my classroom with the ability to question, create, and collaborate using AI. I want them to see it not as a shortcut but as a tool for thinking more deeply and expressing themselves more fully. And I want them to watch me modeling those same habits: curiosity, caution, creativity, and ethical decision-making. Because if we don’t show them what responsible use looks like, who will?
The future of education won’t be defined by whether we allow AI into our classrooms. It will be defined by how we teach with it, how we teach about it, and how we prepare our students to thrive in a world where it’s everywhere.
Ian McDougall, Yuma Union High School District
Ian McDougall is a history teacher and edtech coach at Yuma Union High School District in Arizona. He also facilitates the Lead for Learners Community, an online hub for learner-centered educators nationwide. With extensive experience in K–12 education and technology integration, Ian supports schools in adopting innovative practices through professional development and instructional coaching. He holds a master’s degree in United States history from Adams State University, further strengthening his expertise as both a teacher and coach.
AI is now at the center of almost every conversation in education technology. It is reshaping how we create content, build assessments, and support learners. The opportunities are enormous. But one quiet risk keeps growing in the background: losing our habit of critical thinking.
I see this risk not as a theory but as something I have felt myself.
The moment I almost outsourced my judgment
A few months ago, I was working on a complex proposal for a client. Pressed for time, I asked an AI tool to draft an analysis of their competitive landscape. The output looked polished and convincing. It was tempting to accept it and move on.
Then I forced myself to pause. I began questioning the sources behind the statements and found a key market shift the model had missed entirely. If I had skipped that short pause, the proposal would have gone out with a blind spot that mattered to the client.
That moment reminded me that AI is fast and useful, but the responsibility for real thinking is still mine. It also showed me how easily convenience can chip away at judgment.
AI as a thinking partner
The most powerful way to use AI is to treat it as a partner that widens the field of ideas while leaving the final call to us. AI can collect data in seconds, sketch multiple paths forward, and expose us to perspectives we might never consider on our own.
In my own work at Magic EdTech, for example, our teams have used AI to quickly analyze thousands of pages of curriculum to flag accessibility issues. The model surfaces patterns and anomalies that would take a human team weeks to find. Yet the real insight comes when we bring educators and designers together to ask why those patterns matter and how they affect real classrooms. AI sets the table, but we still cook the meal.
There is a subtle but critical difference between using AI to replace thinking and using it to stretch thinking. Replacement narrows our skills over time. Stretching builds new mental flexibility. The partner model forces us to ask better questions, weigh trade-offs, and make calls that only human judgment can resolve.
Habits to keep your edge
Protecting critical thinking is not about avoiding AI. It is about building habits that keep our minds active when AI is everywhere.
Here are three I find valuable:
1. Name the fragile assumption
Each time you receive AI output, ask: What is one assumption here that could be wrong? Spend a few minutes digging into that. It forces you to reenter the problem space instead of just editing machine text.
2. Run the reverse test
Before you adopt an AI-generated idea, imagine the opposite. If the model suggests that adaptive learning is the key to engagement, ask: What if it is not? Exploring the counter-argument often reveals gaps and deeper insights.
3. Slow the first draft
It is tempting to let AI draft emails, reports, or code and just sign off. Instead, start with a rough human outline first. Even if it is just bullet points, you anchor the work in your own reasoning and use the model to enrich–not originate–your thinking.
These small practices keep the human at the center of the process and turn AI into a gym for the mind rather than a crutch.
Why this matters for education
For those of us in education technology, the stakes are unusually high. The tools we build help shape how students learn and how teachers teach. If we let critical thinking atrophy inside our companies, we risk passing that weakness to the very people we serve.
Students will increasingly use AI for research, writing, and even tutoring. If the adults designing their digital classrooms accept machine answers without question, we send the message that surface-level synthesis is enough. We would be teaching efficiency at the cost of depth.
By contrast, if we model careful reasoning and thoughtful use of AI, we can help the next generation see these tools for what they are: accelerators of understanding, not replacements for it. AI can help us scale accessibility, personalize instruction, and analyze learning data in ways that were impossible before. But its highest value appears only when it meets human curiosity and judgment.
Building a culture of shared judgment
This is not just an individual challenge. Teams need to build rituals that honor slow thinking in a fast AI environment. One such practice is rotating the role of “critical friend” in meetings. One person’s task is to challenge the group’s AI-assisted conclusions and ask what could go wrong. This simple habit trains everyone to keep their reasoning sharp.
Next time you lean on AI for a key piece of work, pause before you accept the answer. Write down two decisions in that task that only a human can make. It might be about context, ethics, or simple gut judgment. Then share those reflections with your team. Over time this will create a culture where AI supports wisdom rather than diluting it.
The real promise of AI is not that it will think for us, but that it will free us to think at a higher level.
The danger is that we may forget to climb.
The future of education and the integrity of our own work depend on remaining climbers. Let the machines speed the climb, but never let them choose the summit.
K-12 IT leaders are under pressure from all sides–rising cyberattacks, the end of Windows 10 support, and the need for powerful new learning tools.
The good news: Windows 11 on Lenovo devices delivers more than an upgrade–it’s a smarter, safer foundation for digital learning in the age of AI.
Delaying the move means greater risk, higher costs, and missed opportunities. With proven ROI, cutting-edge protection, and tools that empower both teachers and students, the case for Windows 11 is clear.
1. Harness AI-powered educational innovation with Copilot
Windows 11 integrates Microsoft Copilot AI capabilities that transform teaching and learning. Teachers can leverage AI for lesson planning, content creation, and administrative tasks, while students benefit from enhanced collaboration tools and accessibility features.
2. Combat the explosive rise in school cyberattacks
The statistics are alarming: K-12 ransomware attacks increased 92 percent between 2022 and 2023, with human-operated ransomware attacks surging over 200 percent globally, according to the 2024 State of Ransomware in Education.
3. Start migration planning now
Time is critically short. Windows 10 support ended in October 2025, leaving schools running unsupported systems vulnerable to attacks and compliance violations. Starting migration planning immediately ensures adequate time for device inventory, compatibility testing, and smooth district-wide deployment.
In the growing conversation around AI in education, speed and efficiency often take center stage, but that focus can tempt busy educators to use what’s fast rather than what’s best. To truly serve teachers–and above all, students–AI must be built with intention and clear constraints that prioritize instructional quality, ensuring efficiency never comes at the expense of what learners need most.
AI doesn’t inherently understand fairness, instructional nuance, or educational standards. It mirrors its training and guidance, usually as a capable generalist rather than a specialist. Without deliberate design, AI can produce content that’s misaligned or confusing. In education, fairness means an assessment measures only the intended skill and does so comparably for students from different backgrounds, languages, and abilities–without hidden barriers unrelated to what’s being assessed. Effective AI systems in schools need embedded controls to avoid construct‑irrelevant content: elements that distract from what’s actually being measured.
For example, a math question shouldn’t hinge on dense prose, niche sports knowledge, or culturally specific idioms unless those are part of the goal; visuals shouldn’t rely on low-contrast colors that are hard to see; audio shouldn’t assume a single accent; and timing shouldn’t penalize students if speed isn’t the construct.
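One of these checks can even be automated. The low-contrast-color concern, for instance, has a published formula behind it: the WCAG 2.x contrast ratio. The sketch below implements that standard formula; the function names and example colors are my own, and a real content pipeline would wire this into its review tooling rather than hard-code colors:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance for an (R, G, B) tuple of 0-255 ints."""
    def channel(c):
        c = c / 255
        # Linearize the sRGB channel per the WCAG definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), from 1 to 21."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on white passes easily (ratio 21.0); pale gray on white
# falls below the 4.5:1 threshold WCAG AA sets for normal text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))      # 21.0
print(round(contrast_ratio((200, 200, 200), (255, 255, 255)), 1))
```

An automated pass like this catches only the mechanical part of fairness; the judgment calls about idioms, context, and construct relevance still need human reviewers.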
To improve fairness and accuracy in assessments:
Avoid construct-irrelevant content: Ensure test questions focus only on the skills and knowledge being assessed.
Use AI tools with built-in fairness controls: Generic AI models may not inherently understand fairness; choose tools designed specifically for educational contexts.
Train AI on expert-authored content: AI is only as fair and accurate as the data and expertise it’s trained on. Use models built with input from experienced educators and psychometricians.
These subtleties matter. General-purpose AI tools, left untuned, often miss them.
The risk of relying on convenience
Educators face immense time pressures. It’s tempting to use AI to quickly generate assessments or learning materials. But speed can obscure deeper issues. A question might look fine on the surface but fail to meet cognitive complexity standards or align with curriculum goals. These aren’t always easy problems to spot, but they can impact student learning.
To choose the right AI tools:
Select domain-specific AI over general models: Tools tailored for education are more likely to produce pedagogically sound and standards-aligned content that empowers students to succeed. In a 2024 University of Pennsylvania study, students using a customized AI tutor scored 127 percent higher on practice problems than those without.
Be cautious with out-of-the-box AI: Without expertise, educators may struggle to critique or validate AI-generated content, risking poor-quality assessments.
Understand the limitations of general AI: While capable of generating content, general models may lack depth in educational theory and assessment design.
General AI tools can get you 60 percent of the way there. But that last 40 percent is the part that ensures quality, fairness, and educational value. This requires expertise to get right. That’s where structured, guided AI becomes essential.
Building AI that thinks like an educator
Developing AI for education requires close collaboration with psychometricians and subject matter experts to shape how the system behaves. This helps ensure it produces content that’s not just technically correct, but pedagogically sound.
To ensure quality in AI-generated content:
Involve experts in the development process: Psychometricians and educators should review AI outputs to ensure alignment with learning goals and standards.
Use manual review cycles: Unlike benchmark-driven models, educational AI requires human evaluation to validate quality and relevance.
Focus on cognitive complexity: Design assessments with varied difficulty levels and ensure they measure intended constructs.
This process is iterative and manual. It’s grounded in real-world educational standards, not just benchmark scores.
Personalization needs structure
AI’s ability to personalize learning is promising. But without structure, personalization can lead students off track. AI might guide learners toward content that’s irrelevant or misaligned with their goals. That’s why personalization must be paired with oversight and intentional design.
To harness personalization responsibly:
Let experts set goals and guardrails: Define standards, scope and sequence, and success criteria; AI adapts within those boundaries.
Use AI for diagnostics and drafting, not decisions: Have it flag gaps, suggest resources, and generate practice, while educators curate and approve.
Preserve curricular coherence: Keep prerequisites, spacing, and transfer in view so learners don’t drift into content that’s engaging but misaligned.
Support educator literacy in AI: Professional development is key to helping teachers use AI effectively and responsibly.
It’s not enough to adapt–the adaptation must be meaningful and educationally coherent.
AI can accelerate content creation and internal workflows. But speed alone isn’t a virtue. Without scrutiny, fast outputs can compromise quality.
To maintain efficiency and innovation:
Use AI to streamline internal processes: Beyond student-facing tools, AI can help educators and institutions build resources faster and more efficiently.
Maintain high standards despite automation: Even as AI accelerates content creation, human oversight is essential to uphold educational quality.
Responsible use of AI requires processes that ensure every AI-generated item is part of a system designed to uphold educational integrity.
An effective approach to AI in education is driven by concern–not fear, but responsibility. Educators are doing their best under challenging conditions, and the goal should be building AI tools that support their work.
When frameworks and safeguards are built-in, what reaches students is more likely to be accurate, fair, and aligned with learning goals.
In education, trust is foundational. And trust in AI starts with thoughtful design, expert oversight, and a deep respect for the work educators do every day.
Nick Koprowicz, Prometric
Nick Koprowicz is an applied AI scientist at Prometric, a global leader in credentialing and skills development.
In my classroom, students increasingly ask for relevant content. Students want to know how what they are learning in school relates to the world beyond the classroom. They want to be engaged in their learning.
In fact, the 2025-2026 Education Insights Report underscores that students need and want engaging learning experiences. And it’s not just students who see engagement as important. Engagement is broadly recognized as a key driver of learning and success, with 93 percent of educators agreeing that student engagement is a critical metric for understanding overall achievement. What is more, 99 percent of superintendents believe student engagement is one of the top predictors of success at school.
Creating highly engaging lesson plans that will immerse today’s tech-savvy students in learning can be a challenge, but here are two easy-to-find resources that I turn to when I want to turbo-charge the engagement quotient of my lessons:
Virtual field trips
Virtual field trips empower educators to introduce students to amazing places, new people and ideas, and remarkable experiences–without ever leaving the classroom. There are so many virtual field trips out there, but I always love the ones that Discovery Education creates with partners.
I also love the virtual tours of the Smithsonian National Museum of Natural History. Together as a class or individually, students can dive into self-guided, room-by-room tours of several exhibits and areas within the museum from a desktop or smart device. This virtual field trip even includes special collections and research areas, like ancient Egypt or the deep ocean, making it fun and easy for teachers like me to pick and choose which tour is most relevant to a lesson.
Immersive learning resources
Immersive learning content offers another way to take students to new places and connect the wider world, and universe, to the classroom. Immersive learning can be easily woven into the curriculum to enhance lessons and provide context.
One immersive learning solution I really like is TimePod Adventures from Verizon. It features free time-traveling episodes designed to engage students in places like Mars and prehistoric Earth. Now accessible directly through a web browser on a laptop, Chromebook, or mobile device, students need only internet access and audio output to begin the journey. Guided by an AI-powered assistant and featuring grade-band specific lesson plans, these missions across time and space encourage students to take control, explore incredible environments, and solve complex challenges.
Immersive learning content can be overwhelming at first, but professional development resources are available to help educators build confidence while earning microcredentials. These resources let educators quickly dive into new and innovative techniques and teaching strategies that help increase student engagement.
Taken together, engaging learning opportunities are ones that show students how classroom learning directly connects to their real lives. With resources like virtual field trips and immersive learning content, students can dive into school topics in ways that are fun, fresh, and sometimes otherworldly.
Leia J. DePalo, Northport-East Northport Union Free School District
Leia J. (LJ) DePalo is an Elementary STEM and Future Forward Teacher (FFT) in the Northport-East Northport School District with over 20 years of experience in education. LJ holds a Master of Science in Literacy and permanent New York State teaching certifications in Elementary Education, Speech, and Computer Science. A dedicated innovator, she collaborates with teachers to design technology-infused lessons, leads professional development, and choreographs award-winning school musicals. In recognition of her creativity and impact, DePalo was named a 2025 Innovator Grant recipient.
Today’s school IT teams juggle endless demands–secure systems, manageable devices, and tight budgets–all while supporting teachers who need tech that just works.
That’s where interactive displays come in. Modern, OS-agnostic solutions like Promethean’s ActivPanel 10 Premium simplify IT management, integrate seamlessly with existing systems, and cut down on maintenance headaches. For schools, that means fewer compatibility issues, stronger security, and happier teachers.
But these tools do more than make IT’s job easier–they transform teaching and learning. Touch-enabled collaboration, instant feedback, and multimedia integration turn passive lessons into dynamic, inclusive experiences that keep students engaged and help teachers do their best work.
Built to last, interactive displays also support long-term sustainability goals and digital fluency–skills that carry from classroom to career.
Download the full report and see how interactive solutions can help your district simplify IT, elevate instruction, and create future-ready classrooms.
By now, the 2025-2026 school year is well underway. The glow of new beginnings has faded, and the process of learning has begun in earnest. No doubt there is plenty to do, but I recommend that educators take a moment and check in on their teaching toolkit.
The tools of our trade are always evolving, and if our students are going to get the most out of their time in class, it’s important for us to familiarize ourselves with the newest resources for sparking curiosity, creativity, and critical thinking. This includes the latest AI programs that are making their way into the classroom.
Here are five AI tech tools that I believe are essential for back-to-school success:
ChatGPT: ChatGPT has quickly become the all-in-one tool for generating anything and everything. Many educators are (rightly) concerned about ChatGPT’s potential for student cheating, but this AI can also serve as a built-in assistant for creating welcome letters, student-friendly syllabi, and other common documents for the classroom. If it’s used responsibly, ChatGPT can assist teachers by cutting out the busy work involved in planning and implementing lessons.
ClassroomScreen: ClassroomScreen functions as a modern-day chalkboard. This useful tool lets teachers project a variety of information on screen while simultaneously performing classroom tasks. Teachers can take straw polls, share inspiring quotes, detail the morning schedule, and even monitor volume without opening a single tab. It’s a simple, multipurpose tool for classroom coordination.
SchoolAI: SchoolAI is a resource generator that provides safe, teacher-guided interactions between students and AI. With AI becoming increasingly common, it’s vital that students are taught how to use it safely, effectively, and responsibly. SchoolAI can help with this task by cultivating student curiosity and critical thinking without doing the work for them. Best of all, teachers remain at the helm the entire time, ensuring an additional layer of instruction and protection.
Snorkl: Snorkl is a feedback tool, providing students with instant feedback on their responses. This AI program allows students to record their thinking process on a digital whiteboard using a variety of customizable tools. With Snorkl, a teacher could send students a question with an attached image, then have them respond using audio, visual tools such as highlighting, and much more. It’s the perfect way to inject a little creativity into a lesson while making it memorable, meaningful, and fun!
Suno: Suno is unique in that it specializes in creative song generation. Looking for an engaging way to teach fractions? Upload your lesson to Suno and it can generate a catchy, educational song in the style of your favorite artist. Suno even allows users to customize lyrics so that the songs stay relevant to the lesson at hand. If you need a resource that can get students excited about learning, then Suno will be the perfect addition to your teaching toolkit!
The world of education is always changing, and today’s technology may be outdated within a matter of years. Still, the mission of educators remains the same: to equip students with the skills, determination, and growth mindset they need to thrive in an uncertain future. By integrating effective tools into the classroom, we can guide them toward a brighter tomorrow–one where inquiry and critical thinking continue to flourish, both within the classroom and beyond.
Jamie MacPherson, Van Andel Institute for Education
Jamie MacPherson is a Learning Specialist at Van Andel Institute for Education, a Michigan-based education nonprofit dedicated to creating classrooms where curiosity, creativity, and critical thinking thrive.
I’ll admit that I use AI. I’ve asked it to help me figure out challenging Excel formulas that otherwise would have taken me 45 minutes and a few tutorials to troubleshoot. I’ve used it to help me analyze or organize massive amounts of information. I’ve even asked it to help me devise a running training program aligning with my goals and fitting within my schedule. AI is a fantastic tool–and that’s the point. It’s a tool, not a replacement for thinking.
As AI tools become more capable, more intuitive, and more integrated into our daily lives, I’ve found myself wondering: Are we growing too dependent on AI to do our thinking for us?
This question isn’t just philosophical. It has real consequences, especially for students and young learners. A recent study published in the journal Societies reports that people who used AI tools consistently showed a decline in critical thinking performance. In fact, “whether someone used AI tools was a bigger predictor of a person’s thinking skills than any other factor, including educational attainment.” That’s a staggering finding because it suggests that using AI might not just be a shortcut. It could be a cognitive detour.
The atrophy of the mind
The term “digital dementia” has been used to describe the deterioration of cognitive abilities as a result of over-reliance on digital devices. It’s a phrase originally associated with excessive screen time and memory decline, but it’s found new relevance in the era of generative AI. When we depend on a machine to generate our thoughts, answer our questions, or write our essays, what happens to the neural pathways that govern our own critical thinking? And will the upcoming era of agentic AI expedite this decline?
Cognitive function, like physical fitness, follows the rule of “use it or lose it.” Just as muscles weaken without regular use, the brain’s ability to evaluate, synthesize, and critique information can atrophy when not exercised. This is especially concerning in the context of education, where young learners are still building those critical neural pathways.
In short: Students need to learn how to think before they delegate that thinking to a machine.
Can you still think critically with AI?
Yes, but only if you’re intentional about it.
AI doesn’t relieve you of the responsibility to think–in many cases, it demands even more critical thinking. AI hallucinates, fabricates claims, and can be misleading. If you blindly accept AI’s output, you’re not saving time; you’re surrendering clarity.
Using AI effectively requires discernment. You need to know what you’re asking, evaluate what you’re given, and verify the accuracy of the result. In other words, you need to think before, during, and after using AI.
The “source, please” problem
One of the simplest ways to teach critical thinking is also the most annoying–just ask my teenage daughter. When she presents a fact or claim that she saw online, I respond with some version of: “What’s your source?” It drives her crazy, but it forces her to dig deeper, check assumptions, and distinguish between fact and fiction. It’s an essential habit of mind.
But here’s the thing: AI doesn’t always give you the source. And when it does, sometimes it’s wrong, or the source isn’t reputable. Sometimes it requires a deeper dive (and a few more prompts) to find answers, especially to complicated topics. AI often provides quick, confident answers that fall apart under scrutiny.
So why do we keep relying on it? Why are AI responses allowed to settle arguments, or serve as “truth” for students when the answers may be anything but?
The lure of speed and simplicity
It’s easier. It’s faster. And let’s face it: It feels like thinking. But there’s a difference between getting an answer and understanding it. AI gives us answers. It doesn’t teach us how to ask better questions or how to judge when an answer is incomplete or misleading.
This process of cognitive offloading (where we shift mental effort to a device) can be incredibly efficient. But if we offload too much, too early, we risk weakening the mental muscles needed for sustained critical thinking.
Implications for educators
So, what does this mean for the classroom?
First, educators must be discerning about how they use AI tools. These technologies aren’t going away, and banning them outright is neither realistic nor wise. But they must be introduced with guardrails. Students need explicit instruction on how to think alongside AI, not instead of it.
Second, teachers should emphasize the importance of original thought, iterative questioning, and evidence-based reasoning. Instead of asking students to simply generate answers, ask them to critique AI-generated ones. Challenge them to fact-check, source, revise, and reflect. In doing so, we keep their cognitive skills active and growing.
And finally, for young learners, we may need to draw a harder line. Students who haven’t yet formed the foundational skills of analysis, synthesis, and evaluation shouldn’t be skipping those steps. Just like you wouldn’t hand a calculator to a child who hasn’t yet learned to add, we shouldn’t hand over generative AI tools to students who haven’t learned how to write, question, or reason.
A tool, not a crutch
AI is here to stay. It’s powerful, transformative, and, when used well, can enhance our work and learning. But we must remember that it’s a tool, not a replacement for human thought. The moment we let it think for us is the moment we start to lose the capacity to think for ourselves.
If we want the next generation to be capable, curious, and critically minded, we must protect and nurture those skills. And that means using AI thoughtfully, sparingly, and always with a healthy dose of skepticism. AI is certainly proving it has staying power, so it’s in all our best interests to learn to adapt. However, let’s adapt with intentionality, without sacrificing our critical thinking skills or succumbing to any form of digital dementia.
Laura Hakala, Magic EdTech
Laura Hakala is the Director of Online Program Design and Efficacy for Magic EdTech. With nearly two decades of leadership and strategic innovation experience, Laura is a go-to resource for content, problem-solving, and strategic planning. Laura is passionate about DE&I and is a fierce advocate, dedicated to making meaningful changes. When it comes to content management, digital solutions, and forging strategic partnerships, Laura’s expertise shines through. She’s not just shaping the future; she’s paving the way for a more inclusive and impactful tomorrow.
When I stepped into the role of curriculum coordinator for Peoria Public Schools District 150 in 2021, I entered a landscape still reeling from the disruption of COVID-19. Teachers were exhausted. Students had suffered interrupted learning. And the instructional frameworks in place–particularly in literacy–were due for serious reexamination.
Initially, the directive was to return to our previous Balanced Literacy framework. But as I dove into research, attended conferences, and listened to thought leaders in the field, it became clear: The science was pointing in a different direction. The evidence base for Structured Literacy was too compelling to ignore.
What followed wasn’t an overnight change. It was a careful, multi-year shift in philosophy, practice, and support. We didn’t have the budget for a full curriculum adoption, so we focused on building a practical, research-aligned framework using targeted resources and strategic professional learning.
A patchwork quilt with purpose
In Peoria, where many students were performing one or two grade levels below benchmarks, we needed a literacy framework that could both repair learning gaps and accelerate grade-level achievement. That meant honoring the complexity of literacy instruction by balancing foundational skills, writing, vocabulary, and fluency.
Our current model includes explicit handwriting instruction, structured phonics and phonemic awareness, and targeted word study, paired with guided small-group instruction informed by student data. We built in an hour each day for foundational work, and another for what we call “guided individual practice,” where students receive support aligned to their needs–not just grade-level expectations.
We were also honest about staffing realities. We no longer had interventionists or instructional coaches in every building. The burden of differentiation had shifted to classroom teachers, many of whom were navigating outdated practices. Transitioning from “guided reading” to true data-informed small groups required more than new tools. It required a new mindset.
Supporting educators without overwhelming them
Change management in literacy instruction is, at its core, about supporting teachers. We’ve been intentional in how we provide professional development. Our work with the Lexia LETRS professional learning course has been especially transformative. Recognizing the intensity of the full cohort model, we supplemented it with a more flexible, self-guided version that teachers could complete during PLC time. Today, every 1st and 2nd grade teacher in Peoria has completed Volume 1 of the professional learning course, and our next cohort is set to begin with kindergarten and third-grade educators.
That blended approach–respecting teachers’ time while still delivering deep learning–is helping us move forward together. Our educators understand the “why” behind the change and are beginning to feel empowered by the “how.”
Technology as a partner, not a solution
Technology plays a meaningful role in our framework, but never in isolation. We initially implemented a digital literacy program for students in grades 5-8 who were below benchmark, but the rollout revealed key challenges. Students were resistant. Teachers lacked the training to connect software data to instruction. And the result felt more punitive than supportive.
Rather than abandon technology, we shifted our model. We now provide Lexia Core5 Reading to every student in grades 2-4, creating a consistent, equitable implementation that supports differentiated instruction while relieving teachers of the burden of sourcing materials themselves. The program is easy to use, offers actionable reports, and provides a strong starting point for targeted instruction.
Still, we’ve been clear: Software alone won’t move the needle. Teachers must be part of the equation. We continue to train educators on blended learning practices, helping them use technology as a springboard, not a substitute, for effective instruction.
From compliance to commitment
One of our next major shifts is moving from compliance to intentional practice. In a large district with approximately 13,000 students across 29 buildings, it’s easy to focus on usage metrics. Are students meeting their minutes? Are teachers checking boxes?
But the true measure is learning. Are students making progress? Are teachers using the data to inform instruction?
We’re investing in professional development that reinforces this mindset and are exploring how to bring more coaching and modeling into classrooms to help operationalize what teachers are learning.
Advice for fellow district leaders
If there’s one takeaway from our journey, it’s this: Don’t rush. Take the time to align every piece of your literacy framework with evidence-based practices. That includes everything from phonics and handwriting to the way letters are introduced and small groups are formed.
Lean on the research, but also listen to your teachers. Usability and educator buy-in matter just as much as alignment. And remember, literacy is a long game. State assessments, early screeners, and benchmark data are just pieces of the puzzle. The real impact takes time.
What keeps me going is the feedback from our teachers. They’re seeing students blend and segment words with confidence. They’re noticing fewer behavioral issues during literacy blocks. They’re asking deeper questions about how to support readers. That’s the kind of progress that truly matters.
We’re not finished. But we’re headed in the right direction–and we’re doing it together.
Lindsay Bohm, Peoria (Ill.) Public Schools District 150
Lindsay Bohm serves as the Curriculum Coordinator for Peoria (Ill.) Public Schools District 150.
For students with special needs, learning can often resemble a trek through dense woods along a narrow, rigid path–one that leaves little to no room for individual exploration. But the educational landscape is evolving. Picture classrooms as adventurous hunts, where every learner charts their own journey, overcomes unique challenges, and progresses at a pace that matches their strengths. This vision is becoming reality through gamification, a powerful force that is reshaping how students learn and how teachers teach in K–12 special education.
Personalized learning paths: Tailoring the adventure
Traditional classrooms often require students to adapt to a single method of instruction, which can be limiting–especially for neurodiverse learners. Gamified learning platforms provide an alternative by offering adaptive, personalized learning experiences that honor each student’s profile and pace.
Many of these platforms use real-time data and algorithms to adjust content based on performance. A student with reading difficulties might receive simplified text with audio support, while a math-savvy learner can engage in increasingly complex logic puzzles. This flexibility allows students to move forward without fear of being left behind, or without being bored waiting for others to catch up.
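To make the idea concrete, here is a minimal sketch of how such an adaptive loop might work. The function names, thresholds, and support mappings are invented for illustration and do not come from any particular platform:

```python
# Illustrative sketch of adaptive difficulty: raise, lower, or hold the
# level of the next item based on a student's recent accuracy.
# All names and thresholds here are hypothetical.

def next_difficulty(current_level: int, recent_scores: list[float]) -> int:
    """Adjust difficulty from recent accuracy scores (each 0.0-1.0)."""
    if not recent_scores:
        return current_level
    accuracy = sum(recent_scores) / len(recent_scores)
    if accuracy >= 0.85:           # consistently correct: advance
        return current_level + 1
    if accuracy < 0.50:            # struggling: step back, add support
        return max(1, current_level - 1)
    return current_level           # productive-struggle zone: hold steady

def supports_for(profile: set[str]) -> list[str]:
    """Map a student's accessibility needs to content supports."""
    mapping = {
        "reading_difficulty": "text_to_speech",
        "sensory_sensitivity": "soft_palette",
        "adhd": "chunked_tasks",
    }
    return [support for need, support in mapping.items() if need in profile]

print(next_difficulty(3, [1.0, 0.9, 0.8]))       # -> 4
print(supports_for({"reading_difficulty"}))      # -> ['text_to_speech']
```

Real platforms layer far more sophisticated models on top of this, but the core loop–measure performance, adjust content, attach supports–is the same.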
Accessibility features such as customizable avatars, voice commands, and adjustable visual settings also create space for students with ADHD, autism, or sensory sensitivities to learn comfortably. A student sensitive to bright colors can use a softer palette; another who struggles with reading can use text-to-speech features. And when students can replay challenges without stigma, repetition becomes practice, not punishment.
In these environments, progress is measured individually. The ability to choose which goals to tackle and how to approach them gives learners both agency and confidence–two things often missing in traditional special education settings.
Building social and emotional skills: The power of play
Play is a break from traditional learning and a powerful way to build essential social and emotional skills. For students with special needs who may face challenges with communication, emotional regulation, or peer interaction, gamified environments provide a structured yet flexible space to develop these abilities.
In cooperative hunts and team challenges, students practice empathy, communication, and collaboration in ways that feel engaging and low-stakes. A group mission might involve solving a puzzle together, requiring students to share ideas, encourage one another, and work toward a common goal.
Gamified platforms also provide real-time, constructive feedback, transforming setbacks into teachable moments. Instead of pointing out what a student did wrong, a game might offer a helpful hint: “Try checking the clues again!” This kind of support teaches resilience and persistence in a way that lectures or punitive grading rarely do.
As students earn badges or level up, they experience tangible success. These moments highlight the connection between effort and achievement. Over time, these small wins foster a greater willingness to engage with the material, with peers, and with the classroom community.
Fostering independence and motivation
Students with learning differences often carry the weight of repeated academic failure, which can chip away at their motivation. Gamification helps reverse this by reframing challenges as opportunities and effort as progress.
Badges, points, and levels make achievements visible and meaningful. A student might earn a “Problem Solver” badge after tackling a tricky math puzzle or receive “Teamwork Tokens” for helping a classmate. These systems expand the definition of success and highlight personal strengths.
The focus shifts from comparison to self-improvement. Some platforms even allow for private progress tracking, letting students set and meet personal goals without the anxiety of public rankings. Instead of competing, students build a personal narrative of growth.
Gamification also encourages self-directed learning. As students complete tasks, they develop planning, time management, and self-assessment–skills that extend beyond academics and into real life. The result is a deeper sense of ownership and independence.
Teachers as learning guides
Gamification doesn’t replace teachers, but it can help them teach more effectively. With access to real-time analytics, educators can see exactly where a student is excelling or struggling and adjust instruction accordingly.
Dashboards might reveal that a group of students is thriving in reading comprehension but needs help with number sense, prompting immediate, targeted intervention. This data-driven insight allows for proactive, personalized support.
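In simplified form, a dashboard rule like the one described might look something like this sketch (the skill names and the 70 percent threshold are assumptions, not any real product’s settings):

```python
# Hypothetical sketch of dashboard logic: average per-skill scores for a
# group and flag skills that fall below an intervention threshold.

from statistics import mean

scores = {  # per-skill scores for a small group (0-100)
    "reading_comprehension": [88, 92, 85, 90],
    "number_sense": [55, 62, 58, 60],
}

def flag_for_intervention(skill_scores: dict[str, list[int]],
                          threshold: float = 70.0) -> list[str]:
    """Return skills whose group average falls below the threshold."""
    return [skill for skill, vals in skill_scores.items()
            if mean(vals) < threshold]

print(flag_for_intervention(scores))  # -> ['number_sense']
```

The point is not the arithmetic but the workflow: the flag arrives while there is still time to act, rather than at the end of a grading period.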
Teachers in gamified classrooms also take on a new role as both mentor and facilitator. They curate learning experiences, encourage exploration, and create opportunities for creativity and curiosity to thrive. Instead of managing behavior or delivering lectures, they support students on individualized learning journeys.
Inclusion reimagined
Gamification is not a gimmick; it’s a framework for true inclusion. It aligns with the principles of Universal Design for Learning (UDL), offering multiple ways for students to engage, process information, and show what they know. It recognizes that every learner is different, and builds that into the design.
Of course, not every gamified tool is created equal. Thoughtful implementation, equity in access, and alignment with student goals are essential. But when used intentionally, gamification can turn classrooms into places where students with diverse needs feel seen, supported, and excited to learn.
Are we ready to level up?
Gamification is a step toward classrooms that work for everyone. For students with special needs, it means learning at their own pace, discovering their strengths, and building confidence through meaningful challenges.
For teachers, it’s a shift from directing traffic to guiding adventurers.
If we want education to be truly inclusive, we must go beyond accommodations and build systems where diversity is accepted and celebrated. And maybe, just maybe, that journey begins with a game.
Aditya Prakash, SKIDOS
Aditya Prakash is the founder and CEO of SKIDOS, a Copenhagen-based edtech leader transforming mobile gaming into learning, drawing on two decades of innovation, investment, and mentorship in technology-driven education.
Researchers at the University of Kansas have produced a set of guidelines to help educators from preschool through higher education responsibly implement artificial intelligence in a way that empowers teachers, parents, students and communities alike.
Earlier this year, President Donald Trump issued an executive order instructing schools to incorporate AI into their operations. The framework is intended to help all schools and educational facilities do so in a manner that fits their unique communities and missions.
“We see this framework as a foundation,” said James Basham, director of CIDDL and professor of special education at KU. “As schools consider forming an AI task force, for example, they’ll likely have questions on how to do that, or how to conduct an audit and risk analysis. The framework can help guide them through that, and we’ll continue to build on this.”
The framework features four primary recommendations.
Establish a stable, human-centered foundation.
Implement future-focused strategic planning for AI integration.
Ensure AI educational opportunities for every student.
Conduct ongoing evaluation, professional learning and community development.
First, the framework urges schools to keep humans at the forefront of AI plans, prioritizing educator judgment, student relationships and family input on AI-enabled processes and not relying on automation for decisions that affect people. Transparency is also key: schools should communicate how AI tools work and how decisions are made, and ensure compliance with student protection laws such as the Individuals with Disabilities Education Act and the Family Educational Rights and Privacy Act, the report authors write.
The document also outlines recommendations for how educational facilities can implement the technology. Key among them is establishing an AI integration task force that includes educators, administrators, families, legal advisers, and specialists in instructional technology and special education. The document also shares tips on how to conduct an audit and risk analysis before adoption, how to weigh tools’ effects on student placement and identification, and how to watch for possible algorithmic error patterns. Because the technologies are trained on human data, they run the risk of repeating the same mistakes and biases humans have made, Basham said.
That idea is also reflected in the framework’s third recommendation. The document encourages educators to commit to learner-centered AI implementation that considers all students, from those in gifted programs to students with cognitive disabilities. AI tools should be prohibited from making final decisions on IEP eligibility, disciplinary actions, and student progress, and mechanisms should be put in place for students, teachers, and parents to give feedback on their AI educational experiences, the authors wrote.
Finally, the framework urges ongoing evaluation, professional learning and community development. As the technology evolves, schools should regularly re-evaluate it for unintended consequences and gather feedback from those who use it. Training, both at implementation and in ongoing installments, will be necessary to address overuse or misuse, clarify who is responsible for monitoring AI use, and ensure both the school and community stay informed about the technology.
The framework was written by Basham; Trey Vasquez, co-principal investigator at CIDDL, operating officer at KU’s Achievement & Assessment Institute and professor of special education at KU; and Angelica Fulchini Scruggs, research associate and operations director for CIDDL.
“The priority at CIDDL is to share transparent resources for educators on topics that are trending and in a way that is easy to digest,” Fulchini Scruggs said. “We want people to join the community and help them know where to start. We also know this will evolve and change, and we want to help educators stay up to date with those changes to use AI responsibly in their schools.”
Mike Krings, the University of Kansas
Mike Krings is a Public Affairs Officer with the KU News Service at the University of Kansas.
Every educator hopes to instill a lifelong love of learning within their students. We strive to make each lesson engaging, while igniting a sense of curiosity, wonder, and discovery in every child.
Unfortunately, we don’t always succeed, and recent reports suggest that today’s students are struggling to connect with the material they’re taught in school–particularly when it comes to STEM. While there are many potential culprits behind these numbers (shortened attention spans, the presence of phones, dependency on AI, etc.), educators should still take a moment to reflect and strategize when preparing a new lesson for their class. If we truly want to foster a growth mindset within our students, we need to provide lessons that invite them to embrace the learning process itself.
One way to accomplish this is through gamification. Gamification brings the motivational elements of games into your everyday lessons. It increases student engagement, builds perseverance, and promotes a growth mindset. When used strategically, it helps learners take ownership of their progress and encourages creativity and collaboration without sacrificing academic rigor.
Here are four ways that educators can transform their classroom through playful gamification:
Introduce points and badges: Modern video games like Pokémon and Minecraft frequently use achievements to guide new players through the gaming process. Teachers can do the same by assigning points that students can earn for different activities throughout the week. These experience points can also double as currency that students can exchange for small rewards, such as extra free time or an end-of-year pizza party.
Create choice boards: Choice boards provide students with a range of task options, each with a point value or challenge level. You can assign themes or badges for completing tasks in a certain sequence (e.g., “complete a column” or “complete one of each difficulty level”). This allows students to take ownership of their learning path and pace, while still hitting key learning targets.
Host a digital breakout: Virtual escape rooms and digital breakouts are great for fostering engagement and getting students to think outside the box. By challenging students to solve content-based puzzles to unlock “locks” or progress through scenarios, they’re encouraged to think creatively while also collaborating with their peers. They’re the ideal activity for reviewing classwork and reinforcing key concepts across subjects.
Boss battle assessments: This gamified review activity has students “battle” a fictional character by answering questions or completing tasks. Each correct response helps them defeat the boss, which can be tracked with points, health bars, or progress meters. This engaging format turns practice into a collaborative challenge, building excitement and reinforcing content mastery.
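For teachers comfortable with a little scripting, the points-and-badges idea above can be prototyped in a few lines. This sketch is purely illustrative; the activities, point values, and badge thresholds are invented:

```python
# A minimal, hypothetical points-and-badges ledger for a classroom.
# Point values and badge thresholds are made up for illustration.

from collections import defaultdict

POINTS = {"exit_ticket": 5, "help_a_peer": 10, "boss_battle_win": 20}
BADGES = [("Problem Solver", 50), ("Team Player", 100)]  # (name, points needed)

class ClassroomLedger:
    def __init__(self) -> None:
        self.totals: dict[str, int] = defaultdict(int)

    def award(self, student: str, activity: str) -> int:
        """Add the activity's points to a student's total."""
        self.totals[student] += POINTS[activity]
        return self.totals[student]

    def badges(self, student: str) -> list[str]:
        """List every badge the student's total has unlocked."""
        pts = self.totals[student]
        return [name for name, needed in BADGES if pts >= needed]

ledger = ClassroomLedger()
for activity in ["exit_ticket", "help_a_peer", "boss_battle_win",
                 "boss_battle_win", "exit_ticket"]:
    ledger.award("Avery", activity)

print(ledger.totals["Avery"], ledger.badges("Avery"))  # -> 60 ['Problem Solver']
```

A spreadsheet works just as well; what matters is that progress is visible, cumulative, and tied to effort rather than comparison.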
When implemented correctly, gamification can be incredibly fun and rewarding for our students. With the fall semester drawing closer, there has never been a better time to prepare lessons that will spark student curiosity, creativity, and critical thinking.
We can show our students that STEM learning is not a chore, but a gateway to discovery and excitement. So, get your pencils ready, and let the games begin.
Cory Kavanagh, Van Andel Institute for Education
Cory Kavanagh is a Learning Specialist at Van Andel Institute for Education, a Michigan-based education nonprofit dedicated to creating classrooms where curiosity, creativity, and critical thinking thrive.
In the quickly evolving landscape of AI, education stands at the forefront. New AI tools for educators and students are emerging daily; from AI tutors to curriculum creators, the AI education market is surging.
However, the long-term impact of AI use on students is unknown. As educational AI research tries to keep up with AI development, questions remain surrounding the impact of AI use on student motivation and overall learning. These questions are particularly significant for students of color, who consistently encounter more systemic barriers than their white peers (Frausto et al., 2024).
Emerging in the wake of the COVID-19 pandemic and related declines in student learning and motivation, AI refers to a broad range of technologies, including tools such as ChatGPT, that use vast data repositories to make decisions and problem-solve. Because these tools can assist with assignments like generating essays from prompts, students quickly integrated them into the classroom. Although educators and administrators were slower to adopt these technologies, they have started using AI both to manage unregulated student usage and to streamline their work with AI-powered grading tools. While the use of AI in education remains controversial, it is clearly here to stay and, if anything, rapidly evolving. The question remains: Can AI enhance students’ motivation and learning?
A recent rapid review of research concluded that students’ motivation is impacted by their experiences in and out of the classroom. The review highlights how student motivation is shaped by more than just individual attitudes, behaviors, beliefs, and traits, but it does not comprehensively address the effects of AI on student motivation (Frausto et al., 2024).
To understand how AI may impact the motivation and learning of students of color, we need to examine the nature of AI itself. AI learns and develops based on preexisting datasets, which often reflect societal biases and racism. This reliance on biased data can lead to skewed and potentially harmful outputs. For example, AI-generated images are prone to perpetuating stereotypes and cliches, such as exclusively generating images of leaders as white men in suits. Similarly, if we were to use AI to generate a leadership curriculum, it would be prone to create content that aligns with this stereotype. Not only does this further enforce the stereotype and subject students to it, but it can create unrelatable content leading students of color to disengage from learning and lose motivation in the course altogether (Frausto et al., 2024).
This is not to say that AI is a unique potential detractor. Discrimination is a persistent factor in the real world that affects students’ motivational and learning experiences, and similar bias has previously been seen in non-AI learning and motivation tools built on research centering predominantly white, middle-class students (Frausto et al., 2024). If anything, AI only serves as a reflection of the biases that exist within the broader world and the education sphere; AI learns from real data, and the biases it perpetuates reflect societal trends. The biases of AI are not mystical; they are very much a mirror of our own. Teachers, for example, demonstrate levels of bias comparable to those of the world around them.
When we think about current AI use in education, these baked-in biases are already cause for concern. On the student side, AI systems have demonstrated subtle racism in the form of dialect prejudice: students using African American Vernacular English (AAVE) may find that the AI systems they communicate with offer them less favorable recommendations than their peers receive. For teachers, similar bias may affect the grades AI-powered programs assign, favoring the phrasing and cultural perspectives in white students’ essays over those of students of color. These are just a few examples of the biases present in current AI use in education, but they already raise alarms. Similar human-to-human instances of discrimination, such as from teachers and peers, have been linked to decreased motivation and learning in students of color (Frausto et al., 2024). In this way, AI and its biases may be situated to become another obstacle students of color are required to face; AI learning tools and supports that were designed for and tested on white students, with positive effect, may negatively affect students of color due to inbuilt biases.
For humans, we recommend anti-bias practices to overcome these perceptions. With AI, we may yet have an opportunity to incorporate similar bias awareness and anti-discriminatory practices. Such training for AI has been a prominent point in the conversation around responsible AI creation and use for several years, with companies such as Google releasing AI guidelines that emphasize addressing bias in AI systems development. Approaching the issue of AI bias with intentionality can help circumvent discriminatory outputs, such as by deliberately selecting large, diverse datasets on which to train AI and rigorously testing systems with diverse populations to ensure equitable outcomes. Even after these efforts, however, AI systems may remain biased toward certain cultures and contexts. Even good intentions to support student learning and motivation with AI may lead to unintended outcomes for underrepresented groups.
While AI-education integration is already occurring rapidly, there is an opportunity to address and understand the potential for bias and discrimination from the outset. Although we cannot be certain of AI’s impact on the motivational and educational outcomes of students of color, research sets a precedent for bias as a detractor. By approaching the implementation of AI in education with intentionality, inclusivity of perspectives, and awareness of potential harm, we can try to avoid what might otherwise seem inevitable and instead create an AI-powered learning environment that enhances the learning experiences of all students.
Eliana Whitehouse, EduDream
Eliana Whitehouse is a macro social worker with experience in supporting community-based initiatives and research throughout the lifespan. Currently, she is a Research and Evaluation Associate at EduDream, a Latina-founded, women-owned education research consulting firm.