What the Moratorium Is Really Asking
More than a hundred New Yorkers packed a seven-hour school board meeting this week to demand a two-year pause on AI in schools. The most striking voice in the room belonged to the panel's resident AI engineer. The pause they want is not really a pause.
On the evening of April 29, more than a hundred New Yorkers filed into a Panel for Educational Policy meeting in lower Manhattan and stayed for seven hours.1 Parents, students, and teachers came one after another to a microphone and asked the city's Department of Education to do something it has never done before: stop, for two years, before it puts artificial intelligence any deeper into the country's largest public school system.
The most surprising voice belonged to Naveed Hasan, the panel's resident technology expert. Hasan holds a master's in artificial intelligence and computer vision from Columbia and sits on the DOE's data privacy working group. He had previously supported the city's plans for a new AI-focused high school. That night, he announced he had changed his mind. He now backed a two-year moratorium on AI in schools, citing gaps in the city's data privacy infrastructure and unanswered questions about how the tools affect learning.1
It is unusual for the engineer in the room to ask the room to slow down. It is worth paying attention when one does.
What Was Withdrawn
Two days before the meeting, Schools Chancellor Kamar Samuels had quietly pulled the proposal for Next Generation Technology High School off the agenda, hours before the panel was scheduled to vote on it.2 The school was supposed to open this fall at 26 Broadway and become the city's first campus built around artificial intelligence. Promotional language promised "special access to technology industry mentors" and certifications tied to large tech firms. Admissions would have used academic screening, sorting students by performance metrics in a city already arguing about who gets into selective programs.2
Parents organizing under the AI Moratorium Committee and Class Size Matters used a sharper word for what was being proposed: an experiment.3 They asked, in essence, why their children were being signed up as test subjects for tools whose effects on learning have not been measured at scale, with vendors whose data practices have not been independently audited, on a timeline set by industry rather than by research.
What the Moratorium Is Actually Asking
Read carefully, the moratorium movement is not really an argument against AI. It is an argument about evidence and authority. The resolution circulating in NYC asks for a pause until the city can verify three things: that student data is not training private models, that classroom tools have been evaluated for actual learning outcomes, and that teachers have meaningful say in what gets adopted.3 None of those requests are anti-technology. All of them are pro-knowing-what-you-are-doing.
This is the same conversation, in a different register, that nervous principals are having in Iowa and worried superintendents are having in Florida. The country has spent two and a half years rolling out AI tools faster than the people responsible for children can study them. The moratorium is asking what should have been asked at the start. What evidence do we have? Who is collecting it? Where does the process live?
The Future Is Not Faster, It Is Visible
It is tempting to read the NYC meeting as a referendum on AI itself. It is more accurate to read it as a referendum on opacity. Parents are not afraid of computers. They are afraid of systems that make decisions about their children without leaving a record they can inspect. The Next Generation Technology High School proposal failed not because it used AI, but because its account of how it would use AI read like a press release rather than a curriculum. Industry mentors. Certifications. Buzzwords that travel well in a sentence and disappear under follow-up questions.
The future of school technology will not be measured in adoption rates. It will be measured in legibility. A tool that a teacher can read, that a parent can audit, and that a student can reflect on, is a tool worth adopting. A tool that returns a percentage and asks to be trusted is the kind of tool the moratorium is trying to slow.
This is the design we keep returning to at Koan. Aidan, our AI tutor, is built to make the learning process visible: every revision, every long pause, every moment a student tries a sentence and deletes it. Not as surveillance, but as the evidence base of the work. A teacher, a parent, even the student themselves can scroll back through the record and see how the thinking actually moved. That kind of visibility is the precondition for trust, and trust is what the seven-hour meeting was really about.
The Engineer Who Said Wait
Naveed Hasan's reversal will probably outlast the news cycle. The image of a panel's most credentialed AI voice asking for two years of patience is not a defeat for the technology. It is a maturation. Engineers usually ask the room to move faster. When the engineer asks the room to wait, the room is approaching a real problem honestly. The problem in this case is not whether AI belongs in schools. The problem is whether anyone has yet built the systems we are deploying to be accountable to the children inside them.
The DOE is taking public comment on its preliminary AI guidance through May 8 and plans a fuller playbook in June.1 Whatever that document looks like, the most useful question for any school in the country is not whether to adopt AI. It is whether the AI being adopted leaves a record a thoughtful adult could actually read.
If the parents asking for a moratorium are really asking for visibility, what is the version of AI in school they would say yes to?
References
1. The AI rebellion grows in NYC: Parents and students demand moratorium at marathon meeting. Chalkbeat · May 2026.
2. NYC spikes proposals to open AI-focused high school, close Manhattan middle schools. Chalkbeat · April 2026.
3. Resolution for an AI Moratorium in NYC Schools. Class Size Matters · April 2026.