Why AI does not “improve” education — it makes the old model logically unsustainable

Artificial intelligence is not simply improving education — it is exposing the limits of the traditional model. As AI reshapes how people learn, reason, and make decisions, education is shifting from knowledge transfer to cognitive capability: adaptability, sense-making, and judgment. Based on insights from the Machines Can Think conference in Abu Dhabi, this article explores how AI is transforming education, assessment, and the role of teachers — and why the same logic increasingly applies to parents and children. It argues that AI-native learning is not about answers, but about building thinking spaces that support better decisions, both in classrooms and in families.

Sergei Andriiashkin

Founder and Strategy Partner

AI / Feb 4, 2026

AI: Within Us, Around Us, Beyond Us — The Future at the Crossroads of People, Business and the City

At Vinden.one, we visited the Machines Can Think Conference in Abu Dhabi (January 26–27, 2026), one of the region’s most substantive executive-level AI forums, focused not on speculative futures but on how artificial intelligence is already reshaping real systems — from government and science to organizations and education.

Among the sessions we attended, one panel discussion stood out in particular. Titled “AI Classroom: Rethinking Education for the Knowledge Economy,” it brought together university presidents, provosts, and AI research leaders — Merouane Debbah (Khalifa University), Timothy Baldwin (MBZUAI), Arlie Petters (NYU Abu Dhabi), Ebrahim Al Hajri (Khalifa University of Science and Technology), and Horst Simon (ADIA Lab). The conversation moved quickly beyond tactical questions of AI adoption in classrooms and focused instead on a deeper issue: how the core assumptions of education itself are being challenged at a moment when access to knowledge is no longer the primary constraint.

From knowledge transfer to cognitive capability

Most panelists converged on the same pivot: AI doesn’t radically change the purpose of education—it exposes how fragile the traditional model had already become. Memorization, reproduction, standardized exams — these stop being a meaningful advantage because a student who merely memorizes is no longer above the bar. The bar moved. What becomes central instead:

  • adaptability and learning agility;

  • sense-making — interpreting, testing, and critiquing AI outputs rather than trusting them blindly;

  • resilience in the face of uncertainty and contradiction;

  • independent, self-directed learning;

  • emotional and social intelligence — the human layer AI does not replace.

AI doesn’t eliminate thinking. It eliminates the value of knowledge without understanding.

AI as a tutor—and as a “thinking space”

One of the most powerful conceptual moves came from the NYU Abu Dhabi perspective: AI should be understood not merely as a tool or even as a tutor, but as a maker space for thought. This framing matters. The panel referenced a historical analogy: when writing emerged, Socrates feared it would erode memory and weaken thinking. Instead, writing expanded cognition. AI can play a similar role:

  • enabling experimentation,

  • allowing failure and correction,

  • supporting iterative hypothesis testing,

  • turning learning into participation rather than passive consumption.

In this frame, the false binary—“AI does it for the student” vs. “the student does it alone”—collapses. The student participates in thinking, rather than outsourcing it.

Normalizing AI instead of policing it

A practical implication follows: stop pretending AI use can be banned. NYU Abu Dhabi described an approach where AI use is allowed across courses (including engineering), with one key requirement: students must disclose when and how AI was used. This shift removes fear and guilt, dissolves the “catch-and-punish” dynamic, and refocuses attention on what actually matters — the quality of reasoning.

Personalization at a scale humans cannot match

Khalifa University shared a concrete signal of what AI-native learning looks like in practice: an always-on AI tutor used at scale — 1,700 students generating over 750,000 questions in a single semester. That level of interaction is structurally impossible in the classical lecture model. It forces a role change: the professor cannot remain a “knowledge broadcaster.”

Instead, the professor's value concentrates in human judgment and context: facilitation and teamwork, discussion and case-based learning, coaching rather than repetition.

Assessment is the real breaking point

When the conversation turned to evaluation, the consensus sharpened: traditional homework and exams are not “threatened by AI.” They are made obsolete by it — and were already misaligned. Two ideas recurred:

  1. Real work is team-based, yet education assesses individuals in isolation for years.

  2. Cheating has always existed; AI didn’t invent it. The real problem is that we use the wrong instruments to measure what we claim to value.

Assessment is moving toward:

  • team-based outcomes and roles;

  • visibility into reasoning processes;

  • dialogue and oral defense;

  • continuous evaluation rather than one-shot finals;

  • adaptive rather than identical assessment paths.

Flip classrooms + AI agents: the emerging architecture

A coherent future model emerged across perspectives:

  1. Student + AI agent — personalized practice, foundational mastery, guided reasoning.

  2. Classroom time — discussion, debate, sense-making, collective reasoning.

  3. Professor — coach, diagnostician, and cultivator of judgment.

MBZUAI represents an advanced version of this logic: curricula are rewritten frequently, and students graduate with a persistent AI agent embedded into their professional workflow.

The hardest constraint isn’t technology—it’s people

Institutional transformation will not be limited by models or platforms. It will be limited by faculty fear of replacement, the belief that “struggle equals learning,” and academic freedom that resists mandates.

What works instead: removing routine burdens (grading, mechanical tasks), showing successful examples, and allowing adoption to grow organically.

Five-year horizon: AI as cognitive infrastructure

Panelists’ forecasts aligned closely:

  • the “guilt” of using AI will largely disappear;

  • AI will become baseline cognitive infrastructure, like search or writing tools;

  • humans will expand the complexity and scope of what they can do — much as writing once did;

  • universities that fail to integrate AI will lose relevance;

  • AI will extend beyond learning into admissions, diagnostics of potential, and individualized trajectories.

What this means for parenting tech products

When you translate these educational shifts into parenting and child development, the parallels are direct.

  1. AI as a thinking partner, not an answer engine

    This is not a listings or recommendation service. It is a cognitive workspace for parenting decisions — helping parents reflect, test assumptions, and iterate choices over time.


  2. Reducing anxiety through clarity, not moralizing

    Just as education moves from prohibition to normalization, services such as Growly move parenting away from guilt-driven “right answers” toward confident, reasoned decisions.


  3. Continuous personalization via evolving profiles

    The AI-tutor logic maps directly onto evolving child profiles — not a one-time questionnaire, but a living model that updates through interaction.


  4. From “correct choices” to better judgment

    AI-native education optimizes for judgment and sense-making. The same logic applies to families: not “the best activity,” but the best next decision given the child and context.


  5. The adult role is upgraded, not replaced

    As professors become coaches, parents become more capable navigators. Growly strengthens parental agency rather than outsourcing it.