The Last Chapter
Why Being Able to Stick with the Story Matters
You will be relieved to hear I am on the last chapter of Reshuffle by Sangeet Paul Choudary. I only listen to it when I am working out at the gym, so it has reached me in fragments, between breaths and repetitions, never consumed in one clean sitting. And yet I stayed with it. I followed the thread. I did not skim the argument or outsource my understanding to a summary. I let the ideas accumulate slowly, chapter by chapter, until the ending landed with quiet precision and stayed with me long after the headphones came off.
You do not need an AI strategy.
It is funny in the way truth often is, slightly absurd, because the entire industry is currently selling nothing but strategies. Strategies to win fast, to scale faster, to pick off low-hanging fruit, to make a quick return before someone else does. What Choudary is pointing to instead is almost boring in its simplicity. Do not paste AI on top of a broken system. Take the whole thing back to the drawing board. Start from first principles. Ask what actually needs to exist, and what does not.
What struck me just as deeply as the argument itself was the fact that it took an entire book to make it. Not a thread. Not a slide deck. Not a keynote condensed into a viral clip. A book. One that requires staying power, patience, and the willingness to tolerate complexity long enough for coherence to emerge. Being able to read a complete book matters. Being able to follow a long argument, to hold ideas in mind across time without constant novelty, is not a quaint intellectual preference. It is a human capacity that lasts a lifetime.
This capacity is the same one that allows someone to stay in a relationship, steward an organization, raise a child, remain ethical under pressure, and see a system whole rather than grabbing at its most profitable fragments. When we lose the ability to stay with a story from beginning to end, we lose more than attention span. We lose coherence.
Developmental science has been remarkably consistent on this point. Sustained attention is not automatic. It is constructed. It emerges through early environments that protect focus, agency, and meaning. Once formed, it becomes a lifelong scaffold for reasoning, ethical judgment, and self-regulation, as shown in decades of research by Michael Posner and Mary Rothbart, and synthesized in applied work by the Center on the Developing Child at Harvard University. This is not a productivity hack. It is a developmental achievement.
This is where Reshuffle quietly converges with Montessori in a way that feels almost uncanny. Choudary is not arguing against technology. He is arguing against superficial adoption. Against the urge to apply AI as a layer of optimization without asking what kind of system we are actually optimizing. That kind of thinking cannot be done in sound bites. It requires sustained engagement. It requires the ability to sit inside uncertainty long enough for clarity to arrive.
Montessori understood this long before algorithms learned to optimize for engagement. The prepared environment was designed to protect deep concentration, extended work cycles, and uninterrupted thinking. Children learn to follow their own long threads. A piece of work unfolds over days, weeks, sometimes months. A question is not answered immediately. It is lived with. This is not preparation for school. It is preparation for life.
Inside Montessori classrooms, especially in the elementary years, this capacity is visible every day. Morning meetings include news of the world. Children bring questions about artificial intelligence, robotics, economics, and ethics into shared conversation. They are not shielded from complexity, but they are protected from speed. Over time, they develop the ability to ask not just what is possible, but what is worth doing.
This matters because the world they are inheriting is already experimenting with extreme versions of efficiency. Companies reduce workforces, fill factories with robotics, replace marketing and design teams with generative systems, while profits concentrate at the top. This structure will not go unnoticed, particularly by generations who have already decentralized media, finance, and community. The question will not simply be whether such systems are profitable, but whether they are livable.
From there, the questions become more intimate. Do we want to know how things are made? Not just whether a snack is organic, but where it was grown, how far it traveled, whether its production relied primarily on human labor or automated systems, whether its design was shaped by human judgment or optimized by AI. Transparency begins to feel less like a feature and more like a moral requirement.
Alongside this comes a reckoning about knowledge itself. In many organizations, the most precious resource is human experience. Some leaders recognize this and hesitate to lose it. Others propose capturing it, paying people to upload their knowledge into AI systems so it can be retained indefinitely. Efficiency meets dignity, and the ethical terrain becomes unstable.
We have seen versions of this story before. Carrie Fisher lived her life without proportionate benefit from the billions generated by the image of Princess Leia, because no one thought to tell her to secure rights to her likeness at the beginning. The system did not protect her. It assumed extraction was normal. Today, similar dynamics are quietly forming around data, creativity, and identity. The difference is that now they reach into childhood.
After decades working with children and educators, I know this without hesitation. Children constantly generate ideas that are patent-worthy, copyrightable, and genuinely original. Inventions. Systems. Artistic works. Ways of seeing. And yet we train children to diminish their own ideas, to call them just something I did, just a thought I had. That language shrinks authorship.
Imagine instead a world where it is easy for children to register ideas in their own name, not to turn childhood into a marketplace, but to affirm origin. This came from you. It matters. The reason this feels radical is because we have built an educational culture that treats children as unfinished until credentials certify them. Their thinking is not considered legitimate until a degree validates it.
This is where the conversation about AI could become genuinely hopeful, and genuinely dangerous. What if, instead of training systems on scraped data saturated with adult fear and bias, we allowed children to contribute in protected, ethical ways? What if a school-based AI began as nothing at all, a closed system, trained only on the poems, stories, questions, and inventions of the children themselves? What if it grew alongside them, reflecting curiosity rather than extraction?
This is not a proposal so much as a thought experiment. One that requires slow thinking. Restraint. Adults who can stay with complexity rather than rushing to monetize it.
And here we return, again, to the last chapter. To the ability to stay with a long argument. To follow a story far enough to understand its consequences. In a culture shaped by speed, extraction, and optimization, the capacity to hold a long narrative may turn out to be one of the most important forms of resistance we have.
A society that cannot stay with a book cannot stay with a child. And a society that cannot stay with a child will struggle to build a future worth inhabiting.
If you wish to follow the research and thinking that inform this work, the books Mapping Montessori Materials for AI Competency Development and Montessori & AI, Volume I are available through my website, katebroughton.com.


As I wrestle with the complexities of AI, I find myself in long overdue conversations about children's rights to their IP, their images, their work. Again and again, the reality of AI is forcing us to get clear and to clean up messes we've made. More to consider here...