Linden Trees
Vision, Time, and What AI Briefly Let Me See
I went walking this morning and ended up at a place I know well, a relatively new athletic facility with young planting all around it. Everything is in that early, uncertain stage where you can see the intention but not yet the thing itself. You can read what the landscape architects were thinking. You can feel the logic of the design. But the fullness of it, the shade and the canopy and the way it will hold a summer afternoon in twenty years, none of that is visible yet. It is all still ahead, still becoming.
I stood there for a while. I could hear birds already finding their place.
The light was doing what light does when it moves through thin young branches, which is something I find very hard to describe and very easy to feel. And I found myself thinking about the people who designed this, who chose these particular trees, who placed everything with a future community in mind. People who will not necessarily be there to see it. People who were seeing it anyway, clearly, in their minds, long before the first seedling went into the ground. There is something almost devotional about that. To plant for strangers. To make something beautiful for people you will never meet, who will sit in the shade of your thinking without ever knowing your name.
I took some photographs. And then, almost as an experiment, I asked an AI tool to identify one of the trees. It came back quickly: linden, it said. Littleleaf linden, Tilia cordata. Commonly planted in urban and streetscape settings. Heart-shaped leaves. Those pale wing-like bracts, the light green strips I had been looking at without knowing what they were, are actually a defining feature of the species. In a few weeks, it told me, those clusters would open into small fragrant yellowish flowers, and the bees would come. Linden honey is prized for exactly this reason. The tree had been chosen deliberately. It is resistant to certain pressures. It responds well to pruning. It knows, in its slow vegetable way, how to behave in a landscape designed for people.
Then I asked what the space would look like when the trees were fully grown.
Within a few seconds I was looking at it. Not a sketch, not an approximation, but a rendered image of the place I was standing in, with a continuous canopy overhead, the individual trees merged into what the AI called, in its own language, one architectural organism. The string lights I had noticed strung between the young trunks were now threaded through leaves. The walkway had become a cool shaded corridor. The seating area, which right now reads as open and a little exposed, had become what the AI described as a fully shaded outdoor room. I was standing in a real place holding a small rectangle and it was showing me the future. I was almost moved to tears. I said, out loud, to no one in particular, wow.
This is not, strictly speaking, the first time a future landscape has been made visible. Architects have always produced renderings. Landscape designers have always brought drawings and digital visualizations to the pitch, images of what the mature space will look like, made by specialists for clients in a boardroom. The vision has existed. But it has existed for the few people in that room, at that moment, as part of a professional transaction.
What is different now is the phone in my pocket. I did not commission a rendering. I did not attend a presentation. I stood in a car park on a Tuesday morning, took three photographs, and within seconds was looking at the future of that place, assembled from the accumulated knowledge of everyone who has ever studied how a linden grows. That capability, to see forward in time with that kind of specificity and that kind of ease, is now available to anyone, anywhere, at almost no cost. That is not a small thing. That is a genuinely new relationship between human beings and the futures they are always, quietly, making.
Prompt from an original photo not shared: Show me an image of this place with the trees full grown.
What I was looking at, I realized, was the architect’s vision. Not my projection of it, not a guess, but something assembled by AI from an enormous body of scraped and accumulated human knowledge about how these trees grow, how they fill space, how they behave across seasons and decades. This is the beauty of our collective voice: In autumn they go yellow. In winter the branching becomes sculptural. Over time, planted as they are in rows on both sides of a walkway in the car park, their crowns will touch and overlap and close into a ceiling. Right now you see individual trees. Later, you will perceive one architectural organism.
AI had access to all of that knowledge, the collective understanding of botanists and landscape ecologists and urban planners and everyone who has ever documented how a linden grows in a place like this, and it gave it back shaped into something I could see on a phone screen on a Tuesday morning. The vision that had lived in someone else’s mind, the vision they had worked toward without being able to quite show anyone, was suddenly briefly present.
And here is the thing that I keep sitting with. I was moved to tears by someone else’s vision. Not by a sunset. Not by something nature made entirely on its own terms. By the documented intention of another human mind, rendered visible by a machine.
I do not think that this has been possible before, not in quite this way. To stand in a place and see, superimposed over the present moment, the future that another person imagined and worked toward and will probably never fully witness.
That is genuinely new. I am not sure we have language for it yet.
I have been thinking for some time about what it means to bring humanity back to humanity, and about the strange fact that what seems to be required for that right now is technology. That sounds like a paradox and I suppose it is one. But what AI is doing, at its best, is carrying collective human knowledge forward in time, not just across space. It has gathered what we know and held it and is now giving it back to us in forms we can actually use, in a moment when we need it. The humanity was already in it. The technology is the window.
The AI noted, at the end of its response, one practical consideration worth knowing now: lindens can attract aphids, which produce a sticky residue that settles on whatever is below. On the cars parked nearby. In another area there are tables where people will sit. Not a dealbreaker, it said. But part of the long-term reality of this design. I found that oddly moving too, that detail. The architects knew this and chose these trees anyway. The gathering matters more than the inconvenience. The vision holds even knowing what comes with it.
I keep returning to the birds.
They had already found their place.
They were not waiting for the canopy to close.
They were there in the young branches, in the intention of the thing, already making it home.
The AI’s summary stayed with me: right now, trees placed along a path. Fully grown, a living corridor, a shaded plaza, social architecture built out of time. That last phrase is the one I cannot put down. Built out of time. As if time itself is the material. As if patience is a form of craft. The seedlings are already the trees. We just cannot see it yet.
Now, briefly, we can.
I think about children. I always do. A child who plants a seed has always wondered what it will become, and has always had to wait, or imagine, or be told. Now they can see. They can stand in a young garden, ask a question, and hold the future in their hands within seconds. That is not a shortcut. That is a new kind of material to think with. The prepared environment just got larger. The playground, as it turns out, includes time itself.
If you wish to follow the research and thinking that inform this work, the books Mapping Montessori Materials for AI Competency Development and Montessori & AI, Volume I are available through my website, katebroughton.com.


