Behind the scenes
The technology behind the talking tree involved more than 10 environmental sensors measuring wind speed, temperature, humidity, soil moisture, and bioelectric signals picked up from the tree itself via electrodes. These signals revealed how the tree was physically responding to its surroundings, adding a layer of complexity that made the experience feel even more alive.
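To make that data flow a little more concrete, here is a minimal sketch of the kind of sensor-polling loop such a setup might use. The field names, sampling interval, and the random placeholder readings are purely illustrative; the installation's actual hardware and drivers aren't documented here.

```python
import random
import time
from dataclasses import dataclass, asdict

@dataclass
class TreeSnapshot:
    wind_speed_ms: float      # anemometer, metres per second
    temperature_c: float      # air temperature, degrees Celsius
    humidity_pct: float       # relative humidity, percent
    soil_moisture_pct: float  # volumetric soil moisture, percent
    bioelectric_mv: float     # electrode potential on the trunk, millivolts

def read_sensors() -> TreeSnapshot:
    """Poll each sensor once. Real hardware drivers would replace the
    random placeholders below; they exist only so the sketch runs."""
    return TreeSnapshot(
        wind_speed_ms=random.uniform(0, 15),
        temperature_c=random.uniform(5, 30),
        humidity_pct=random.uniform(30, 95),
        soil_moisture_pct=random.uniform(10, 60),
        bioelectric_mv=random.uniform(-50, 50),
    )

if __name__ == "__main__":
    while True:
        print(asdict(read_sensors()))  # one reading every few seconds
        time.sleep(5)
```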
All of this data was processed in real time by a custom large language model (LLM) running locally on a Mac mini, giving the tree a localised brain while significantly reducing the environmental impact associated with cloud-based LLMs. Every part of the system was designed to run fully on-site without an internet connection, keeping the experience entirely self-contained. Powered by a battery pack, the setup minimised energy consumption and ran as sustainably as possible. The AI interpreted the tree's 'feelings' and experiences, creating a distinct personality shaped by its environment and its unique 150-year history.
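The write-up doesn't name the local inference stack, but the general shape of the pipeline, turning a live sensor snapshot into context for a locally hosted model, might look something like the sketch below. The Ollama-style endpoint, the model name, and the persona prompt are stand-ins for illustration, not the project's actual implementation.

```python
import json
import urllib.request

TREE_PERSONA = (
    "You are a 150-year-old tree. Speak in the first person and let the "
    "live sensor readings below colour your mood and memories."
)

def describe(snapshot) -> str:
    """Turn raw numbers (a TreeSnapshot from the sketch above) into a
    short plain-language context block for the model."""
    return (
        f"Wind: {snapshot.wind_speed_ms:.1f} m/s, "
        f"air {snapshot.temperature_c:.1f} °C, "
        f"humidity {snapshot.humidity_pct:.0f} %, "
        f"soil moisture {snapshot.soil_moisture_pct:.0f} %, "
        f"bioelectric signal {snapshot.bioelectric_mv:.1f} mV."
    )

def ask_tree(question: str, snapshot) -> str:
    """Send persona + sensor context + visitor question to a local model.
    The endpoint and model name assume an Ollama-style server on localhost;
    the installation's real inference setup isn't documented."""
    payload = json.dumps({
        "model": "llama3",
        "prompt": f"{TREE_PERSONA}\n\n{describe(snapshot)}\n\n"
                  f"Visitor: {question}\nTree:",
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because everything runs against a model served on the Mac mini itself, no request ever leaves the site, which is what keeps the experience self-contained and offline.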
This setup enabled a truly interactive experience: visitors could hold live conversations with the tree, asking questions and uncovering how it 'felt' about the changing world around it.
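In its simplest form, that live conversation could be a loop that takes a visitor's question, grabs fresh sensor readings, and asks the local model for the tree's answer. This reuses the hypothetical helpers from the sketches above and stands in for whatever input and output channel the installation actually used.

```python
# Minimal conversation loop, reusing read_sensors() and ask_tree()
# from the sketches above. Everything stays on the local machine.
if __name__ == "__main__":
    print("Ask the tree something (Ctrl-C to stop).")
    try:
        while True:
            question = input("> ")
            answer = ask_tree(question, read_sensors())  # fresh readings per question
            print(answer)
    except KeyboardInterrupt:
        print("\nThe tree falls silent.")
```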
The project shows how creative technology can bring hidden stories to life, turning raw data into something emotional, something human—and maybe even something that helps us care just a little more about the world we’re part of.