> We imagine the Earth Interface Design Experiment (IDE.Earth) as an open science project that combines visual storytelling with interactive exploration. The project brings together two core elements: a five-billion-year Tectonic Video of Earth's history (and projected future) and a 3D globe visualisation built with globe.gl. The globe lets users view continents, geological features and tectonic plates at different points in time, with the ability to overlay points, text, shapes and animations. These elements link directly to Wiki Pages (or generate new ghost pages if none exist) so that every point on the globe can open into deeper research, narrative or visualisation. The design experiments invite us to test what kinds of interactivity and visualisation best bring these historical and future Earth events to life.
>
> We imagine IDE.Earth as a framework where anyone can run experiments, combining Generative AI with scientific datasets to produce globe textures and timeline snapshots. An Agentic Pipeline processes open data (for example, plate tectonics, ice sheets, flora and fauna) to create images for specific points in Earth's history. These images can be displayed as flat map projections or on the 3D globe, enabling experiments that range from simple overlays to complex simulations. Alongside the globe sits the Video Timeline, an interactive sequence beginning with the tectonic video but enriched by additional datasets, visual overlays and clickable timeline points. These timeline points not only trigger wiki pages but can also reshape the layout of the screen to foreground interactive graphics, short films or other data-driven visualisations. Together, the globe and timeline allow for a rich combination of maps, videos and interactive storytelling.
>
> We imagine IDE.Earth supporting a wide community of schools, students and Vibe Coders who contribute to jams, experiments and end-of-term performances. Students might create short AI-generated clips (such as 16-second reconstructions of extinct animals) based on federated prompts, while demoscene artists add experimental visualisations, musical VJ sets or data-driven artworks. These contributions are archived in a Federated Wiki environment backed by IPFS and permanent web infrastructure, with open science partners and libraries helping preserve the datasets. The technology integrates with desktop and mobile apps, enabling prompts and data to be forked, improved and re-used. The project culminates in shared shows (such as a live-streamed VJ event at "the Restaurant at the End of the Universe") where the best contributions are performed, remixed and archived. In this way IDE.Earth evolves each year as a living, federated archive of Earth's history, future and imaginative narratives.
[IDE.earth](https://transcript.myth.garden/assets/ide-earth/IDE.earth.wav)
# Tidied Transcript

**Tweet-length summary**

The Earth Interface Design Experiment (IDE.Earth) is an open science platform combining a five-billion-year tectonic video, an interactive 3D globe and a federated archive. It brings together datasets, AI tools and creative coding to let schools, artists and researchers co-create living narratives of Earth's past and future.

---

# Earth Interface Design Experiment

We imagine the Earth Interface Design Experiment (IDE.Earth) as a series of vibe-coded, iterative Design Experiments presented under a consistent visual umbrella. This umbrella has two main parts. The first is the Tectonic Video showing Earth's five-billion-year history, stretching from its beginnings through to possible futures. The second is a 3D Globe, a spherical visualisation that enables interactivity and navigation across the planet. Together these two components give us both a broad temporal narrative and a spatial view of Earth.

The globe is built using globe.gl. It provides a zoomable representation of Earth but offers less detail than Google Earth: at resolutions finer than about one kilometre the imagery becomes fuzzy. Rather than detailed zooming, the focus is on the whole planet, its continents, countries, cities, geological features and tectonic plates at a given point in time. On top of this base map we can add interactive elements such as spikes, data points, text labels, polygon tiles, hexagons and animated shapes. These elements are clickable: when selected, they open associated Wiki Pages, and if a page does not already exist, a ghost page is created for contributors to write (a minimal code sketch follows at the end of this section). In this way the globe becomes a gateway into deeper stories and data about Earth's events, whether a volcanic eruption, a meteor strike or a historical climate change episode.

The purpose of IDE.Earth is to run Interface Design Experiments exploring what kinds of points to place on the globe, what kinds of interactions to allow and what kinds of visualisations to display on the linked wiki pages. For example, clicking on a volcano might open a wiki page showing interactive models of eruptions, while clicking on a meteor impact could display scientific simulations and artistic interpretations. These experiments are deliberately open-ended and invite contributors to test alternative ways of layering visual and narrative content on top of the globe.

Importantly, the globe is textured using large PNG images. An Agentic Pipeline is being set up to process scientific data and generate these texture maps. This pipeline combines open datasets with Generative AI to create snapshots of Earth at specific points in history or in projected futures, including tectonic plate configurations, ice sheet coverage and environmental features. Over time, experiments will improve the accuracy and richness of these visualisations. The pipeline also lets us work with flat map projections, over which we can overlay points and data for a Google Maps-like interactive experience. This toolkit is designed to run on home computers so that anyone can experiment locally.

We imagine a broad range of Vibe Coding Experiments emerging from this setup. Coders, artists and students can use the flat maps and globe to test code-based visualisations, art-driven graphics, demoscene performances and VJ-style shows. These outputs become part of the Hitchhiker.Earth project, giving visual storytelling forms to Earth's history and future.
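As a concrete starting point, here is a minimal sketch of the clickable globe using globe.gl's documented point API. The texture is globe.gl's stock example image standing in for a pipeline-generated snapshot, and the `wikiUrlFor` helper and event data are hypothetical placeholders rather than part of any existing IDE.Earth codebase.

```ts
// Minimal sketch: a globe with one clickable event point that opens a wiki page.
import Globe from 'globe.gl';

interface EarthEvent {
  lat: number;
  lng: number;
  label: string;
  slug: string; // wiki page slug, e.g. 'chicxulub-impact'
}

const events: EarthEvent[] = [
  { lat: 21.4, lng: -89.5, label: 'Chicxulub impact', slug: 'chicxulub-impact' },
];

// Hypothetical helper: resolve a slug to its wiki page; the wiki itself
// serves a ghost page when no page exists yet.
const wikiUrlFor = (slug: string) => `https://wiki.example.org/${slug}`;

const globe = Globe()(document.getElementById('globe')!)
  // Placeholder texture; an Agentic Pipeline snapshot PNG would go here.
  .globeImageUrl('//unpkg.com/three-globe/example/img/earth-blue-marble.jpg')
  .pointsData(events)
  .pointLat('lat')
  .pointLng('lng')
  .pointAltitude(0.05)
  .pointColor(() => 'orange')
  .pointLabel('label')
  .onPointClick(point => {
    // Open the linked wiki page (or its ghost page) in a new tab.
    window.open(wikiUrlFor((point as EarthEvent).slug), '_blank');
  });
```

Swapping the `globeImageUrl` texture per era would be one simple way to display the time-sliced snapshots the Agentic Pipeline produces.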
These experiments are not limited to science; they extend into creative performances and speculative narratives.

The second major element of IDE.Earth is the Video Timeline. This is an interactive video sequence that begins with the five-billion-year tectonic video, of which we already have a prototype. Timeline points can be added so that when a moment in the video plays, related wiki pages are displayed (see the sketch at the end of this section). Beyond simple page links, the timeline will support overlays, alternative layouts and embedded data visualisations. For instance, the video might shrink to a mini-map while the rest of the screen displays interactive graphics. Buttons or game-like controls will let users jump to particular moments, pause or continue. The timeline thus becomes a hybrid of film, simulation and wiki interactivity.

We also experiment with short film formats within the video timeline. Generative AI can add new material, timeline overlays can enrich the sequence, and multiple film clips can be stitched together. For example, a silent-movie style format could alternate between 16-second clips and explanatory text. Overlays and data points are shared between the 3D globe and the video timeline so that both visual layers remain consistent. Initially the timeline will use globe snapshots, but over time it may evolve into fully animated globes embedded within the video. These two elements, globe and timeline, form the basis of IDE.Earth and integrate with the larger Hitchhiker.Earth story.

IDE.Earth is intended as an independent open science project. It is designed for use in schools and classrooms but also works for personal exploration. It runs on mobile phones, desktop computers and projectors, and can support both private note-taking and collaborative teaching. The technology draws on Python-based datasets from the University of Sydney tectonics project and from other open science sources, including Cambridge-led research, Our World in Data and environmental data projects by Samari McCarthy. Open data packs are created from these sources. Students and artists are invited to participate in jams, contributing experiments that can be shown in end-of-term VJ performances. In this way IDE.Earth is refreshed each year with new data, stories and performances.

The project integrates open data, Python and R notebooks, and Federated Wiki libraries. It also connects with the Guides desktop applications and mobile tools, allowing contributors to add wiki pages directly onto the timeline and globe. These tools support the Scene and Beat technologies already used in script-writing and collaborative sci-fi storytelling. The same methodology enables speculative timelines, such as imagining narratives set three million years ago or half a million years in the future. IDE.Earth is therefore not only a scientific tool but also a creative engine for time-travel storytelling.

A key element is the Federated Data Model. This model uses IPFS and the Permanent Web to archive open science data robustly. The aim is to partner with organisations such as the British Library, the Wellcome Trust and other foundations to support this archiving. Contributions are fact-checked and peer-reviewed, ensuring reliability. At the same time, the platform also welcomes speculative and imaginative storytelling, balancing the Marvin-like scepticism of rigorous science with the Zaphod-like energy of creative exploration. This dual approach allows factual science and playful fiction to coexist.
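To make the timeline points described above concrete, here is a minimal sketch assuming an HTML5 `<video>` element for the tectonic video and an iframe side panel for wiki pages. The element ids, timestamps, slugs and wiki host are illustrative placeholders, not an existing IDE.Earth API.

```ts
// Minimal sketch: timeline points on the tectonic video. As playback crosses
// a marked moment, the matching wiki page loads in a side panel and the
// layout can optionally switch to a mini-map.
interface TimelinePoint {
  atSeconds: number;   // moment in the video
  slug: string;        // wiki page to foreground
  layout?: 'mini-map'; // optional layout change: shrink video, foreground graphics
}

const points: TimelinePoint[] = [
  { atSeconds: 42, slug: 'snowball-earth', layout: 'mini-map' },
  { atSeconds: 97, slug: 'chicxulub-impact' },
];

const video = document.getElementById('tectonic-video') as HTMLVideoElement;
const panel = document.getElementById('wiki-panel') as HTMLIFrameElement;
let lastFired = -1;

video.addEventListener('timeupdate', () => {
  for (const p of points) {
    // Fire each point once as playback passes it (seeking back is ignored here).
    if (video.currentTime >= p.atSeconds && p.atSeconds > lastFired) {
      lastFired = p.atSeconds;
      panel.src = `https://wiki.example.org/${p.slug}`; // placeholder wiki host
      if (p.layout === 'mini-map') video.classList.add('mini-map'); // CSS handles the resize
    }
  }
});
```

Because the points are plain data, the same array could drive overlays on the 3D globe, keeping the two visual layers consistent as described above.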
Another technological focus is the use of Federated Prompts. For example, students might write prompts to reconstruct the extinct moa of New Zealand within a historical context of tectonics, ecology and human arrival. These prompts are stored as federated wiki pages, making them forkable and reusable. They can be passed to AI tools such as Sora, open-source visualisation models or home-lab setups. The AI then generates video based on the prompt, returning metadata that records the input and output. These results are added to the federated archive with a wiki front end, enabling contributors to revisit, fork and improve prompts over time. A wiki page can even include a "generate video" button that triggers new outputs directly (see the sketch below).

In practice, much of the required technology already exists. Setting it up is straightforward, perhaps a few days' work to polish and integrate, but the real challenge lies in social organisation and in teaching participants to use Federated Wiki effectively. The target is to have the system ready for a January launch, enabling workshops and classes to begin producing experiments and archives early in the year.

Another dimension of IDE.Earth is the Live TV aspect. With the 3D globe, archived videos and 16-second clips, participants can create live VJ shows. These shows remix archived content in real time, with performances live-streamed and recorded. A final-year show might take place at a venue styled as "the Restaurant at the End of the Universe", combining live visuals, data-driven animations and audience participation. Demoscene artists using platforms like cables.gl can also contribute performances, which are then packaged as wiki entries, added to the timeline and made accessible through the globe.

Through these activities, IDE.Earth enables an evolving Anarchy Archive. This archive holds both rigorous scientific data and experimental artistic contributions, and it is open to schools, researchers, coders and artists worldwide. By combining flat map and globe interfaces with a federated, decentralised archive, IDE.Earth provides a living interface to Earth's story, one that is continually expanded by open data, scientific research and imaginative storytelling.
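To illustrate the Federated Prompts idea, here is a minimal sketch of a forkable prompt record and a "generate video" handler that keeps input and output together for the archive. The field names, endpoint and response shape are assumptions for illustration only, not an existing IDE.Earth API.

```ts
// Minimal sketch: a federated prompt stored as a forkable wiki record, plus a
// handler behind a wiki page's hypothetical "generate video" button.
interface FederatedPrompt {
  id: string;
  prompt: string;      // e.g. a 16-second moa reconstruction brief
  forkedFrom?: string; // provenance across federated sites
  model: string;       // e.g. 'sora', an open-source model, or a home-lab setup
}

interface GenerationRecord {
  promptId: string;
  videoCid: string;    // IPFS content id of the generated clip
  generatedAt: string; // ISO timestamp for the archive
}

async function generateVideo(p: FederatedPrompt): Promise<GenerationRecord> {
  const res = await fetch('https://api.example.org/generate', { // placeholder endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(p),
  });
  const { videoCid } = (await res.json()) as { videoCid: string };
  // Recording prompt id and output cid together lets later contributors
  // revisit, fork and improve the prompt.
  return { promptId: p.id, videoCid, generatedAt: new Date().toISOString() };
}
```

Storing the returned record back into the federated archive would close the loop: every clip remains traceable to the prompt, and the fork chain, that produced it.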
# Assets
ide-earth
# See also

- {{folderPageTitle}}