Everything on Earth is connected. A solar flare triggers aurora, tectonic forces drive both earthquakes and eruptions, and ocean warmth fuels the storms that light up the sky. Most dashboards show these in isolation — Earth Pulse shows them together, watching for the moments when independent signals align.
A single-page 3D globe pulling 10 live data feeds from USGS, NASA, NOAA, and other free public APIs. Every data point on the globe is clickable for details, and a cross-feed correlation engine detects when signals converge.
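Placing a feed event on the globe comes down to projecting its latitude/longitude onto a sphere. A minimal sketch of that projection (the helper name and Y-up axis convention are illustrative assumptions, not the project's actual code):

```javascript
// Hypothetical helper: project an event's latitude/longitude onto a
// sphere of the given radius so it can be rendered as a clickable
// marker. Assumes a Y-up right-handed coordinate system, as Three.js uses.
function latLonToVector3(latDeg, lonDeg, radius = 1) {
  const lat = (latDeg * Math.PI) / 180; // degrees -> radians
  const lon = (lonDeg * Math.PI) / 180;
  return {
    x: radius * Math.cos(lat) * Math.cos(lon),
    y: radius * Math.sin(lat),            // poles map to +/- radius on Y
    z: -radius * Math.cos(lat) * Math.sin(lon),
  };
}
```

Each marker would keep a reference back to its source record so a click can open the detail panel with the raw feed data.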
Planet Brain is a browser-local AI narrator powered by WebLLM. A small language model (SmolLM2-360M) loads directly onto your GPU via WebGPU: no server, no API keys, no data leaves your device. On desktop, a larger model (Phi-3.5) silently upgrades in the background for deeper analysis.
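The tiered loading described above can be sketched as a small decision function; the function name and plan shape here are assumptions for illustration, not the project's actual API:

```javascript
// Sketch of the tiered model plan: small model first, larger model as a
// silent desktop upgrade, rule-based narration when WebGPU is missing.
const SMALL_MODEL = "SmolLM2-360M"; // loads everywhere WebGPU exists
const LARGE_MODEL = "Phi-3.5";      // background upgrade on desktop

function planBrain({ hasWebGPU, isDesktop }) {
  if (!hasWebGPU) {
    return { mode: "rules" }; // fall back to the rule-based correlation engine
  }
  const plan = { mode: "webllm", initial: SMALL_MODEL };
  if (isDesktop) {
    plan.upgrade = LARGE_MODEL; // fetched in the background, swapped in silently
  }
  return plan;
}
```

In the browser, `hasWebGPU` would come from checking `navigator.gpu`, which is the standard WebGPU feature-detection entry point.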
It reads every live feed in real time, narrates cross-feed patterns every 45 seconds, and lets you ask questions about what's happening on the planet right now. Click a data point and Planet Brain adds contextual AI analysis alongside the raw data.
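The periodic narration could look roughly like this; the prompt format, function names, and snapshot shape are hypothetical:

```javascript
// Hypothetical narration loop: every 45 s, fold the latest per-feed
// summaries into a prompt and hand it to the narrator (the local model,
// or the rule-based engine when WebGPU is unavailable).
const NARRATE_INTERVAL_MS = 45_000;

function buildNarrationPrompt(snapshot) {
  // snapshot: { feedName: "one-line summary", ... }
  const lines = Object.entries(snapshot).map(
    ([feed, summary]) => `${feed}: ${summary}`
  );
  return `Narrate notable cross-feed patterns:\n${lines.join("\n")}`;
}

function startNarration(getSnapshot, narrate) {
  return setInterval(
    () => narrate(buildNarrationPrompt(getSnapshot())),
    NARRATE_INTERVAL_MS
  );
}
```

Clicking a data point would reuse the same prompt builder with that single record, which is how the contextual per-point analysis stays consistent with the periodic narration.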
Built with Three.js and zero server-side dependencies — no framework, no build step. All data is fetched directly from public APIs in your browser. The correlation engine watches for patterns: geomagnetic storms, seismic clusters, volcanic-seismic coupling, and solar flare impacts. Planet Brain runs entirely client-side via WebGPU — if your browser doesn't support it, the rule-based correlation engine provides narration instead.
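One of the rules above, volcanic-seismic coupling, can be sketched as a proximity check in space and time. The 100 km / 24 h thresholds and all names here are illustrative assumptions, not the engine's actual parameters:

```javascript
// Sketch of one correlation rule: flag volcanic-seismic coupling when a
// quake occurs within maxKm of a volcano report inside a maxMs window.
const EARTH_RADIUS_KM = 6371;

// Great-circle distance between two { lat, lon } points in kilometres.
function haversineKm(a, b) {
  const rad = (d) => (d * Math.PI) / 180;
  const dLat = rad(b.lat - a.lat);
  const dLon = rad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(a.lat)) * Math.cos(rad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(h));
}

function volcanicSeismicCoupling(quakes, volcanoes, maxKm = 100, maxMs = 86_400_000) {
  const hits = [];
  for (const q of quakes) {
    for (const v of volcanoes) {
      if (Math.abs(q.time - v.time) <= maxMs && haversineKm(q, v) <= maxKm) {
        hits.push({ quake: q, volcano: v });
      }
    }
  }
  return hits;
}
```

The other rules (geomagnetic storms, seismic clusters, flare impacts) would follow the same shape: a pure predicate over two feed snapshots, which is what lets the engine double as the narration fallback.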