MMGIS is one server, one database, and three browser apps — held together by a shared mission configuration file. Strip out the server-only responsibilities (auth, persistence, real-time collaboration, admin editing) and you're left with a static map app.
MMGIS, the short version
MMGIS is NASA's open-source mapping app for planetary science missions — Mars rovers, lunar ops, that sort of thing. Think of it as Google Maps, but for Mars, with a bunch of mission-specific tools layered on top: drawing routes, measuring distances, taking measurements off elevation models, comparing imagery over time.
It's been around long enough to have accumulated a lot of capability, and as a result the codebase is bigger than it looks. The mental model below is the smallest one you can hold and still understand what each piece is doing.
The seven boxes
Think of MMGIS as one server and its database, with three browser apps in front, plus optional sidecar services and an offline toolbox.
The server (one Node.js process — a single running program that handles incoming requests) is the center. It does five things:
- Serves the browser apps as static files (HTML, JavaScript, CSS sent down to the browser).
- Holds the mission's data — layers, drawn features, user accounts — in a Postgres database (a long-running database program that stores and queries records).
- Exposes that data over an HTTP API (a set of URLs the browser can call to read or write data, getting JSON back).
- Pushes live updates (for collaborative drawing) over a WebSocket (a persistent two-way connection between browser and server, instead of one request-then-response at a time).
- Optionally forwards certain requests to Python services that handle map tiles and catalogs. (Forwarding a request like this is called proxying — the server receives a request and passes it to a different service behind the scenes, then returns that service's answer.)
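To make those five responsibilities concrete, here is a hypothetical sketch of the server's routing surface. The URL prefixes are illustrative, not the real MMGIS routes:

```javascript
// Classify an incoming request by which server responsibility handles it.
// Prefixes here are assumptions for illustration only.
function routeKind(url) {
  if (url.startsWith('/api/')) return 'json-api';    // read/write mission data
  if (url.startsWith('/titiler/')) return 'proxy';   // forwarded to a Python sidecar
  if (url.startsWith('/socket')) return 'websocket'; // live collaborative edits
  return 'static';                                   // HTML/JS/CSS for the three apps
}
```

The point is that one process fronts all of it: the browser only ever sees one origin, and the server decides per-request whether to answer directly or hand off.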
The three browser apps are:
- Essence — the main map app. The 2D map, the 3D globe, the side panel with tools. This is what mission users actually look at all day.
- Configure — a separate admin app at /configure. Mission ops use it to set up the map: which layers exist, where the data lives, who can log in. It's a totally different codebase from Essence; it just shares the same database through the server.
- A documentation site — a Jekyll-built docs site. Not very interesting.
The optional Python sidecar services are tile servers and catalog servers — TiTiler, STAC, tipg. They're external open-source projects that do tile-serving better than Node would. MMGIS spawns them as separate processes when enabled, and the server forwards browser requests through to them so the browser only ever talks to one origin.
The offline toolbox is a folder of standalone scripts (mostly Python wrapping GDAL) that take raw mission imagery and chop it into the tiled, indexed format MMGIS expects. These are run by hand on a workstation, never at runtime. They produce the files that the runtime then serves.
What flows between the boxes
A few flows are worth holding in your head, because they're where a refactor lands.
The mission configuration is a JSON blob in the database. Configure writes it. Essence reads it at startup. Everything about how the map looks — layers, tools, colors, defaults — comes out of this file.
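An illustrative shape for that config, to make "everything comes out of this file" concrete. The field names here are assumptions, not the real MMGIS schema:

```javascript
// Hypothetical mission configuration, in the spirit of what Configure
// writes and Essence reads at startup.
const missionConfig = {
  mission: 'MSL',
  defaultView: { lat: -4.59, lon: 137.44, zoom: 12 },
  layers: [
    { name: 'Basemap', type: 'tile' },
    { name: 'Traverse', type: 'vector' },
  ],
  tools: ['Draw', 'Measure'],
};

// Essence would read something like this to build its layer list:
const layerNames = missionConfig.layers.map((l) => l.name);
```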
Map tile data lives either on disk (served as static files by the Node server) or behind one of the Python tile services (served on demand from cloud-optimized GeoTIFFs).
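A sketch of how a frontend might pick between those two tile sources per layer. Paths and query parameters are illustrative, not MMGIS's real URL scheme:

```javascript
// Resolve a tile URL: on-demand COG tiles via the proxied Python sidecar,
// or pre-cut tiles served as static files. Both paths are assumptions.
function tileUrl(layer, z, x, y) {
  if (layer.cog) {
    // cut on demand from a cloud-optimized GeoTIFF by the tile sidecar
    return `/titiler/tiles/${z}/${x}/${y}?url=${encodeURIComponent(layer.cog)}`;
  }
  // pre-cut pyramid sitting on disk
  return `/tiles/${layer.name}/${z}/${x}/${y}.png`;
}
```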
User-drawn features (annotations, traverse plans, etc.) live in Postgres and are pushed to other connected clients in real time via the WebSocket.
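The real-time push is a fan-out: when one client saves an edit, the server sends it to every other client with the same mission open. A minimal sketch, with plain objects standing in for WebSocket connections:

```javascript
// Deliver a drawing edit to every other client on the same mission.
// Client objects are stand-ins for real WebSocket connections.
function broadcast(clients, sender, edit) {
  const message = JSON.stringify(edit);
  const delivered = [];
  for (const c of clients) {
    if (c !== sender && c.mission === sender.mission) {
      c.send(message); // in real code, a WebSocket send
      delivered.push(c.id);
    }
  }
  return delivered;
}
```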
Authentication is a cookie-based login for humans, plus long-lived bearer tokens for scripts. Both are checked by the server on every request.
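The two auth paths can be sketched as one check: try the bearer token first, fall back to the session cookie. Names and shapes here are illustrative, not MMGIS's real middleware:

```javascript
// Hypothetical per-request auth check: a session cookie identifies a human,
// a long-lived bearer token identifies a script. Returns null if neither.
function authenticate(req, sessions, apiTokens) {
  const auth = (req.headers && req.headers.authorization) || '';
  if (auth.startsWith('Bearer ')) {
    return apiTokens.has(auth.slice('Bearer '.length)) ? 'script' : null;
  }
  const session = (req.cookies && req.cookies.session) || '';
  return sessions.has(session) ? 'human' : null;
}
```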
Why this matters for a refactor
If you're thinking about something like a "static mode" deployment — a build that drops a self-contained map onto S3 or CloudFront with no Node server, no database, no auth — you're essentially asking which of the seven boxes you can throw away:
- The server disappears entirely.
- Configure disappears (or stays alive only on the authoring deployment).
- The database disappears.
- Auth and sessions disappear.
- The WebSocket and collaborative drawing disappear.
- The Python sidecar services stay alive, but the static frontend points at them by external URL instead of going through the Node server's proxy.
- The offline toolbox stays alive — it still produces the static tiles and assets.
What's left is the frontend bundle, the mission config JSON (baked in at build time), and the static tile/asset directory. The reading order in this tutorial is designed to make the dependencies between those pieces obvious so you can see what comes out cleanly and what doesn't.
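Two of the decisions a hypothetical static build has to make can be sketched directly: where the config comes from, and how sidecar requests reach their service with no Node proxy in the middle. All names and URLs below are assumptions:

```javascript
// Where the frontend loads its mission config from: a baked-in file on the
// CDN in static mode, or the server's config API otherwise.
function configSource(staticMode) {
  return staticMode ? './config/mission.json' : '/api/configure/get';
}

// Without the Node proxy, sidecar requests must go to an external origin.
function resolveService(staticMode, path, sidecarOrigin) {
  if (staticMode && path.startsWith('/titiler/')) {
    return sidecarOrigin + path;
  }
  return path;
}
```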
Configure — A separate browser app, at the URL /configure, used by mission administrators to set up missions. It doesn't actually render a map — it's a structured form editor for the JSON file that describes the map. Essence reads that file; Configure writes it. They are two different codebases that only share a database.
WebSocket — A persistent two-way connection between browser and server. Unlike a normal HTTP request, the connection stays open, and either side can send messages at any time. MMGIS uses one WebSocket to broadcast drawing edits to everyone else who has the same mission open.
Essence — The main MMGIS browser app. It's what users see and interact with — the 2D map, the 3D globe, the image viewer, and the side panel of tools. "Essence" is just the codename for it. Most of this tutorial is really about Essence and the server that feeds it.
Mission — A single mission's bundle of settings: which layers exist, where their data lives, which tools are turned on, what the default view should be. Mars and the Moon are different missions, and so are different rovers. The mission is the unit of personalization for MMGIS — when you open the site, you're opening it for a specific mission, and the URL says which one.
STAC — A standard format for catalogs of geospatial assets — "here are 500 images of this region, each with these metadata fields, taken at these times." MMGIS can talk to an optional STAC service to browse and serve catalogs.
Components
- Frontend (the map app) — The thing users actually see: a 2D map, a 3D globe, an image viewer, plus tools that hang off the side panel. Loaded fresh in the browser every time someone visits the page.
- Backend (the server) — One Node process that hosts everything server-side: serves the SPA, exposes the data API, holds sessions and auth, pushes real-time updates, and proxies to optional Python services.
- Configure (the admin app) — A separate admin app at /configure for editing the mission's JSON configuration. Doesn't run the map, just edits the file that describes the map.
- Build & dev server — How the frontend gets bundled into a browser-ready blob, and the quirks of running it in development (two ports, a code-generation step that runs before the bundler).
- The Python sidecar services — Optional tile and catalog servers (TiTiler, STAC, tipg) that MMGIS proxies through. Best-of-breed Python code MMGIS chose not to reimplement.
- Offline data-prep scripts — A toolbox of standalone scripts that turn raw mission data into the tiled, indexed assets MMGIS expects. Run by humans on a workstation, never at runtime.
- Testing — One test runner (Playwright) that covers both fast unit tests and full browser end-to-end tests. The same setup, just with or without a real server running.
Reference
Cross-cutting addenda — vocabulary, named abstractions, decisions, seams.
- Glossary — The terms MMGIS prose uses, defined in plain English.
- Cast of characters — The recurring named things in MMGIS, framed as actors with roles.
- Decisions — The non-obvious choices behind the architecture, and why they matter.
- Seams — The boundaries between parts of the system — the places a refactor has to cross.