GeoSANE is a geospatial model foundry that operates directly in model-weight space. Instead of training a downstream model from scratch, GeoSANE learns a shared latent representation over a population of pretrained remote sensing models and uses that representation to generate new model candidates for a target architecture. These generated models can then be evaluated and fine-tuned on downstream tasks.
This repository provides a demo project for running GeoSANE on a downstream remote sensing benchmark. The included notebook walks through the full evaluation pipeline for TIMM backbones on the Sen1Floods11 segmentation task: preparing the downstream dataset, loading a trained GeoSANE checkpoint, generating model candidates, and fine-tuning and saving the resulting checkpoint.
- `geosane-demo.ipynb`: end-to-end demo notebook
- `shrp/`: the core SHRP library used by GeoSANE for weight tokenization, latent sampling, model reconstruction, evaluation, and fine-tuning
- `downstream_datasets/`: downstream benchmark loaders, including Sen1Floods11, SpaceNet, EuroSAT, DIOR, fMoW, and others
- `requirements.txt`: broad dependency list for setting up an environment
- `requirements-lock.txt`: pinned versions from a working environment
- `anchor_tokenized/`: cached tokenized anchor-model datasets generated during evaluation
- `checkpoints/`: fine-tuned model checkpoints written during notebook runs
Create an environment and install dependencies:
```
python -m venv .venv
source .venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
```

If you want to reproduce the exact environment used for this demo as closely as possible, use:

```
pip install -r requirements-lock.txt
```

Note: the lock file includes environment-specific PyTorch builds. You may need to install a compatible torch / torchvision pair first and then install the remaining dependencies.
The demo notebook is currently configured for:
- task: segmentation
- downstream dataset: Sen1Floods11
- generated backbone prompt: TIMM backbones such as `swin_s3_base_224.ms_in1k`
The downstream dataset file created by the notebook is:
`data/sen1flood11_preprocessed.pt`
This file is constructed from:
```python
trainset = Sen1Floods11HandLabeledDataset(split="train", resize_to=(224, 224))
testset = Sen1Floods11HandLabeledDataset(split="val", resize_to=(224, 224))
```

The notebook expects a trained GeoSANE run directory. The configured `model_path` must point to a directory that contains at least:
- `params.json`
- `checkpoint_000300/state.pt`
In the notebook, this is configured through:
```python
EXPERIMENT = TimmExperimentConfig(
    model_path=Path("/path/to/your/geosane_run"),
    checkpoint_rel_path=Path("checkpoint_000300/state.pt"),
    ...
)
```

After downloading or copying the checkpoint directory, update `model_path` in `geosane-demo.ipynb` to your local path.
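Before launching the notebook, it can help to verify that the run directory actually contains the required files. A minimal sketch; `check_run_dir` is a hypothetical helper for illustration, not part of this repository:

```python
from pathlib import Path

def check_run_dir(model_path: Path, checkpoint_rel_path: Path) -> None:
    """Raise early if the GeoSANE run directory is missing required files.

    Checks for the two files the notebook expects: params.json at the
    directory root and the checkpoint state file at the given relative path.
    """
    params = model_path / "params.json"
    checkpoint = model_path / checkpoint_rel_path
    missing = [str(p) for p in (params, checkpoint) if not p.is_file()]
    if missing:
        raise FileNotFoundError(f"Run directory incomplete, missing: {missing}")

# Example (adjust to your local path):
# check_run_dir(Path("/path/to/your/geosane_run"), Path("checkpoint_000300/state.pt"))
```

Failing fast here is cheaper than discovering a missing checkpoint halfway through the notebook.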
Running the notebook produces several artifacts:
- `data/sen1flood11_preprocessed.pt`: preprocessed downstream dataset
- `anchor_tokenized/`: tokenized anchor-model datasets used during sampling and evaluation
- `checkpoints/`: fine-tuned model checkpoints saved during evaluation
- `model_path/notebook_eval_results/`: JSON evaluation outputs
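To inspect the JSON evaluation outputs after a run, a minimal sketch is shown below. It assumes one JSON document per file under `notebook_eval_results/`; the directory path and result schema are assumptions, not guarantees about the notebook's output format:

```python
import json
from pathlib import Path

def load_eval_results(results_dir: Path) -> dict:
    """Read every *.json file in the results directory, keyed by file stem."""
    results = {}
    for path in sorted(results_dir.glob("*.json")):
        results[path.stem] = json.loads(path.read_text())
    return results

# Example (adjust to your run directory):
# results = load_eval_results(Path("/path/to/your/geosane_run/notebook_eval_results"))
```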
If you would like to cite our work, please use the following reference:
- Hanna, Joelle, Damian Falk, Stella X. Yu, and Damian Borth. "GeoSANE: Learning Geospatial Representations from Models, Not Data." In *Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)*, 2026.
This repository incorporates code from the following source: