15 changes: 15 additions & 0 deletions 001713/IBL-Widefield/public_demo/README.md
@@ -0,0 +1,15 @@
# IBL Widefield - Brain-wide representations of prior information in mouse decision-making

This example notebook demonstrates how to access the dataset published at [DANDI:001713](https://dandiarchive.org/dandiset/001713/draft).

This study provides a brain-wide survey of how prior information is represented during mouse decision-making. Using a standardized task, the International Brain Laboratory (IBL) recorded activity across multiple brain regions while mice performed a perceptual decision-making task in which the probability of the stimulus appearing on the left or right changed across blocks. The dandiset contains the raw widefield imaging data used to analyze how prior expectations about the environment are represented across the cortex.

## Installing the dependencies

```bash
git clone https://github.com/dandi/example-notebooks
cd example-notebooks/001713/IBL-Widefield
conda env create --file environment.yml
conda activate 001713_demo
```
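
With the environment activated, the notebook can be opened with Jupyter. As a quick orientation, the sketch below shows how the `load_nwb_from_dandi` helper in `load_nwb_utils.py` streams one session from the archive; the subject, session, and description identifiers are placeholders and should be replaced with values from the dandiset's asset listing.

```python
# Minimal sketch: stream one NWB file from DANDI:001713 with the helper in
# load_nwb_utils.py. The identifiers below are placeholders, not real asset names.
from load_nwb_utils import load_nwb_from_dandi

nwbfile, io = load_nwb_from_dandi(
    dandiset_id="001713",
    subject_id="<subject>",        # replace with a subject ID from the asset paths
    session_id="<session>",        # replace with a session ID from the asset paths
    description="<description>",   # replace with the desc- label in the file name
)
print(nwbfile.session_description)
io.close()  # close the streamed file when finished
```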

1,932 changes: 1,932 additions & 0 deletions 001713/IBL-Widefield/public_demo/anatomical_localization_widefield.ipynb

Large diffs are not rendered by default.

15 changes: 15 additions & 0 deletions 001713/IBL-Widefield/public_demo/environment.yml
@@ -0,0 +1,15 @@
name: 001713_demo
channels:
  - conda-forge
dependencies:
  - python==3.13
  - ipywidgets
  - pip
  - pip:
      - dandi
      - jupyter
      - matplotlib
      - pynwb
      - remfile
      - plotly
      - ibllib
55 changes: 55 additions & 0 deletions 001713/IBL-Widefield/public_demo/load_nwb_utils.py
@@ -0,0 +1,55 @@
# File path handling
from pathlib import Path

# Low-level HDF5 access (NWB files are HDF5 under the hood)
import h5py

# DANDI archive access, remote streaming, and NWB reading
import remfile
from dandi.dandiapi import DandiAPIClient
from pynwb import NWBHDF5IO


def load_nwb_from_dandi(dandiset_id, subject_id, session_id, description):
    """
    Load an NWB file from the DANDI Archive by streaming it over HTTP.

    Returns the NWBFile object and the open NWBHDF5IO handle.
    """
    pattern = f"sub-{subject_id}/sub-{subject_id}_ses-{session_id}_desc-{description}*.nwb"

    with DandiAPIClient() as client:
        client.dandi_authenticate()
        assets = client.get_dandiset(dandiset_id, "draft").get_assets_by_glob(pattern=pattern, order="path")

        # Resolve the S3 URL of every asset matching the pattern
        s3_urls = []
        for asset in assets:
            s3_url = asset.get_content_url(follow_redirects=1, strip_query=False)
            s3_urls.append(s3_url)

    if len(s3_urls) != 1:
        raise ValueError(f"Expected 1 file, found {len(s3_urls)} for pattern {pattern}")

    s3_url = s3_urls[0]

    # Stream the remote file and open it with PyNWB without downloading it
    file = remfile.File(s3_url)
    h5_file = h5py.File(file, "r")
    io = NWBHDF5IO(file=h5_file, load_namespaces=True)
    nwbfile = io.read()

    return nwbfile, io


def load_nwb_local(directory_path, subject_id, session_id, description):
    """
    Load an NWB file from a local directory that mirrors the archive layout.

    Returns the NWBFile object and the open NWBHDF5IO handle.
    """
    directory_path = Path(directory_path)
    pattern = f"sub-{subject_id}/sub-{subject_id}_ses-{session_id}_desc-{description}*.nwb"
    matches = list(directory_path.glob(pattern))
    if len(matches) != 1:
        raise ValueError(f"Expected 1 file, found {len(matches)} for pattern {pattern}")
    nwbfile_path = matches[0]

    io = NWBHDF5IO(path=nwbfile_path, load_namespaces=True)
    nwbfile = io.read()

    return nwbfile, io
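
A hedged usage sketch for `load_nwb_local`, assuming the dandiset has already been downloaded (for example with the DANDI CLI) so that the local directory mirrors the archive layout `sub-<subject>/sub-<subject>_ses-<session>_desc-<description>*.nwb`; all identifiers below are placeholders.

```python
# Hypothetical usage of load_nwb_local. Assumes the files were downloaded in
# advance and keep the sub-<subject>/... directory layout used on the archive.
from load_nwb_utils import load_nwb_local

nwbfile, io = load_nwb_local(
    directory_path="path/to/001713",  # placeholder: root of the downloaded dandiset
    subject_id="<subject>",
    session_id="<session>",
    description="<description>",
)
print(list(nwbfile.acquisition))  # names of the acquisition objects in the file
io.close()
```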