feat: Add catmaid spatially indexed skeletons (read-only) #119
afonsobspinto wants to merge 54 commits into master from
Conversation
Pull request overview
This PR introduces support for spatially indexed skeleton rendering in Neuroglancer, specifically adding a CATMAID data source integration for loading skeletons organized in spatial chunks.
Changes:
- Adds spatially indexed skeleton sources that load skeleton data in spatial chunks (similar to volume rendering) with LOD support
- Implements CATMAID data source provider for fetching skeleton data from CATMAID servers
- Adds UI controls for hidden object opacity and skeleton LOD level
Reviewed changes
Copilot reviewed 21 out of 21 changed files in this pull request and generated 7 comments.
| File | Description |
|---|---|
| src/skeleton/skeleton_chunk_serialization.ts | New utility module for serializing skeleton chunk data for transfer between backend and frontend |
| src/skeleton/gpu_upload_utils.ts | New utility for uploading vertex attribute data to GPU as 1D textures |
| src/skeleton/frontend.ts | Extended with new spatially indexed skeleton layer classes and rendering logic |
| src/skeleton/base.ts | Added RPC IDs for spatially indexed skeleton layers |
| src/skeleton/backend.ts | Added backend implementation for spatially indexed skeleton chunks with LOD support |
| src/segmentation_display_state/frontend.ts | Added hiddenObjectAlpha field for controlling opacity of non-visible segments |
| src/layer/segmentation/layer_controls.ts | Added UI controls for hidden opacity and LOD |
| src/layer/segmentation/json_keys.ts | Added JSON keys for new configuration options |
| src/layer/segmentation/index.ts | Integrated spatially indexed skeleton layers into segmentation user layer |
| src/datasource/index.ts | Extended DataSubsource type to include new skeleton source types |
| src/datasource/graphene/frontend.ts | Fixed type checking for meshSource RPC access |
| src/datasource/enabled_frontend_modules.ts | Registered CATMAID frontend module |
| src/datasource/enabled_backend_modules.ts | Registered CATMAID backend module |
| src/datasource/catmaid/register_default.ts | CATMAID provider registration |
| src/datasource/catmaid/register_credentials_provider.ts | CATMAID credentials provider registration |
| src/datasource/catmaid/frontend.ts | Frontend implementation for CATMAID data source |
| src/datasource/catmaid/credentials_provider.ts | Credentials provider for CATMAID authentication |
| src/datasource/catmaid/base.ts | Base parameter classes for CATMAID sources |
| src/datasource/catmaid/backend.ts | Backend implementation for CATMAID skeleton fetching |
| src/datasource/catmaid/api.ts | CATMAID API client implementation |
| package.json | Added package.json exports for CATMAID modules |
Comments suppressed due to low confidence (1)
src/datasource/catmaid/frontend.ts:1
- The conditional logic for determining sources is inconsistent. If `mesh instanceof CatmaidMultiscaleSpatiallyIndexedSkeletonSource` is true, passing `allSources[0]` (an array) is incorrect, as `SpatiallyIndexedSkeletonLayer` expects a single source or an array of `SliceViewSingleResolutionSource`. This should pass `allSources[0][0].chunkSource` in both cases, or the constructor signature should be adjusted.
Pull request overview
Copilot reviewed 26 out of 26 changed files in this pull request and generated 9 comments.
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
seankmartin left a comment
Reviewed the new catmaid datasource and the skeletons API; looks good! I will review the parts that are more intertwined with the rest of Neuroglancer later, after I experiment with the Neuroglancer build, since I think it will be clearer after that.
      [key: string]: any;
    }

    export interface CatmaidStackInfo {
Do we use all these optional properties in some cases or have docs we can link to about them? For e.g. I've seen orientation commonly as 0. What does it mean if non-zero, do we need to reject it? If we never use some of these optional properties we could likely remove them
    }

    export const credentialsKey = "CATMAID";
    const DEFAULT_CACHE_GRID_CELL_WIDTH = 25000;
These could maybe benefit from a comment about what using the default grid would mean (I think a single LOD with chunks of this shape?).
Also, is this default coming from CATMAID or decided by us? I wonder if we could/should suggest slightly more isotropic chunk shapes if we are picking these sizes.
As an aside, we likely need to write more detailed docs later - so on the code side I do mean something fairly brief about it
    @@ -0,0 +1,474 @@
Overall this catmaid API looks good to me! Very small points:
- Missing license at the top of the file.
- In general I think a linting and formatting pass was missed. The commands are `npm run lint:fix && npm run format:fix` and `uvx nox -s lint format mypy`.

You can see on #120 that I applied these with a git ignore-revs file so we can keep the blame clean (of course we can remove the ignore-revs before making a PR against the google main branch, since there all the commits get squashed anyway).
When possible it would be good to run the linting/formatting on this and also on the editable skeletons.
    @@ -0,0 +1,57 @@
    import {

    @@ -0,0 +1,183 @@
    /**
     * @license
     * Copyright 2024 Google Inc.
The date tends to be 2024 on these files; I would expect either 2025 or 2026 depending on the file date for what we've made here. So we can do 2026.
    import { Borrowed } from "#src/util/disposable.js";
    import "#src/datasource/catmaid/register_credentials_provider.js";

    const METERS_PER_NANOMETER = 1e-9;
Would suggest just doing this inline in `coordinateScaleFactors` later in this file; `1e-9` is used across the code base like that, and your comment above where this is used is clear enough to explain the `1e-9` usage. But if you want to keep the variable, I suggest renaming it to `NANOMETER_TO_METER = 1e-9`.
    @@ -0,0 +1,9 @@
    import { registerDefaultCredentialsProvider } from "#src/credentials_provider/default_manager.js";

    const meshSource = this.getMeshSource();
    // `DataSubsource.mesh` is widened for spatial skeleton support, but Graphene
    // segment updates still target mesh sources.
    const meshSource = this.getMeshSource() as MeshSource | undefined;
I think this would be more appropriate in `getMeshSource` instead.
      (subsource) => subsource.id === "mesh",
    )[0];
    if (meshSubsource) {
      return meshSubsource.subsource.mesh;
Here is where we could cast to `MeshSource` (the comment you have on 1362 is good as to why); I just think it's better to do it here in case `getMeshSource` is used elsewhere.
As an aside, does graphene also work with multiscale mesh sources? Are we narrowing the type too much?
    const vertexAttributeTextures: (WebGLTexture | null)[] = [];
    const numAttributes = vertexAttributeOffsets.length;

    for (let i = 0; i < numAttributes; ++i) {
Leaving a note to check how many attributes there usually are; we could run into texture limits.
Closes:
# CATMAID Datasource and Spatially Indexed Skeletons
This document explains how the CATMAID datasource is wired up and how spatially
indexed skeletons are loaded, serialized, and rendered.
## CATMAID datasource implementation (`src/datasource/catmaid/`)

### Entry points and registration

`register_default.ts` registers the provider under the `catmaid` scheme. The URL format is expected to be `catmaid://<base_url>/<project_id>`.
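The documented URL format can be illustrated with a small parser. This is a hypothetical helper for illustration, not the actual provider code (which also handles details such as scheme prefixes):

```typescript
// Hypothetical parser for catmaid://<base_url>/<project_id>.
// The real provider does more (e.g. normalizing the server URL).
function parseCatmaidUrl(url: string): { baseUrl: string; projectId: number } {
  const match = url.match(/^catmaid:\/\/(.+)\/(\d+)$/);
  if (match === null) {
    throw new Error(`Invalid CATMAID URL: ${url}`);
  }
  // The greedy group keeps any path components in the base URL.
  return { baseUrl: match[1], projectId: Number(match[2]) };
}
```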
### Credentials flow

`credentials_provider.ts` defines `CatmaidCredentialsProvider`, which fetches an anonymous API token from `<serverUrl>/accounts/anonymous-api-token`. The provider is used by `fetchOkWithCredentials` in the API client to retry on 401/403 and request a refreshed token when needed.
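The retry-on-401/403 flow can be sketched as follows. All names here are illustrative (this is not the actual `fetchOkWithCredentials` signature), and the fetch/token functions are injected so the sketch stays self-contained:

```typescript
// Minimal sketch of the credentials retry flow, assuming a token fetcher
// and a fetch function are supplied by the caller.
type TokenFetcher = () => Promise<string>;
type Fetcher = (url: string, token: string) => Promise<{ status: number }>;

async function fetchWithTokenRetry(
  url: string,
  getToken: TokenFetcher,
  doFetch: Fetcher,
): Promise<{ status: number }> {
  let token = await getToken();
  let response = await doFetch(url, token);
  if (response.status === 401 || response.status === 403) {
    // The token may have expired; request a fresh one and retry once.
    token = await getToken();
    response = await doFetch(url, token);
  }
  return response;
}
```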
### API client (`api.ts`)

`CatmaidClient` wraps all CATMAID endpoints used by the data source. Key operations:

- `listSkeletons()` -> list of skeleton IDs
- `getSkeleton(skeletonId)` -> full skeleton as nodes from `skeletons/<id>/compact-detail`
- `fetchNodes(boundingBox, lod)` -> spatially indexed nodes from `node/list` using msgpack; supports multi-LOD data in the response
- `getDimensions()` and `getResolution()` -> stack metadata, including voxel size and translation
- `getGridCellSizes()` -> grid cache configurations (from a custom metadata object); defaults to the hard-coded grid size if none is present
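The default-grid fallback in `getGridCellSizes()` can be sketched like this. The width of 25000 matches `DEFAULT_CACHE_GRID_CELL_WIDTH` elsewhere in the PR, but the metadata shape and function name here are assumptions:

```typescript
// Sketch of the fallback: if the server metadata supplies no grid cell
// sizes, use a single hard-coded cubic grid cell.
const DEFAULT_CACHE_GRID_CELL_WIDTH = 25000;

function resolveGridCellSizes(
  gridCellSizes: number[][] | undefined,
): number[][] {
  if (gridCellSizes === undefined || gridCellSizes.length === 0) {
    return [[
      DEFAULT_CACHE_GRID_CELL_WIDTH,
      DEFAULT_CACHE_GRID_CELL_WIDTH,
      DEFAULT_CACHE_GRID_CELL_WIDTH,
    ]];
  }
  return gridCellSizes;
}
```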
### Provider wiring (`frontend.ts`)

`CatmaidDataSourceProvider.get()` resolves the source, builds the coordinate space, and creates the sub-sources:

1. Parse the `catmaid://` URL, adding an `http` prefix where needed.
2. Create a `CatmaidClient` and fetch metadata in parallel: `dimensions`, `resolution`, `gridCellSizes`, `skeletonIds`.
3. Set up the coordinate space: convert `resolution` (nm per voxel) to meters, and use `translation` and `dimension` to establish voxel-space bounds.
4. Create the skeleton sources: `CatmaidMultiscaleSpatiallyIndexedSkeletonSource` provides a multiscale slice-view chunk source for spatially indexed skeleton data, and `CatmaidSkeletonSource` loads full skeletons by ID.
5. Create segment properties for skeleton labels: skeleton IDs are converted to `BigInt` and stored in a `SegmentPropertyMap`.
6. Return the subsources:
   - `skeletons-chunked` (spatially indexed)
   - `skeletons` (full skeletons)
   - `properties` (segment labels)
   - `bounds` (data bounds)
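The unit conversion in the coordinate space setup can be sketched as follows; the function name is hypothetical, and `1e-9` is the standard nanometer-to-meter factor used across the code base:

```typescript
// Sketch of converting a CATMAID resolution (nanometers per voxel) to the
// meters-per-voxel scales a Neuroglancer coordinate space expects.
function resolutionToMeters(resolutionNm: [number, number, number]): number[] {
  return resolutionNm.map((nmPerVoxel) => nmPerVoxel * 1e-9);
}
```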
### Multiscale grid mapping

In `CatmaidMultiscaleSpatiallyIndexedSkeletonSource.getSources()`:

- Each grid cell size becomes a `ChunkLayout` and `SliceViewChunkSpecification`.
- A `chunkToMultiscaleTransform` scales each grid size relative to the smallest grid size.
- `useChunkSizeForScaleSelection = true` is set so the backend can bias scale selection based on chunk sizes (rather than voxel size alone).
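The "relative to the smallest grid size" scaling can be illustrated with a hypothetical helper (one scale factor per grid size, with the smallest grid at scale 1):

```typescript
// Sketch: express each grid cell size relative to the smallest one, as a
// chunkToMultiscaleTransform would (per-axis handling omitted for brevity).
function relativeScales(gridSizes: number[]): number[] {
  const smallest = Math.min(...gridSizes);
  return gridSizes.map((size) => size / smallest);
}
```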
## Spatially indexed skeletons (`src/skeleton/`)

### Data model and chunk serialization

The spatially indexed path uses slice-view chunking but renders as a mesh-like edge list.

Key data types:

- `SpatiallyIndexedSkeletonChunk` (frontend and backend):
  - `vertexPositions` (Float32 vec3)
  - `vertexAttributes` (packed bytes)
  - `indices` (Uint32 pairs; each edge is two vertex indices)
  - `missingConnections` (parent edges that cross chunk boundaries)
  - `nodeMap` (nodeId -> vertexIndex mapping)
- `SkeletonChunkData` serialization packs vertex positions and attributes into a single `Uint8Array` with `vertexAttributeOffsets` for GPU upload.

Serialization (`skeleton_chunk_serialization.ts`):

- `vertexAttributeOffsets` describes the byte offsets of each attribute.
- `missingConnections` and `nodeMap` are serialized alongside.
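The packing scheme can be sketched like this. The exact layout (including whether positions precede attributes) is an assumption; the sketch only shows the idea of one contiguous buffer plus per-attribute byte offsets:

```typescript
// Sketch: pack vertex positions and attribute arrays into one Uint8Array,
// recording the byte offset of each attribute for later GPU upload.
function packVertexData(
  positions: Float32Array,
  attributes: Uint8Array[],
): { data: Uint8Array; vertexAttributeOffsets: number[] } {
  const positionBytes = new Uint8Array(
    positions.buffer, positions.byteOffset, positions.byteLength);
  let total = positionBytes.byteLength;
  const vertexAttributeOffsets: number[] = [];
  for (const attr of attributes) {
    vertexAttributeOffsets.push(total);
    total += attr.byteLength;
  }
  const data = new Uint8Array(total);
  data.set(positionBytes, 0);
  attributes.forEach((attr, i) => data.set(attr, vertexAttributeOffsets[i]));
  return { data, vertexAttributeOffsets };
}
```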
### Chunk download (CATMAID backend)

`CatmaidSpatiallyIndexedSkeletonSourceBackend.download()`:

- Computes the bounding box from `chunkGridPosition` and `chunkDataSize`.
- Calls `CatmaidClient.fetchNodes(bbox, currentLod)`.
- Builds:
  - `vertexPositions`: XYZ for each node.
  - `vertexAttributes`: segment ID only (stored as a float).
  - `indices`: edge pairs from parent-child relations.
  - `nodeMap`: maps CATMAID node IDs to vertex indices.
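The bounding-box computation in `download()` can be sketched as follows. Names follow the text above, but the exact arithmetic (in particular whether a chunk-layout transform is applied) is an assumption:

```typescript
// Sketch: a chunk's grid position times its data size gives the lower
// corner of its bounding box; adding the size gives the upper corner.
function chunkBoundingBox(
  chunkGridPosition: [number, number, number],
  chunkDataSize: [number, number, number],
): { lower: number[]; upper: number[] } {
  const lower = chunkGridPosition.map((g, i) => g * chunkDataSize[i]);
  const upper = lower.map((lo, i) => lo + chunkDataSize[i]);
  return { lower, upper };
}
```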
### Backend chunk scheduling and LOD

`SpatiallyIndexedSkeletonRenderLayerBackend` tracks `skeletonLod` as a shared watchable value that updates on slider move.

`recomputeChunkPriorities()`:

- Uses `renderScaleTarget` and `effectiveVoxelSize`, or `chunkLayout.size` if `useChunkSizeForScaleSelection` is enabled.
- Iterates `forEachVisibleVolumetricChunk` and applies a scale priority offset (`SCALE_PRIORITY_MULTIPLIER`).
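The scale priority offset can be illustrated with a hypothetical helper. The multiplier value and the direction of the bias (which scale loads first) are assumptions here, not taken from the PR:

```typescript
// Sketch: bias chunk priority by scale index so chunks from one scale are
// scheduled ahead of another. The value and sign are assumptions.
const SCALE_PRIORITY_MULTIPLIER = 1e6;

function chunkPriority(basePriority: number, scaleIndex: number): number {
  // Each scale level shifts priorities by a large fixed offset, keeping
  // scales from interleaving in the download queue.
  return basePriority + scaleIndex * SCALE_PRIORITY_MULTIPLIER;
}
```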
### Rendering pipeline overview

Rendering is split into two paths:

- the regular skeleton layer (`SkeletonLayer`), and
- the spatially indexed skeleton layer (`SpatiallyIndexedSkeletonLayer`).

Both share consistent color usage and visibility filtering.
### Spatially indexed draw algorithm (per chunk, per segment)

`SpatiallyIndexedSkeletonLayer.draw()` performs per-segment rendering to ensure correct color assignment and visibility filtering:

1. Build a global node map across all loaded chunks to resolve inter-chunk parent references.
2. For each chunk:
   - Determine the visible segments (`getVisibleSegments`).
   - Choose `objectAlpha` or `hiddenObjectAlpha` based on visibility.
   - Draw with `uColor` set to the segment color.
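The per-segment alpha choice in the draw steps can be sketched as a tiny helper. The field names follow the text above; the function shape is illustrative:

```typescript
// Sketch: a visible segment is drawn at objectAlpha, a hidden one at
// hiddenObjectAlpha (segment IDs are bigint in Neuroglancer).
function segmentAlpha(
  segmentId: bigint,
  visibleSegments: Set<bigint>,
  objectAlpha: number,
  hiddenObjectAlpha: number,
): number {
  return visibleSegments.has(segmentId) ? objectAlpha : hiddenObjectAlpha;
}
```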
### Per-segment caching

Per-segment draw data is cached per chunk generation, so repeated frames can reuse GPU textures.

### Inter-chunk edges

`missingConnections` are tracked but not rendered yet.

Improvements:
1) Per-draw global node map construction
Detail: `SpatiallyIndexedSkeletonLayer.draw()` rebuilds a global node map every frame across all loaded chunks.
Risk: This scales with total loaded vertices and can become expensive.
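A possible mitigation, sketched under the assumption that chunks expose a generation counter that changes when their contents change (all shapes here are hypothetical):

```typescript
// Sketch: rebuild the global node map only when the loaded chunk set
// changes, instead of on every draw. A real implementation would also
// record which chunk each vertex index belongs to; the sum-of-generations
// cache key is a simplification.
interface ChunkLike {
  generation: number;
  nodeMap: Map<number, number>; // nodeId -> vertexIndex within the chunk
}

let cachedGeneration = -1;
let cachedMap: Map<number, number> = new Map();

function globalNodeMap(chunks: ChunkLike[]): Map<number, number> {
  const generation = chunks.reduce((sum, c) => sum + c.generation, 0);
  if (generation !== cachedGeneration) {
    cachedMap = new Map();
    for (const chunk of chunks) {
      for (const [nodeId, vertexIndex] of chunk.nodeMap) {
        cachedMap.set(nodeId, vertexIndex);
      }
    }
    cachedGeneration = generation;
  }
  return cachedMap;
}
```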
2) Per-segment compaction and buffer swapping
Detail: For each segment in each chunk, the code:
Risk: This is CPU-heavy and generates many short-lived GL buffers, which can
cause stalls and GC pressure for large skeleton sets.
3) Inter-chunk edges are not rendered
Detail: `missingConnections` are tracked, but the renderer does not draw edges that cross chunk boundaries.
Risk: Skeletons that cross chunk boundaries appear disconnected.
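Once a global node map exists, cross-chunk edges could be resolved roughly like this. The shapes are hypothetical sketches, not the PR's code; unresolved endpoints (nodes in chunks not yet loaded) are simply skipped:

```typescript
// Sketch: turn missingConnections into drawable edges by looking up both
// endpoints in a global nodeId -> vertexIndex map.
interface MissingConnection {
  childNodeId: number;
  parentNodeId: number;
}

function resolveMissingEdges(
  missing: MissingConnection[],
  globalNodeMap: Map<number, number>,
): [number, number][] {
  const edges: [number, number][] = [];
  for (const { childNodeId, parentNodeId } of missing) {
    const a = globalNodeMap.get(childNodeId);
    const b = globalNodeMap.get(parentNodeId);
    // Only emit edges whose both endpoints are currently loaded.
    if (a !== undefined && b !== undefined) edges.push([a, b]);
  }
  return edges;
}
```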