OrganoidTracker 2.0 for 3D cell tracking [paper] [website] [github]

What is this?

  • Test the performance of our pre-trained networks on your data.
  • This demo implements only the initial cell detection step, but this step is a good performance indicator for the subsequent steps.
  • Does it not work on your data? OrganoidTracker 2.0 allows for the easy creation of ground-truth datasets and the training of new neural networks.

You may need to log in or refresh the page to receive 5 minutes of free GPU compute per day.

Detection model (with resolutions)

Will add more models later!

Click on an example to try it:

Notes:

  • You can load and process 3D TIFF files in the following dimensions: (T),Z,(C),Y,X. We automatically pick the first timepoint (see the sketch after these notes).
  • Without GPU access, cell detection might take ~30 seconds.
  • Locally, OrganoidTracker will run faster: ~2 seconds per frame on a dedicated GPU, ~10 seconds on a CPU.
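
A minimal sketch of the timepoint handling, assuming the file can be read with the tifffile library; the helper name and file name are illustrative, not the demo's internal code:

```python
# Hedged sketch (not the demo's internal code): load a 3D TIFF whose axes
# may be any of (T),Z,(C),Y,X and keep only the first timepoint.
import numpy as np
import tifffile

def load_first_timepoint(path: str) -> np.ndarray:
    # Read the image together with its axis order, e.g. "TZCYX" or "ZYX".
    with tifffile.TiffFile(path) as tif:
        series = tif.series[0]
        image = series.asarray()
        axes = series.axes
    if "T" in axes:
        # Keep only the first timepoint, as the demo does.
        image = image.take(0, axis=axes.index("T"))
    return image

stack = load_first_timepoint("example.tif")  # placeholder file name
print(stack.shape)  # remaining axes, e.g. (Z, C, Y, X) or (Z, Y, X)
```
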
Caveats:

  • For this demo, an aggressive background subtraction step is applied before prediction, which we find benefits most use cases (a sketch of one such step follows these caveats). For transparency, users have to preprocess the data themselves in OrganoidTracker 2.0.
  • Because of incompatibilities between TensorFlow and Hugging Face, the models here are trained with the upcoming PyTorch version of OrganoidTracker (currently in beta). There might be performance differences compared to the TensorFlow versions presented in our paper.
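
The demo's exact preprocessing is not spelled out here; the sketch below shows one common form of aggressive background subtraction (subtracting a heavily blurred copy of the stack), where the sigma value is our illustrative assumption, not the demo's setting:

```python
# Hedged sketch of an aggressive background subtraction; the demo's exact
# method and parameters are not specified, so treat sigma as a placeholder.
import numpy as np
from scipy.ndimage import gaussian_filter

def subtract_background(stack: np.ndarray, sigma: float = 20.0) -> np.ndarray:
    # Estimate the slowly varying background with a heavy Gaussian blur...
    background = gaussian_filter(stack.astype(np.float32), sigma=sigma)
    # ...subtract it, and clip negative values to zero.
    return np.clip(stack.astype(np.float32) - background, 0.0, None)
```
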
References:

  • The blastocyst sample data is taken from the BlastoSPIM dataset (Nunley et al., Development, 2024): [website], [paper]
  • The C. elegans sample data is taken from the Cell Tracking Challenge (Murray et al., Nature Methods, 2008): [website], [paper]