Working on a new point tracking method or need a faster alternative?
Check out LiteTracker, now with results on natural, non-surgical benchmarks posted in the repository!
It remains highly competitive at a fraction of the latency.
github.com/ImFusionGmbH...
Posts by Mert Karaoglu
arXiv: arxiv.org/abs/2504.09904
Code: github.com/ImFusionGmbH/lite-tracker
Many thanks to all collaborators at @imfusion.com and CAMP: Wenbo Ji, Ahmed Abbas, Nassir Navab, @busambenjamin.bsky.social, and Alexander Ladikos.
See you all in Daejeon!
#MICCAI2025
Key highlights:
- Frame-by-frame, low-latency tracking on commercial GPUs
- >7× faster than its predecessor and >2× faster than the current SOTA
- Compatible with CoTracker3 weights, so no retraining is needed
- High-accuracy tracking & occlusion prediction on STIR and SuPer
It reframes tissue tracking as a long-term point tracking problem, extending the CoTracker3 architecture with a set of training-free runtime optimizations, tackling the latency-accuracy tradeoff head-on.
LiteTracker has been accepted to MICCAI 2025!
LiteTracker is a low-latency tissue tracking method designed for real-time surgical applications.
STIR dataset: arxiv.org/abs/2309.16782
Baseline results from last year's challenge:
- arxiv.org/abs/2503.24306
- arxiv.org/abs/2504.09904
Many thanks to our sponsors @imfusion.com and Intuitive for their support, and to the MICCAI Society and EndoVis for hosting!
Details on the submission, tasks, and prizes: stir-challenge.github.io
Dataset from last year's challenge: stir-challenge.github.io//stirc-2024/
#MICCAI2025 #EndoVis #STIRChallenge
STIR Challenge returns for MICCAI 2025!
We're inviting researchers to submit their point tracking methods on our challenging surgical tissue tracking benchmark.
If you're working on robust and efficient point tracking, this is a great opportunity to benchmark and compete.
Shared first authorship with Nicolas Schischka and @mertkaraoglu.bsky.social; huge thanks to the entire team for the incredible collaboration and hard work!
#ICRA2025 #RA_L #ComputerVision #NeRF #Robotics #CameraLocalization #DynaMoN #AI #MIRMI #TUM #FAU
Exciting news! Check out DynaMoN, our new paper on motion-aware, robust camera localization for dynamic neural radiance fields, now accepted to IEEE RA-L!
Huge thanks to the team for the amazing effort!
Project page: hannahhaensen.github.io/DynaMoN/
@imfusion.com #TUM #CAMP #MIRMI #FAU