CholecTriplet2022: Show me a tool and tell me the triplet – an endoscopic vision challenge for surgical action triplet detection

by Chinedu Innocent Nwoye, et al.

Formalizing surgical activities as triplets of the instruments used, actions performed, and target anatomies is becoming a gold-standard approach to surgical activity modeling. This formalization yields a more detailed understanding of tool-tissue interaction, which can be used to develop better artificial intelligence assistance for image-guided surgery. Earlier efforts, including the CholecTriplet challenge introduced in 2021, brought together techniques for recognizing these triplets from surgical footage. Estimating the spatial locations of the triplets as well would offer more precise intraoperative context-aware decision support for computer-assisted intervention. This paper presents the CholecTriplet2022 challenge, which extends surgical action triplet modeling from recognition to detection. It includes weakly-supervised bounding-box localization of every visible surgical instrument (or tool), as the key actor, and the modeling of each tool activity as an <instrument, verb, target> triplet. The paper describes a baseline method and 10 new deep learning algorithms presented at the challenge to solve the task. It also provides thorough methodological comparisons of the methods, an in-depth analysis of the obtained results and their significance, and useful insights for future research directions and applications in surgery.
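The detection task described above pairs each <instrument, verb, target> triplet with a bounding box for the instrument. A minimal sketch of such a detection record is shown below; the field names, class labels, and box convention are illustrative assumptions, not the challenge's official data format:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TripletDetection:
    """One surgical action triplet detection with weak tool localization.

    Field names and the normalized (x, y, w, h) box convention are
    hypothetical choices for illustration only.
    """
    instrument: str                          # e.g. "grasper"
    verb: str                                # e.g. "retract"
    target: str                              # e.g. "gallbladder"
    box: Tuple[float, float, float, float]   # normalized (x, y, w, h) of the instrument
    score: float                             # detection confidence in [0, 1]

    def as_triplet(self) -> str:
        return f"<{self.instrument}, {self.verb}, {self.target}>"

# Example detection for a single video frame
det = TripletDetection("grasper", "retract", "gallbladder",
                       (0.42, 0.31, 0.18, 0.22), 0.87)
print(det.as_triplet(), "@", det.box)
```

Representing the instrument box alongside the triplet, rather than boxes for all three components, mirrors the challenge's choice of the instrument as the key actor to localize.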




