JIGSAWS: The JHU-ISI Gesture and Skill Assessment Working Set


From left to right: Suturing, Knot-Tying, Needle-Passing.

Announcement

1/4/2017  Our paper reporting comparative results of eight techniques on the JIGSAWS dataset is now available as a preprint in IEEE Transactions on Biomedical Engineering (TBME). See the Citations section below.

3/25/2016 Sample videos of the JIGSAWS tasks can be downloaded here: Suturing, Knot-Tying, Needle-Passing.

2/6/2016 We are experiencing a technical glitch with the download script. Please email us ([email protected]) to request an alternative download link. Thank you for your understanding.

3/24/2015 Download packages updated with full global rating scores (GRS)!

Introduction

The JHU-ISI Gesture and Skill Assessment Working Set (JIGSAWS) is a surgical activity dataset for human motion modeling. The data was collected through a collaboration between The Johns Hopkins University (JHU) and Intuitive Surgical, Inc. (ISI; Sunnyvale, CA) within an IRB-approved study, and the release of this dataset has been approved by the Johns Hopkins University IRB. The dataset was captured using the da Vinci Surgical System from eight surgeons with different levels of skill performing five repetitions each of three elementary surgical tasks on a bench-top model: suturing, knot-tying, and needle-passing, which are standard components of most surgical skills training curricula. The JIGSAWS release consists of three data components plus a standardized experimental setup:

  • kinematic data: Cartesian positions, orientations, velocities, angular velocities, and gripper angle describing the motion of the manipulators (see the loading sketch after this list).
  • video data: stereo video captured from the endoscopic camera. Sample videos of the JIGSAWS tasks can be downloaded here: Suturing, Knot-Tying, Needle-Passing.
  • manual annotations, including:
    • gesture: atomic surgical activity segment labels.
    • skill: global rating scores from a modified objective structured assessment of technical skills (OSATS).
  • experimental setup: a standardized cross-validation setup for evaluating automatic surgical gesture recognition and skill assessment methods; our group has been using it to evaluate and compare the methods we develop (see the fold-generation sketch below).
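To make the kinematic and annotation formats concrete, below is a minimal Python sketch for loading one trial. It assumes the 76-column kinematic layout (four 19-variable blocks for the left/right master and left/right patient-side manipulators, each block being 3 Cartesian-position, 9 rotation-matrix, 3 linear-velocity, 3 angular-velocity, and 1 gripper-angle values) and "start_frame end_frame label" transcription lines; the file paths are hypothetical placeholders, so verify the exact column order against the documentation shipped with the dataset.

```python
import numpy as np

# Hypothetical local paths to one Suturing trial; adjust to your copy of the dataset.
KIN_FILE = "Suturing/kinematics/AllGestures/Suturing_B001.txt"
TRANSCRIPT_FILE = "Suturing/transcriptions/Suturing_B001.txt"

def load_kinematics(path):
    """Load one trial as a (num_frames, 76) array of kinematic variables."""
    data = np.loadtxt(path)
    assert data.shape[1] == 76, "expected 4 manipulators x 19 variables per frame"
    return data

def split_manipulators(kin):
    """Split the 76 columns into the four assumed 19-variable manipulator blocks.

    Assumed per-block layout: 3 Cartesian position, 9 rotation matrix,
    3 linear velocity, 3 angular velocity, 1 gripper angle.
    """
    names = ["master_left", "master_right", "slave_left", "slave_right"]
    return {name: kin[:, 19 * i: 19 * (i + 1)] for i, name in enumerate(names)}

def load_gestures(path):
    """Parse a transcription file of 'start_frame end_frame label' lines."""
    segments = []
    with open(path) as f:
        for line in f:
            start, end, label = line.split()
            segments.append((int(start), int(end), label))  # e.g. (80, 215, 'G1')
    return segments

kin = load_kinematics(KIN_FILE)
blocks = split_manipulators(kin)
print(kin.shape, blocks["slave_left"].shape)
for start, end, label in load_gestures(TRANSCRIPT_FILE):
    print(label, "spans frames", start, "to", end)
```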
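The standardized experimental setup includes a leave-one-user-out (LOUO) cross-validation scheme, in which all trials of one surgeon are held out per fold. The sketch below generates such folds by grouping trial files by surgeon; it assumes the surgeon ID is the letter preceding the trial number in file names such as Suturing_B001.txt, which you should verify against the released file naming.

```python
import glob
import os
import re
from collections import defaultdict

# Hypothetical file naming: <Task>_<SurgeonID><TrialNo>.txt, e.g. Suturing_B001.txt;
# verify the surgeon-ID convention against the released files.
TRIAL_GLOB = "Suturing/kinematics/AllGestures/*.txt"

def louo_folds(pattern=TRIAL_GLOB):
    """Yield (held-out user, train paths, test paths) leave-one-user-out folds."""
    by_user = defaultdict(list)
    for path in sorted(glob.glob(pattern)):
        stem = os.path.splitext(os.path.basename(path))[0]
        match = re.match(r".+_([A-Z])\d+$", stem)  # surgeon ID: letter before trial number
        if match:
            by_user[match.group(1)].append(path)
    for held_out in sorted(by_user):
        train = [p for user, paths in sorted(by_user.items())
                 if user != held_out for p in paths]
        yield held_out, train, by_user[held_out]

for user, train, test in louo_folds():
    print(f"LOUO fold {user}: {len(train)} train trials, {len(test)} test trials")
```

Pairing these folds with the loader above gives per-fold training and test sets; the companion leave-one-supertrial-out (LOSO) scheme can be built the same way by grouping on the trial number instead of the surgeon letter.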

See this paper for more details on the dataset release. If you use JIGSAWS for any analyses submitted for publication, we ask that you cite the paper(s) listed in the Citations section below.

Acknowledgements

Credit to Carol Reiley and Balazs Vagvolgyi for collecting the data and developing the data-collection software.

Special acknowledgement to Intuitive Surgical, Inc. for generously contributing the data to the community.

Downloads

** This dataset is released for academic research only, not for commercial use. **

Please complete this form to access the dataset and utilities.

Citations

If you use this dataset or refer to its results, please cite the following papers:

  • [For Dataset Descriptions]
Yixin Gao, S. Swaroop Vedula, Carol E. Reiley, Narges Ahmidi, Balakrishnan Varadarajan, Henry C. Lin, Lingling Tao, Luca Zappella, Benjamín Béjar, David D. Yuh, Chi Chiung Grace Chen, René Vidal, Sanjeev Khudanpur and Gregory D. Hager, The JHU-ISI Gesture and Skill Assessment Working Set (JIGSAWS): A Surgical Activity Dataset for Human Motion Modeling, In Modeling and Monitoring of Computer Assisted Interventions (M2CAI) – MICCAI Workshop, 2014. [bibtex] [pdf]
  • [For Comparative Results] 
Narges Ahmidi, Lingling Tao, Shahin Sefati, Yixin Gao, Colin Lea, Benjamín Béjar Haro, Luca Zappella, Sanjeev Khudanpur, René Vidal and Gregory D. Hager, A Dataset and Benchmarks for Segmentation and Recognition of Gestures in Robotic Surgery, IEEE Transactions on Biomedical Engineering, 2017. [preprint]

Publications using the JIGSAWS Data Set

Please visit the Google Scholar link to view publications that have used the JIGSAWS dataset.

Feedback

Please send us your questions, comments, and suggestions at [email protected]. We appreciate your feedback!
