Abstract
The global healthcare sector faces a critical shortage of healthcare workers, posing a substantial threat to the sustainability of modern healthcare systems. Various socioeconomic measures have been taken to address this shortage, but with limited success, prompting increased interest in automation to alleviate medical staff workloads. One promising approach is the development of robotic scrub nurses (RSNs), autonomous surgical assistants responsible for surgical instrument handling. Despite commendable efforts, significant challenges continue to hinder RSN implementation, including the need for large datasets of annotated images to train AI-based detectors, unreliable tool localization performance, and the absence of a versatile gripper that can securely handle various surgical instruments. This dissertation proposes solutions that address these key challenges to enable RSNs to effectively perform instrument detection, localization, and grasping. The primary contributions of this work are: 1) a novel data augmentation method based on a limited number of manually annotated images, which improves detection performance and generalization with minimal annotation effort; 2) a multi-view voting approach that improves tool localization by filtering out detection errors; and 3) the design of a hybrid gripper based on granular jamming technology, capable of securely grasping a wide range of instruments while remaining compatible with human collaboration. Experimental results demonstrated the efficacy of the proposed solutions, showing high detection performance achieved through data augmentation, a significant reduction in localization errors with multi-view aggregation, and reliable performance of the hybrid gripper in handling diverse surgical instruments. These advancements represent a significant step forward in RSN development, offering the potential to enhance surgical efficiency and help mitigate the impact of the healthcare workforce shortage.
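To illustrate the core idea behind the multi-view voting contribution, the sketch below shows how a majority vote across camera views can filter out a single-view misdetection. This is a minimal illustrative example, not the dissertation's actual implementation; the function name, the plain majority-vote rule, and the instrument labels are assumptions for demonstration only.

```python
from collections import Counter

def multi_view_vote(view_predictions):
    """Fuse per-view class predictions for one instrument by majority vote.

    view_predictions: list of class labels, one per camera view.
    Returns the label predicted by the most views, so that an
    erroneous detection in a single view is outvoted by the others.
    """
    if not view_predictions:
        raise ValueError("at least one view prediction is required")
    counts = Counter(view_predictions)
    label, _ = counts.most_common(1)[0]
    return label

# Two of three views agree on "scalpel"; the erroneous
# "forceps" detection from the third view is filtered out.
fused = multi_view_vote(["scalpel", "scalpel", "forceps"])
```

In practice, such a scheme would also need to associate detections of the same physical instrument across views before voting; the sketch assumes that association has already been done.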
| Original language | German |
|---|---|
| Qualification | Doktor-Ingenieur(in) (Dr.-Ing.) |
| Awarding institution | |
| Supervisor / Advisor | |
| Date of award | 13 Jan 2025 |
| DOIs | |
| Publication status | Published - 30 Jan 2025 |
UN Sustainable Development Goals (SDGs)
In 2015, the UN member states agreed on 17 global Sustainable Development Goals (SDGs) to end poverty, protect the planet, and promote prosperity for all. This work contributes to the following SDG(s):
- SDG 3: Good Health and Well-being
Publications
- HybGrip: a synergistic hybrid gripper for enhanced robotic surgical instrument grasping
  Badilla-Solórzano, J., Ihler, S. & Seel, T., Dec. 2024, in: International Journal of Computer Assisted Radiology and Surgery, 19(12), pp. 2363-2370. Publication: Contribution to journal › Article › Research › Peer-review. Open Access.
- Modular, Label-Efficient Dataset Generation for Instrument Detection for Robotic Scrub Nurses
  Badilla Solórzano, J. A., Gellrich, N. C., Seel, T. & Ihler, S., 27 Apr. 2024, in: Data Augmentation, Labelling, and Imperfections: Third MICCAI Workshop, DALI 2023, Held in Conjunction with MICCAI 2023, Vancouver, BC, Canada, October 12, 2023, Proceedings. Xue, Y., Chen, C., Chen, C., Zuo, L. & Liu, Y. (Eds.). Springer, pp. 95-105 (Lecture Notes in Computer Science; Vol. 14379). Publication: Chapter in book/report/conference proceedings › Conference paper › Research › Peer-review.
- Improving instrument detection for a robotic scrub nurse using multi-view voting
  Badilla-Solórzano, J., Ihler, S., Gellrich, N. C. & Spalthoff, S., Nov. 2023, in: International Journal of Computer Assisted Radiology and Surgery, 18(11), pp. 1961-1968. Publication: Contribution to journal › Article › Research › Peer-review. Open Access.