Please use this identifier to cite or link to this item: http://hdl.handle.net/11531/92979
Full metadata record
DC Field | Value | Language
dc.contributor.author | Sorbet Santiago, Sofía | es-ES
dc.contributor.author | Cifuentes Quintero, Jenny Alexandra | es-ES
dc.date.accessioned | 2024-09-03T08:48:34Z | -
dc.date.available | 2024-09-03T08:48:34Z | -
dc.date.issued | 2024-09-01 | es_ES
dc.identifier.issn | 0266-4720 | es_ES
dc.identifier.uri | https://doi.org/10.1111/exsy.13706 | es_ES
dc.identifier.uri | http://hdl.handle.net/11531/92979 | -
dc.description | Artículos en revistas | es_ES
dc.description.abstract | . | es-ES
dc.description.abstract | Hand gesture recognition and classification play a pivotal role in automating Human-Computer Interaction (HCI) and have garnered substantial attention in research. In this study, the focus is placed on the application of gesture recognition in surgical settings to provide valuable feedback during medical training. A tool gesture classification system based on Deep Learning (DL) techniques is proposed, specifically employing a Long Short Term Memory (LSTM)-based model with an attention mechanism. The research is structured in three key stages: data pre-processing to eliminate outliers and smooth trajectories, addressing noise from surgical instrument data acquisition; data augmentation to overcome data scarcity by generating new trajectories through controlled spatial transformations; and the implementation and evaluation of the DL-based classification strategy. The dataset used includes recordings from ten participants with varying surgical experience, covering three types of trajectories and involving both right and left arms. The proposed classifier, combined with the data augmentation strategy, is assessed for its effectiveness in classifying all acquired gestures. The performance of the proposed model is evaluated against other DL-based methodologies commonly employed in surgical gesture classification. The results indicate that the proposed approach outperforms these benchmark methods, achieving higher classification accuracy and robustness in distinguishing diverse surgical gestures. | en-GB
dc.format.mimetype | application/pdf | es_ES
dc.language.iso | en-GB | es_ES
dc.rights | | es_ES
dc.rights.uri | | es_ES
dc.source | Revista: Expert Systems, Periodo: 1, Volumen: Online first, Número: Online first, Página inicial: 1, Página final: 21 | es_ES
dc.title | Deep learning-based gesture recognition for surgical applications: A data augmentation approach | es_ES
dc.type | info:eu-repo/semantics/article | es_ES
dc.description.version | info:eu-repo/semantics/publishedVersion | es_ES
dc.rights.holder | Política editorial | es_ES
dc.rights.accessRights | info:eu-repo/semantics/restrictedAccess | es_ES
dc.keywords | . | es-ES
dc.keywords | attention-based LSTM neural networks, data augmentation, deep learning, gesture classification, surgical gestures | en-GB
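
The abstract describes a data augmentation stage that generates new tool trajectories through controlled spatial transformations. The sketch below is a minimal illustration of that general idea rather than the authors' implementation: it assumes each trajectory is an array of (x, y, z) tool-tip positions and applies a bounded rotation about the vertical axis, a small uniform scaling, and Gaussian jitter; the function name and parameter values are hypothetical.

import numpy as np

def augment_trajectory(traj, max_rotation_deg=10.0, scale_range=(0.95, 1.05),
                       jitter_std=0.002, rng=None):
    """Create a new trajectory from an existing one via small, controlled
    spatial transformations. traj: array of shape (T, 3) with (x, y, z)
    tool-tip positions."""
    if rng is None:
        rng = np.random.default_rng()

    # Bounded random rotation about the z-axis, small enough to preserve the gesture shape.
    theta = np.deg2rad(rng.uniform(-max_rotation_deg, max_rotation_deg))
    rot_z = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                      [np.sin(theta),  np.cos(theta), 0.0],
                      [0.0,            0.0,           1.0]])

    # Uniform scaling around the trajectory centroid.
    scale = rng.uniform(*scale_range)
    centroid = traj.mean(axis=0)
    new_traj = (traj - centroid) @ rot_z.T * scale + centroid

    # Small additive Gaussian noise, mimicking acquisition jitter.
    new_traj += rng.normal(0.0, jitter_std, size=traj.shape)
    return new_traj

Each call produces a slightly different, label-preserving variant of the input gesture, which is the property an augmentation strategy of this kind relies on to mitigate data scarcity.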
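
The classifier described in the abstract is a Long Short Term Memory (LSTM) model with an attention mechanism. The PyTorch sketch below shows one common way to realise that combination, assuming additive attention pooling over the LSTM hidden states followed by a linear classification head; the layer sizes, three input features, and three gesture classes are illustrative assumptions, not the architecture reported in the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionLSTMClassifier(nn.Module):
    """LSTM encoder with attention pooling over time and a classification head."""

    def __init__(self, n_features=3, hidden_size=64, n_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.attn_score = nn.Linear(hidden_size, 1)    # one attention score per time step
        self.classifier = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, seq_len, n_features), e.g. (x, y, z) tool positions over time
        hidden, _ = self.lstm(x)                       # (batch, seq_len, hidden_size)
        scores = self.attn_score(torch.tanh(hidden))   # (batch, seq_len, 1)
        weights = F.softmax(scores, dim=1)             # attention weights over time steps
        context = (weights * hidden).sum(dim=1)        # weighted sum -> (batch, hidden_size)
        return self.classifier(context)                # class logits

# Hypothetical usage: classify 3 gesture types from trajectories of 200 samples each.
model = AttentionLSTMClassifier(n_features=3, hidden_size=64, n_classes=3)
logits = model(torch.randn(8, 200, 3))                 # a batch of 8 trajectories
print(logits.shape)                                    # torch.Size([8, 3])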
Appears in Collections: Artículos

Files in This Item:
File | Size | Format
20249285351860_Expert Systems - 2024 - Santiago - .pdf | 3.63 MB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.