Please use this identifier to cite or link to this item: https://ir.swu.ac.th/jspui/handle/123456789/27487
Title: Hand Movement-Controlled Image Viewer in an Operating Room by Using Hand Movement Pattern Code
Authors: Gobhiran A.
Wongjunda D.
Kiatsoontorn K.
Charoenpong T.
Keywords: Cameras
Mammals
Musculoskeletal system
Palmprint recognition
Sterilization (cleaning)
Condition
Cross-section images
Features vector
Hands movement
Human Machine Interface
Image viewer
Issue Date: 2022
Abstract: Surgeons must intraoperatively view cross-section images under sterile conditions. The keyboard and computer mouse are sources of contamination. A computer vision algorithm and a hand movement pattern analysis technique have been applied to solve this problem based on surgeons' behavior. This paper proposes a new method to control a radiological image viewer in an operating room, using a pattern code of hand movement and a grid square guideline. The proposed algorithm comprises three steps: hand tracking, pattern code area identification, and hand movement pattern recognition. First, the system is fed with a sequence of three-dimensional data: a 3D camera captures the whole target body, a skeleton tracking algorithm detects the human body, and the left-hand joint in the skeleton data set is tracked. Second, because this algorithm supports one-hand movement, a grid square guideline is defined; hand movements are interpreted from the path of the hand within the grid square area. Finally, the pattern code is defined as a feature vector. Using this feature vector and a closest-point classifier, hand movements are recognized by the K-Nearest Neighbors algorithm. To test the performance of the proposed algorithm, data from twenty subjects were used. Seven commands interfaced with the computer workstation to control the radiological image viewer. The accuracy rate was 95.72% and the repeatability was 1.88. The advantage of this method is that one hand can control the image viewer software satisfactorily from a distance of 1.5 m without touching computer devices. Our method also does not need a large data set to train the system. © 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
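The pipeline the abstract describes — discretizing a hand path into a grid-based pattern code, then matching it to the closest template — can be sketched in Python. This is a minimal illustration, not the authors' implementation: the 3×3 grid size, the distance metric, the command names, and the template codes are all assumptions introduced here.

```python
GRID = 3  # assumed 3x3 grid square guideline

def pattern_code(path, grid=GRID):
    """Map (x, y) hand positions in [0, 1)^2 to grid-cell indices,
    collapsing consecutive repeats into a single code entry."""
    code = []
    for x, y in path:
        cell = int(y * grid) * grid + int(x * grid)
        if not code or code[-1] != cell:
            code.append(cell)
    return code

def code_distance(a, b):
    """Illustrative mismatch count between two pattern codes:
    pad the shorter code, then count positions whose cells differ."""
    n = max(len(a), len(b))
    a = a + [-1] * (n - len(a))
    b = b + [-1] * (n - len(b))
    return sum(x != y for x, y in zip(a, b))

def classify(path, templates):
    """1-nearest-neighbour: return the command whose stored
    template code is closest to the observed pattern code."""
    code = pattern_code(path)
    return min(templates, key=lambda cmd: code_distance(code, templates[cmd]))

# Hypothetical templates for two of the seven viewer commands.
templates = {
    "next_image": [3, 4, 5],  # left-to-right sweep across the middle row
    "prev_image": [5, 4, 3],  # right-to-left sweep
}

swipe_right = [(0.1, 0.5), (0.5, 0.5), (0.9, 0.5)]
print(classify(swipe_right, templates))  # → next_image
```

In this sketch the "feature vector" is simply the ordered list of visited grid cells; the paper's classifier likely uses a richer encoding, but the closest-point decision rule is the same idea.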
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85115376626&doi=10.1007%2fs11277-021-09121-8&partnerID=40&md5=5cc0c564e541f0a7d465bf12134f18ba
https://ir.swu.ac.th/jspui/handle/123456789/27487
ISSN: 0929-6212
Appears in Collections:Scopus 2022

Files in This Item:
There are no files associated with this item.


Items in SWU repository are protected by copyright, with all rights reserved, unless otherwise indicated.