Please use this identifier to cite or link to this item:
https://ir.swu.ac.th/jspui/handle/123456789/27487
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Gobhiran A. | |
dc.contributor.author | Wongjunda D. | |
dc.contributor.author | Kiatsoontorn K. | |
dc.contributor.author | Charoenpong T. | |
dc.date.accessioned | 2022-12-14T03:17:28Z | - |
dc.date.available | 2022-12-14T03:17:28Z | - |
dc.date.issued | 2022 | |
dc.identifier.issn | 0929-6212 | |
dc.identifier.uri | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85115376626&doi=10.1007%2fs11277-021-09121-8&partnerID=40&md5=5cc0c564e541f0a7d465bf12134f18ba | |
dc.identifier.uri | https://ir.swu.ac.th/jspui/handle/123456789/27487 | - |
dc.description.abstract | Surgeons must view cross-sectional images intraoperatively under sterile conditions, yet the keyboard and computer mouse are sources of contamination. A computer vision algorithm and a hand-movement pattern analysis technique have been applied to solve this problem based on surgeons' behavior. This paper proposes a new method to control the radiological image viewer in an operating room using a pattern code of hand movement and a grid square guideline. The proposed algorithm comprises three steps: hand tracking, pattern code area identification, and hand movement pattern recognition. First, the system is fed a sequence of three-dimensional data: a 3D camera captures the whole target body, a skeleton tracking algorithm detects the human body, and the left-hand joint in the skeleton data set is tracked. Second, because the algorithm supports one-handed movement, a grid square guideline is defined, and hand movements are interpreted from the hand path within the grid square area. Finally, the pattern code is defined as a feature vector, and hand movements are recognized from this feature vector with a K-Nearest Neighbors (closest point) classifier. To test the performance of the proposed algorithm, data from twenty subjects were used, with seven commands interfacing with the computer workstation to control the radiological image viewer. The accuracy rate was 95.72% and the repeatability was 1.88. The advantage of this method is that one hand can satisfactorily control the image viewer software from a distance of 1.5 m without touching computer devices, and the method does not need a large data set to train the system. © 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature. | |
dc.language | en | |
dc.subject | Cameras | |
dc.subject | Mammals | |
dc.subject | Musculoskeletal system | |
dc.subject | Palmprint recognition | |
dc.subject | Sterilization (cleaning) | |
dc.subject | Condition | |
dc.subject | Cross-section images | |
dc.subject | Features vector | |
dc.subject | Hands movement | |
dc.subject | Human Machine Interface | |
dc.subject | Image viewer | |
dc.title | Hand Movement-Controlled Image Viewer in an Operating Room by Using Hand Movement Pattern Code | |
dc.type | Article | |
dc.rights.holder | Scopus | |
dc.identifier.bibliograpycitation | Wireless Personal Communications. Vol 123, No.1 (2022), p.103-121 | |
dc.identifier.doi | 10.1007/s11277-021-09121-8 | |
Appears in Collections: Scopus 2022
Files in This Item:
There are no files associated with this item.
Items in SWU repository are protected by copyright, with all rights reserved, unless otherwise indicated.
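The abstract above outlines a three-step pipeline: skeleton-based hand tracking, a grid square guideline for interpreting the hand path, and recognition of a pattern-code feature vector with a K-Nearest Neighbors (closest point) classifier. The sketch below is a minimal, self-contained illustration of the last two steps only, not the authors' implementation: it quantizes a tracked hand path into grid-cell indices and classifies the resulting pattern code with a 1-nearest-neighbour classifier. The 3x3 grid, the fixed code length, and the command names are illustrative assumptions not taken from the record.

```python
# Minimal sketch (assumptions, not the authors' code): quantize a hand path over
# a grid square into a "pattern code" feature vector and classify it with KNN.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

GRID = 3        # assumed 3x3 grid square guideline
CODE_LEN = 6    # assumed fixed length of the pattern-code vector

def pattern_code(path, origin=(0.0, 0.0), size=1.0):
    """Convert (x, y) hand-joint positions inside the grid square into a
    fixed-length vector of visited cell indices (0 .. GRID*GRID - 1)."""
    cells = []
    for x, y in path:
        col = min(int((x - origin[0]) / size * GRID), GRID - 1)
        row = min(int((y - origin[1]) / size * GRID), GRID - 1)
        cell = row * GRID + col
        if not cells or cells[-1] != cell:        # record cell transitions only
            cells.append(cell)
    cells = (cells + [-1] * CODE_LEN)[:CODE_LEN]  # pad/truncate to CODE_LEN
    return np.array(cells, dtype=float)

# Toy training samples: two hypothetical commands, a left-to-right sweep
# ("next_image") and a top-to-bottom sweep ("previous_image").
left_to_right = [(x, 0.5) for x in np.linspace(0.05, 0.95, 20)]
top_to_bottom = [(0.5, y) for y in np.linspace(0.05, 0.95, 20)]
X_train = np.array([pattern_code(left_to_right), pattern_code(top_to_bottom)])
y_train = np.array(["next_image", "previous_image"])

knn = KNeighborsClassifier(n_neighbors=1)         # closest-point classifier
knn.fit(X_train, y_train)

# A new, slightly noisy left-to-right path is recognized as "next_image".
live_path = [(x, 0.45 + 0.05 * np.sin(6 * x)) for x in np.linspace(0.1, 0.9, 20)]
print(knn.predict([pattern_code(live_path)])[0])
```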