Please use this identifier to cite or link to this item:
https://ir.swu.ac.th/jspui/handle/123456789/12437
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Phothisonothai M. | |
dc.contributor.author | Tantisatirapong S. | |
dc.date.accessioned | 2021-04-05T03:03:25Z | - |
dc.date.available | 2021-04-05T03:03:25Z | - |
dc.date.issued | 2019 | |
dc.identifier.other | 2-s2.0-85065102320 | |
dc.identifier.uri | https://ir.swu.ac.th/jspui/handle/123456789/12437 | - |
dc.identifier.uri | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85065102320&doi=10.1109%2fKST.2019.8687804&partnerID=40&md5=4f6bf8f3f8f1c153152cadf7869cdbc7 | |
dc.description.abstract | Human-Machine Interaction (HMI) is a multidisciplinary research area focused on interaction modalities between humans and machines. In this paper, we introduce an integrated HMI system that uses electrical brainwave signals (electroencephalography, EEG) and eye tracking (ET) of pupil movement, which serve as an alternative communication channel for people with disabilities. In our experiments, target and non-target visual stimuli for the EEG-based HMI system were presented on the basis of event-related potential (ERP) and steady-state visually evoked potential (SSVEP) signals. For the ET-based framework, we propose a user-friendly virtual keyboard for typing Thai, with free-form and automatic typing modes. The results show that the integrated HMI using ERP-SSVEP yielded an average accuracy of 97.4% and an average reaction time of approximately 724.2 milliseconds for control commands. The automatic typing mode achieved an average accuracy of 97%, with an average typing time of 6.17 seconds per word on the ET-based virtual Thai keyboard. © 2019 IEEE. | |
dc.subject | Electroencephalography | |
dc.subject | Electrophysiology | |
dc.subject | Eye movements | |
dc.subject | Human computer interaction | |
dc.subject | Human reaction time | |
dc.subject | Man machine systems | |
dc.subject | EEG signals | |
dc.subject | Event-related potentials | |
dc.subject | Human machine interaction | |
dc.subject | Human machine interaction system | |
dc.subject | Interactive system | |
dc.subject | Multi-disciplinary research | |
dc.subject | SSVEP | |
dc.subject | Steady state visually evoked potentials | |
dc.subject | Eye tracking | |
dc.title | Integrated Human-Machine Interaction System: ERP-SSVEP and Eye Tracking Based Technologies | |
dc.type | Conference Paper | |
dc.rights.holder | Scopus | |
dc.identifier.bibliographycitation | 2019 11th International Conference on Knowledge and Smart Technology, KST 2019. (2019), pp. 244-248 | |
dc.identifier.doi | 10.1109/KST.2019.8687804 | |
Appears in Collections: | Scopus 1983-2021 |
Files in This Item:
There are no files associated with this item.
Items in SWU repository are protected by copyright, with all rights reserved, unless otherwise indicated.