Development of Robust Electrooculography (EOG) Based Human Computer Interface
Radwa Reda Hossieny Mohamed Ahmed;
Abstract
In recent years, the number of patients with motor disabilities has risen noticeably worldwide. Much of this increase is due to diseases that cause the motor nerves to atrophy and thereby prevent the limbs from performing their vital role. In severe cases the damage extends to the whole body and leads to complete paralysis, with only the neurons that control eye movement spared. Such patients therefore have no way to communicate with their surroundings except through the movement of their eyes.
The human-computer interface (HCI) has emerged as a new communication channel and support tool for these patients. It enables communication between the user and the computer by analyzing voluntary, controlled bio-signals to choose a specific action, execute it, and display it on the computer screen. The HCI systems considered here determine eye movement directions from the electrooculogram.
An electrooculogram (EOG) records eye movement as signals produced by variations in the standing potential between the cornea and the retina. EOG recording is performed with a set of electrodes placed horizontally and vertically on the skin around the eyes. The relationship between the EOG amplitude and the gaze angle is approximately linear, and the signal waveform closely follows the eye movement, which makes it straightforward to analyze and classify.
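To make the linearity concrete, the following minimal sketch converts a baseline-corrected EOG amplitude to an approximate gaze angle. The sensitivity value is a hypothetical, order-of-magnitude figure and not one reported in the thesis.

```python
# Illustrative sketch (not from the thesis): under the linear approximation,
# a hypothetical sensitivity of ~20 microvolts per degree maps a
# baseline-corrected EOG amplitude to an approximate gaze angle.
def eog_to_gaze_angle_deg(amplitude_uv, sensitivity_uv_per_deg=20.0):
    """Convert an EOG amplitude (in microvolts) to a gaze angle (in degrees),
    assuming a linear relationship between amplitude and eye rotation."""
    return amplitude_uv / sensitivity_uv_per_deg

# Example: a +300 microvolt horizontal deflection corresponds to roughly a
# 15-degree gaze shift under the assumed sensitivity.
print(eog_to_gaze_angle_deg(300.0))  # -> 15.0
```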
This thesis proposes an HCI writing system based on classifying EOG signals with a proposed deep learning model. The system is aimed at patients whose diseases cause severe motor disability and paralysis of all limbs, and it provides them with a new way of communicating with their external environment without always needing a companion. It detects six eye movement classes: up, down, right, left, center, and blink, and uses them to select letters on a virtual keyboard, compose messages, and vocalize them.
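The abstract does not spell out how detected classes drive the virtual keyboard; the sketch below shows one minimal hypothetical mapping from the six classes to keyboard actions. The class and action names are placeholders, not the thesis's actual interface.

```python
# Illustrative sketch (not the thesis implementation): map the six detected
# eye-movement classes to hypothetical virtual-keyboard actions.
EYE_CLASS_TO_ACTION = {
    "up": "move_cursor_up",
    "down": "move_cursor_down",
    "left": "move_cursor_left",
    "right": "move_cursor_right",
    "center": "no_operation",       # resting gaze, nothing happens
    "blink": "select_current_key",  # blink confirms the highlighted letter
}

def dispatch(eye_class: str) -> str:
    """Return the keyboard action associated with a classified eye movement."""
    return EYE_CLASS_TO_ACTION.get(eye_class, "no_operation")

print(dispatch("blink"))  # -> select_current_key
```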
The vertical and horizontal EOG signals are denoised with a second-order band-pass filter, and two approaches are considered for classifying the filtered signals. The first extracts statistical and morphological features from the filtered signals and concatenates them into a single feature vector that serves as input to six machine learning classifiers: Linear Discriminant Analysis (LDA), Support Vector Machines (SVM), Multinomial Logistic Regression (MLR), K-Nearest Neighbors (KNN), Decision Trees, and Naïve Bayes (NB). The second concatenates the filtered horizontal and vertical EOG signals into a single vector that is fed to five deep learning models: a Convolutional Neural Network (CNN), a VGG network, an Inception network, a Residual Network, and a ResNet-50 network. Experiments were conducted on two datasets: a small public dataset and the PSL-IEOG2 dataset, a large dataset collected by us using the PSL-IEOG2 device dedicated to measuring eye signals.
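As an illustration of this preprocessing, the sketch below applies a second-order Butterworth band-pass filter to each channel, builds a simple statistical feature vector for the machine learning approach, and concatenates the two filtered channels for the deep learning approach. The sampling rate, pass band, and feature set are assumptions, since the abstract does not specify them.

```python
# Illustrative preprocessing sketch. Assumptions: 250 Hz sampling rate,
# 0.1-30 Hz pass band, and a small set of statistical features standing in
# for the thesis's statistical and morphological features.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250.0  # assumed sampling rate (Hz)

def bandpass(eog, low=0.1, high=30.0, order=2, fs=FS):
    """Zero-phase second-order Butterworth band-pass filter for one EOG channel."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eog)

def feature_vector(h_eog, v_eog):
    """Concatenate simple statistics from the filtered horizontal and vertical
    channels into one feature vector for the machine learning classifiers."""
    feats = []
    for sig in (bandpass(h_eog), bandpass(v_eog)):
        feats += [sig.mean(), sig.std(), sig.max(), sig.min(), np.ptp(sig)]
    return np.array(feats)

def dl_input(h_eog, v_eog):
    """Concatenate the two filtered channels into a single input vector,
    as used for the deep learning models."""
    return np.concatenate([bandpass(h_eog), bandpass(v_eog)])
```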
Other data
| Field | Value |
|---|---|
| Title | Development of Robust Electrooculography (EOG) Based Human Computer Interface |
| Other Titles | Design of a Human-Computer Communication System Based on Eye Signals (EOG) |
| Authors | Radwa Reda Hossieny Mohamed Ahmed |
| Issue Date | 2022 |
Attached Files
| File | Size | Format |
|---|---|---|
| BB12910.pdf | 778.82 kB | Adobe PDF |