EEG Based Automated Detection of Six Different Eye Movement Conditions for Implementation in Personal Assistive Application

Different forms of human expression are now being extensively used in present-day human–machine interfaces to provide assistive support to the elderly and disabled population. Depending on the subject's condition, expressions conveyed in terms of eye movements are often found to provide the most efficient means of communication. Nowadays, standard Electroencephalogram (EEG) based arrangements, normally used to analyze neurological states, are also being adopted for the detection of eye movements. However, recent state-of-the-art EEG-based research remains limited, as the majority of works either detect eye movements in fewer directions or use a higher feature dimension with limited classification accuracy. In this study, a robust, simple and automated algorithm is proposed that exploits the analysis of the EEG signal to classify eye movements in six different directions. The algorithm uses discrete wavelet transformation to denoise the EEG signals acquired from six different leads. Then, from the reconstructed wavelet coefficients of each lead, two features are extracted and combined to form a binary feature map. The obtained binary feature map itself facilitates distinct visual classification of the eye movements. Finally, a unique value generated from the calculated weighted sum of the binary map is used to classify six different types of eye movements via a threshold-based classification technique. The algorithm presents high average accuracy, sensitivity and specificity of 95.85%, 95.83% and 95.83%, respectively, using a single value only. Compared to other state-of-the-art methods, the adopted unique binarization methodology and the obtained results indicate the immense potential of the proposed algorithm for implementation in personal assistive applications.


Introduction
In recent years, the growing elderly population and the severe impact of a variety of chronic diseases have been considered leading causes of increasing disability in our society [1]. To combat disability-related outcomes on a daily basis, technological intervention in the healthcare sector has now become inevitable [2]. Recent reports of the World Health Organisation (WHO) categorically confirm that by 2030, over two billion people around the world will need at least one assistive tool [3]. Consequently, over the past few years, extensive scientific attention has been given to the development of different forms of human-machine interfaces (HMIs) [4]. Owing to advanced signal processing techniques, state-of-the-art HMIs now offer the capability of handling human-defined instructions, conveyed by means of various human expressions and standard bio-signals [5]. However, out of many expressions, the eye movement of the patient is found to provide the most feasible way of communicating with machines, especially while dealing with disorders related to acute muscle movements [6]. Moreover, the analysis of eye movement is found to be useful for blind individuals who intend to communicate, and it could also be used by healthy individuals to minimize visual fatigue.
Apart from its important implications in disability management, the analysis of eye movement has recently gained considerable attention because of its potential for conveying neurological and psychological intentions during communication. Moreover, consistent movement of the human eyes also facilitates the gathering of reliable information from the exterior world, which is ultimately processed by the human brain [7,8]. Considering the promising prospects of eye movements, related research is now being extended to a multitude of domains, including the examination of the depth of visual attention, assessment of human cognitive condition and intelligent eye-gaze interfaces in HMIs [9][10][11].
Normally, the analysis of the Electrooculography (EOG) signal, obtained from the existing corneal-retinal potential difference of the eye, has been used as the gold standard technique for the detection of eye movements [12,13]. Despite its operational simplicity, the use of the EOG signal often imposes challenging limitations: the set-up of EOG electrodes in the close vicinity of the eyes might cause discomfort, artefacts originating from the facial expressions of the subject often induce signal corruption and, finally, the overall EOG acquisition arrangement also reduces the field of vision of the subject [14].
To overcome the shortcomings related to the use of EOG, different characteristics of the Electroencephalography (EEG) signal are now being adopted by researchers as an alternative for the detection of eye movements. Over the past few decades, EEG has been a popular non-invasive technique of quantifying the electrical response of the brain, obtained by placing electrodes on the exterior of the human scalp [15]. Compared to the existing EOG set-ups, recent state-of-the-art EEG devices also offer wireless acquisition of the signal without compromising patient comfort [16]. Primarily, the EEG signal is used to explore a multitude of neurological and psychological activities of the brain. Apart from other responses originating from the human brain, EEG electrodes placed on the frontal lobe of the scalp also pick up information related to eye movement, which appears in the form of EOG artefacts [17,18]. The presence of EOG artefacts in the EEG signal makes it a potential tool for the analysis of eye movements [18].
So far, only a handful of investigations have been carried out to exploit the embedded information of the EEG signal for the detection of eye movements. Maneuvering a mechanical wheelchair system via the inherent EOG artefacts of the EEG signal is proposed in [4].
The reported methodology evaluates the extracted signal features via the support vector machine (SVM) classification technique in order to establish the position and direction of the eyelid and the gaze. In [19], changes in the eye condition obtained from the EEG signal are analyzed via Empirical Mode Decomposition (EMD) for feature extraction, and standard classifiers are then used to substantiate the possible changes in the eye condition with an accuracy of 88.2%. In another recent study, combined features extracted from the analysis of EEG and eye-tracking signal data have been utilized with the support vector machine (SVM) to identify children with autistic disorders [20]. The methodology proposed in [21] uses EOG artefacts in the EEG signal to classify eye movement in the four cardinal directions only. The algorithm employs the area under the curve of the denoised EEG signal as a feature to classify the eye movements by means of a threshold-based hierarchical classification method with a limited average accuracy of 50-85%. An automated algorithm is proposed in [22], which uses the EEG signal to facilitate the real-time analysis of six different classes of eye movements. However, the algorithm uses four different features derived from the processed EEG signal to classify eye movements using a threshold-based linear classifier with an average accuracy of 85.2%. Robust and real-time control of video games via EEG-based eye movement is proposed in [23]. The technique uses the same acquisition tool and follows the same feature extraction and classification methodology as discussed in [22]. Finally, the algorithm shows an average accuracy of 80.2% while being evaluated with five participants only.
Evidently, a majority of the above-mentioned methods either detect eye movements in a smaller number of directions [21,24], use complicated feature extraction and classification techniques [4,19,20] or use a higher feature dimension [20,21,23] with limited classification accuracy. As a potential alternative, a simple, robust EEG-based algorithm is proposed in this work for the automated identification of six different classes of eye movements with significant accuracy. The algorithm initially applies discrete wavelet transformation (DWT) to the acquired EEG signal, obtained from six different leads, to eliminate a wide range of noise and artefacts. Then, two robust eye movement-related features per lead are extracted from the optimized wavelet coefficients. The obtained features are converted to a single binary feature map, which in turn facilitates the visual classification of eye movements. Finally, a weighted sum of the binary feature map is computed and used as a single value to classify six different types of eye movements using a simple threshold-based technique. The method essentially requires no decision boundary-based classification techniques. Rather, detection of the eye movements in different directions can be carried out using a single value only.
The remainder of the article is structured as follows. A complete explanation of the proposed methodology is given in Sect. 2. Information related to the adopted database, the experimental outcomes and the comparison of the proposed algorithm with other state-of-the-art works is presented in Sect. 3. A detailed discussion of the efficiency and robustness of the algorithm is carried out in Sect. 4. Finally, a conclusion about the prospects of the proposed technique is outlined in Sect. 5.

Methodology
The entire methodology followed in the proposed EEG-based eye movement analysis is summarized in Fig. 1. The algorithm mainly encompasses four key parts: (1) pre-processing of the acquired EEG signal via wavelet transformation and elimination of different types of noises and artefacts; (2) extraction of the eye movement-related features from the wavelet coefficients; (3) binarization of the obtained feature set and creation of binary feature map; (4) computation of the equivalent-weighted sum of the binary feature map and threshold-based classification of the eye movements.

Acquisition of the EEG Signal Data
In the present research work, the non-invasive acquisition of EEG signals with eye movement in different directions is carried out at the Department of Applied Physics, University of Calcutta, by placing an EEG cap over the scalp of each subject. The positions of the EEG leads on the cap are maintained following the international protocol of the 10-20 electrode placement system. The entire process of EEG signal acquisition is accomplished by means of the BIOPAC MP 150 data acquisition system with sixteen separately configurable leads [24]. Since the objective of this work is to investigate the movement of the eye, only six eye-specific leads have been chosen to record the EEG signals at a sampling frequency of 1 kHz. The chosen leads, 'Fp1-Fp2', 'F3-F4', 'C3-C4', 'P3-P4', 'F7-F8' and 'T3-T4', located symmetrically on both sides of the line joining the 'Nasion' and the 'Inion' over the scalp, are marked as Lead 1, Lead 2, Lead 3, Lead 4, Lead 5 and Lead 6, respectively. After positioning all the leads, EEG signals have been collected from twenty healthy volunteers (15 male and 5 female) with ages ranging from 22 to 35 years. Out of the twenty participants, sixteen had normal vision and the remaining four had corrected-to-normal vision. A representative EEG signal record obtained from a subject for six different leads is shown in Fig. 2.
Before the start of the recording, the entire experimental procedure was explained to each of the volunteers and they were asked to stay in a relaxed or idle condition with their eyes closed for about 2 min. After the resting period, the recording of the EEG signal is started and carried out for another 2 min in the eyes-closed condition. The subjects were then asked to open their eyes and look straight towards a marked point on the wall without any head movement. The point is kept at the same horizontal level as their eyes and the recording of the EEG signal is done for 2 min. The subjects were then asked to focus on the next position at the same horizontal level at about an angle of 30° to the left and to the right of the central (first) mark, respectively. For each of the mentioned cases, the EEG signal is recorded for two minutes. Similar recordings were also carried out at about 30° above and below the central mark position. The entire procedure of EEG data collection with different eye positions is repeated for each of the subjects and is illustrated in Fig. 3. Before the experimental procedure, a signed informed consent form was collected from each of the participants as per the institutional and international protocols.

Fig. 1 Block diagram of the proposed methodology for the classification of six different types of eye movements. After removal of the power-line components, a Discrete Wavelet Transform (DWT) based methodology is adopted to eliminate a wide range of noises and artefacts from the EEG signal acquired from six different leads. For every lead, two features are extracted from the chosen wavelet coefficients and are unified to form a binary feature map. Then, a weighted sum of the unified binary map is computed and used as a single unique feature to classify six different types of eye movements via a simple threshold-based classification technique

Pre-processing of the Acquired EEG Data
The EEG signal consists of five major sub-bands (delta, theta, alpha, beta and gamma) of specific frequency ranges, as shown in Table 1. Apart from the mentioned five frequency bands, the acquired EEG signal often gets contaminated by additional noisy components such as power-line interference, baseline wandering and eye blink artefacts. These unwanted noisy components need to be eliminated for better analysis of the EEG signal and for the extraction of the respective sub-bands.

Fig. 3 The experimental arrangement to investigate the process of eye movements. During the experiment, each of the subjects was instructed to follow the circular marked positions sequentially by moving their eyes at a specific visual angle. The movement of the eye in each direction was essentially carried out based on an external auditory command

Removal of the Power-Line Artefact
Power-line interference with a frequency component of 50 Hz has been a major source of noise in the EEG signal. In the present research, the moving average filtering technique is adopted to eliminate the power-line interference from the EEG signal acquired from all the leads. As the acquired EEG signals were sampled at a frequency of f_S = 1 kHz, the estimated window size is

N = f_S / f = 1000 / 50 = 20    (1)

where f represents the power-line frequency. Now, following Eq. (1), a twenty-point moving averaging method is carried out and the outcome for a representative EEG signal is illustrated in Fig. 4.
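As a minimal sketch (not the authors' MATLAB implementation), the twenty-point moving-average filter of Eq. (1) can be realized as follows; the signal and function names are illustrative:

```python
import numpy as np

def remove_powerline(eeg, fs=1000, f_line=50):
    """Suppress 50 Hz power-line interference with a moving-average filter.

    The window length N = fs / f_line = 1000 / 50 = 20 samples, so every
    window spans exactly one 50 Hz cycle, whose samples average to zero.
    """
    n = int(fs // f_line)                       # 20-point window, per Eq. (1)
    kernel = np.ones(n) / n
    return np.convolve(eeg, kernel, mode="same")

# Illustration: a 50 Hz tone riding on a slow 2 Hz component is removed,
# while the low-frequency content passes almost unchanged.
fs = 1000
t = np.arange(fs) / fs
clean = 0.5 * np.sin(2 * np.pi * 2 * t)
noisy = clean + 0.3 * np.sin(2 * np.pi * 50 * t)
filtered = remove_powerline(noisy, fs)
```

Because the window equals one full period of the interference, the 50 Hz component is cancelled exactly, while components well below 50 Hz are only slightly attenuated.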

Removal of High Frequency Components
Further processing of the EEG signal is carried out using a discrete wavelet transformation (DWT) based technique. DWT has been regarded as a highly efficient tool for the analysis of non-stationary signals, as it decomposes them into consecutive frequency sub-bands. A single-stage DWT decomposes the entire signal frequency range into two separate frequency bands using a low-pass and a high-pass filtering technique. The signal component obtained after low-pass filtering is known as the "Approximation coefficient" (cA) and the component obtained after high-pass filtering is identified as the "Detail coefficient" (cD). However, the efficiency of the adopted DWT method depends on the appropriate selection of the mother wavelet and the proper number of decomposition levels. Inherently, the variation of the EEG signal is found to be similar to the Daubechies wavelet 'db4'. Hence, in this work, 'db4' is chosen as the 'mother wavelet' to carry out the DWT analysis of the EEG signal [25]. Since the sampling frequency of the recorded EEG signal is 1 kHz and the lowest frequency range of the EEG sub-bands (delta wave) is 0.5-4 Hz, ten-level decomposition of the EEG signal is carried out in this work to separate the required sub-bands from the original signal.
As the EEG signal is sampled at 1 kHz, then following the Nyquist criterion, the maximum frequency component present in the recorded EEG signal is 500 Hz. Now, a single-stage DWT generates a Detail coefficient (cD1) and an Approximation coefficient (cA1) with frequency components ranging from 250-500 Hz and 0-250 Hz, respectively. At the second stage, cA1 is further decomposed into cD2 and cA2 having frequency ranges of 125-250 Hz and 0-125 Hz. The coefficient cA2 is further decomposed, which generates cD3 (62.5-125 Hz) and cA3 (0-62.5 Hz). After all these stages, a few of the detail coefficients are found to contain frequency components beyond the ranges of the EEG sub-bands. These components are then eliminated in the present work as high-frequency noise components.
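The level-by-level halving of the analysis band described above can be sketched in a few lines (band bookkeeping only, not the wavelet filtering itself; the function name is illustrative):

```python
def dwt_band_ranges(fs=1000, levels=10):
    """Frequency ranges (Hz) covered by the DWT coefficients of a signal
    sampled at fs. At every level the current approximation band is
    halved: cDk keeps the upper half and cAk keeps the lower half."""
    bands = {}
    high = fs / 2.0                       # Nyquist frequency: 500 Hz here
    for k in range(1, levels + 1):
        low = high / 2.0
        bands["cD%d" % k] = (low, high)   # detail band at level k
        high = low
    bands["cA%d" % levels] = (0.0, high)  # final approximation band
    return bands
```

For fs = 1 kHz this reproduces the ranges quoted above (cD1: 250-500 Hz, cD2: 125-250 Hz, cD3: 62.5-125 Hz) and shows why cD7 (about 3.9-7.8 Hz) falls in the theta range, cD6 (about 7.8-15.6 Hz) in the alpha range and cA10 (below about 0.49 Hz) in the baseline range.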

Removal of Baseline Wandering Artefact
Normally, the frequency components related to baseline wandering are found to be lower than 0.5 Hz. Hence, the approximation coefficient obtained at the tenth decomposition level (cA10) of the DWT is considered as the baseline component and is eliminated. A representative EEG 'delta' wave with and without the baseline artefact is presented in Fig. 5.

Selection of the Required DWT Coefficients
After removal of the noise and artefacts, only those coefficients with frequency ranges matching the theoretical values of the EEG frequency bands are chosen for further analysis. More specifically, after ten-level decomposition it is found that the coefficients cD10, cD9 and cD8 correspond to the 'Delta' wave, cD7 corresponds to the 'Theta' wave, cD6 corresponds to the 'Alpha' wave, cD5 corresponds to the 'Beta' wave and cD4 corresponds to the 'Gamma' wave, respectively. These seven coefficients are then chosen for further processing.

Removal of Eye Blink Artefact
The presence of eye blink artefact in the EEG signal can be identified from its low frequency and high amplitude characteristic. The signal corresponding to each of the acquired EEG leads contains several eye blinking instants. All these eye blink signals are detected following a less complicated slope detection method and are marked for visual identification purposes. The slope detection method is carried out by calculating the rate of change of the signal amplitude at every instant. Now, the regions where there is an abrupt change in the slope values beyond a pre-defined threshold are identified as the eye blink instants. This unique identification process is repeated for each of the records under consideration. A representative eye blink artefact obtained from a particular record is shown in Fig. 7.
After inspecting the entire database, it is found that the minimum duration of an eye blink signal is approximately one second. Accordingly, a sliding window is then chosen to remove the eye blink artefacts. Now, as the window moves from the commencement of the eye blink instant, each of the signal values is replaced by the average of the signal values one second before and one second after the given sampling instant. The process of data substitution is carried out throughout the entire interval of the eye blink. After that, the sliding window is moved from the ending instant of the present eye blink to the starting instant of the next eye blink and the same procedure is repeated. Different EEG signal bands before and after the removal of the eye blink artefact are shown in Fig. 8. The resulting EEG signal is then used for feature extraction.
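A minimal sketch of the blink-removal step might look as follows; the slope threshold and the padding around each detected instant are illustrative assumptions, not the empirical values used in the study:

```python
import numpy as np

def remove_blinks(x, fs=1000, slope_thresh=0.5, pad=None):
    """Detect blink instants where the sample-to-sample slope exceeds a
    threshold, then replace every sample in the flagged region by the
    average of the signal values one second before and one second after
    that sample, as described in the text."""
    pad = fs // 2 if pad is None else pad        # margin around a detection (assumed)
    slope = np.abs(np.diff(x, prepend=x[0]))     # rate of change at every instant
    blink = np.zeros(len(x), dtype=bool)
    for i in np.where(slope > slope_thresh)[0]:  # abrupt slope change -> blink
        blink[max(0, i - pad):min(len(x), i + pad)] = True
    y = x.copy()
    for i in np.where(blink)[0]:
        lo, hi = i - fs, i + fs
        if lo >= 0 and hi < len(x):              # skip the record edges
            y[i] = 0.5 * (x[lo] + x[hi])
    return y
```

In practice the threshold would be tuned per record, since blink amplitude varies between subjects.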

Feature Extraction
After denoising, power analysis of the wavelet coefficients corresponding to each of the bands is carried out to extract the information related to eye movement. For this purpose, a sliding window of two seconds is considered to calculate the signal power for the coefficient of a specific band. The window is then shifted sequentially from the start to the end of the data and a power value is computed for each of the windows. Finally, a summation of the power values, as calculated following Eq. (2), is used to represent the overall power corresponding to a specific band. A similar power array is generated for all the bands, corresponding to each of the leads and for all positions of the eye, as indicated in Fig. 9.

P_α(l) = Σ_i P_i(α, l)    (2)

where P_i(α, l) is the power of the i-th two-second window, i = 1, 2, 3, 4, …, n − 2 and n = length of the coefficient value, α = the alpha band and l = the lead number. These variations in the power values are used to derive features to identify the direction of eye movement. Two features, namely the absolute power factor (APF) and the weighted average power (WAP), are extracted from these calculated power values. For a given lead, the APF of a particular band is defined as the ratio of the power of that band to the total power of all the bands. The WAP of a given EEG band is defined as the ratio of the total power of that band over all the leads to the number of leads. Mathematically, the APF for a given band alpha (α) of a particular lead and the WAP for a given EEG band (δ) can be formulated as follows.

APF_α(l) = P_α(l) / [P_δ(l) + P_θ(l) + P_α(l) + P_β(l) + P_γ(l)]    (3)

WAP_δ = [Σ_{l=1}^{n} P_δ(l)] / n    (4)
where P_δ, P_θ, P_α, P_β and P_γ denote the power of the delta, theta, alpha, beta and gamma bands, respectively, for a given channel. The number of channels is represented by n. The features calculated following Eqs. (3) and (4) are shown in Table 2 after being rounded to two decimal places for better representation.
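The power computation and the two features can be sketched as below; the window is stepped without overlap here, since the exact shift is not restated in the text (an assumption), and all names are illustrative:

```python
import numpy as np

def band_power(coeff, fs=1000, win_s=2):
    """Overall power of one band: the mean-square power of each
    two-second window, summed over all windows (a sketch of Eq. (2))."""
    w = win_s * fs
    starts = range(0, max(len(coeff) - w + 1, 1), w)
    return sum(np.mean(coeff[s:s + w] ** 2) for s in starts)

def apf(powers_lead):
    """Absolute power factor, Eq. (3): each band's power divided by the
    total power of all five bands for that lead."""
    total = sum(powers_lead.values())
    return {band: p / total for band, p in powers_lead.items()}

def wap(powers_all_leads, band):
    """Weighted average power, Eq. (4): total power of one band across
    all leads divided by the number of leads."""
    vals = [lead[band] for lead in powers_all_leads]
    return sum(vals) / len(vals)
```

By construction, the five APF values of a lead sum to one, which is what makes a single per-lead threshold meaningful in the binarization step that follows.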

'Binarization' of the Parameters and Creation of a Binary Feature Map
It is evident from Table 2 that the dimensions of the extracted features APF and WAP for every eye condition are (6 × 5) and (1 × 5), respectively. Therefore, corresponding to each eye condition, a total of thirty-five (6 × 5 + 1 × 5 = 30 + 5 = 35) feature values are obtained. In the case of APF, after investigating the values obtained from all the leads and all the records under consideration, an empirical threshold value for each of the leads is computed. The lead-specific threshold values are then applied to all the individual APF values to initiate the process of 'binarization'. In the process of 'binarization', if a particular feature value is found to be greater than the chosen threshold then it is replaced by '1'; otherwise, the value is replaced by a '0'. For all the leads and for each eye condition, the entire process is repeated to binarize all the APF values. In the case of WAP, the empirical threshold values are computed based on the values obtained from each of the bands of all the EEG records. Then, the 'binarization' of the WAP feature values is carried out in a similar way to that used in the case of APF. Now, owing to the dimensional mismatch of WAP and APF, the binarized WAP row vector is appended as a seventh row to the binarized APF matrix to form a single binary feature map of dimension (7 × 5), as illustrated in Fig. 10. The entire process is repeated for all the records and for every eye condition. The creation of such a single, unique binary feature map not only helps to reduce the operational complexity of handling different features but also minimizes the computational burden in the classification stage.
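Under illustrative (not the study's empirical) thresholds, the binarization and the assembly of the (7 × 5) map can be sketched as:

```python
import numpy as np

def binary_feature_map(apf_vals, wap_vals, apf_thresh, wap_thresh):
    """Build the (7 x 5) binary feature map described in the text.

    apf_vals   : (6, 5) array, APF per lead (rows) and band (columns)
    wap_vals   : (5,)   array, WAP per band
    apf_thresh : (6,)   per-lead thresholds   (empirical in the study)
    wap_thresh : (5,)   per-band thresholds   (empirical in the study)
    """
    apf_bin = (np.asarray(apf_vals) > np.asarray(apf_thresh)[:, None]).astype(int)
    wap_bin = (np.asarray(wap_vals) > np.asarray(wap_thresh)).astype(int)
    return np.vstack([apf_bin, wap_bin])    # WAP appended as the seventh row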
The outcome of binarization for every eye condition is presented in Table 3 for six representative subjects. In Table 3, all the cells with '1' values are shown in black and the remaining cells with '0' values are shaded gray. Evidently, Table 3 shows that for every eye condition, the resulting binary map presents a unique discriminating combination irrespective of the subject. The result is indicative of the fact that, prior to using any classification logic, the derived binary feature map itself provides easy discrimination of every eye condition just by visual inspection. Now, following the position of the elements in the binary feature map, a positional weight value equal to the corresponding cell number is assigned to each of the elements, as listed in Table 4.
The content of each cell of the binary feature map is then multiplied by the corresponding positional weight and the resultant summation of these values generates a unique number. This unique number corresponding to each position of the eye is termed the Binary Weighted Feature Value (BWFV) and is presented in Table 5. In this work, the unique values of BWFV are finally utilized to classify the movements of the eye in different directions.

Table 3 The binary feature map of six representative subjects for six different eye movement conditions
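Assuming the positional weights run 1 through 35 in row-major order over the (7 × 5) map (the exact numbering is given in Table 4), the BWFV reduces to a single weighted sum:

```python
import numpy as np

def bwfv(feature_map):
    """Binary Weighted Feature Value: every cell of the binary map is
    multiplied by its positional weight (its cell number) and the
    products are summed into one scalar."""
    weights = np.arange(1, feature_map.size + 1).reshape(feature_map.shape)
    return int(np.sum(feature_map * weights))
```

Because each eye condition activates a distinct combination of cells, the resulting scalars are well separated and can be discriminated by simple thresholds.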

Parameters Used for Evaluation
The efficiency of the proposed algorithm is presented in terms of three statistical parameters, namely Sensitivity (Se), Specificity (Sp) and Accuracy (Acc). Each of the parameters is formulated as follows.

Se = TP / (TP + FN)

Sp = TN / (TN + FP)

Acc = (TP + TN) / (TP + TN + FP + FN)
Here, TP is the number of positive cases correctly identified as positive, TN is the number of negative cases correctly identified as negative, FP is the number of negative cases erroneously identified as positive and FN is the number of positive cases mistakenly identified as negative. Moreover, Sensitivity (Se) signifies the ability of a test to identify the positive cases and Specificity (Sp) quantifies the rate of correctly identified true negative cases. The proposed binary weighted feature values present a discriminating separation for each class of eye movements. Hence, the application of a simple threshold-based classification strategy has been found to be sufficient for the classification of eye movements. However, considering the size of the adopted dataset, a ten-fold cross-validation method is adopted in this work for rigorous validation of the classification method. For this purpose, all twenty subjects are initially divided into ten equal parts, each part containing two subjects. Now, corresponding to every iteration, one part (two subjects) is eliminated and evaluation of the classifier is carried out on the remaining nine parts (eighteen subjects). The process is repeated ten times and the resultant average outcome for every fold is shown in Fig. 11 as well as listed in Table 6. The overall outcome as presented in Table 6 shows a high average accuracy (Acc) of 95.85%, sensitivity (Se) of 95.83% and specificity (Sp) of 95.83%, respectively.
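The three evaluation parameters reduce to a few lines of code; the function name is illustrative:

```python
def se_sp_acc(tp, tn, fp, fn):
    """Sensitivity, specificity and accuracy from the confusion counts."""
    se = tp / (tp + fn)                      # true-positive rate
    sp = tn / (tn + fp)                      # true-negative rate
    acc = (tp + tn) / (tp + tn + fp + fn)    # overall correctness
    return se, sp, acc
```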

Performance Comparison
It is to be mentioned that, to date, only a few studies have followed the analysis of the EEG signal to address the problem of eye movement detection. In this work, the proposed methodology utilizes unique binarization and feature mapping properties to easily classify the eye movements using a single value. The utilization of this unique value in a simple threshold-based classification strategy yields significant accuracy. In order to justify the utility of the proposed algorithm in terms of its performance and compatibility, the obtained results are also evaluated against other state-of-the-art literature. However, differences in the EEG acquisition set-ups, the sizes of the acquired EEG datasets, the methodologies, the feature dimensions used and the adopted classification strategies make a fully faithful comparison with other research difficult. The detailed assessment of the proposed algorithm in terms of its performance relative to other state-of-the-art literature is presented in Table 7. Clearly, the descriptions incorporated in Table 7 indicate that, compared to other related literature, the proposed algorithm shows high efficiency in terms of its accuracy and robustness.

Fig. 11 The overall average a accuracy, b sensitivity and c specificity performance of the classifier for each fold of the classification algorithm, presented in terms of bar plots

Discussion
Contemporary research shows that the underlying attributes extracted from the sequence of eye movements can also be used to administer the operation of a wide range of assistive applications. However, owing to the complexity involved in the traditional EOG-based methods, the analysis of EEG signals is now being considered as an alternative for the detection of eye movement in different directions [18][19][20][21][22][23]. This implies that the same EEG set-up that is normally used to identify neurological activities can also be employed for the detection of eye movement. The detection of eye movement via EEG-based techniques has not been explored much to date and is still considered a wide-open area of research. In this work, a robust and automated algorithm is proposed that uses only the analysis of EEG signal features to identify six different types of eye movements.
To the best of our knowledge, no annotated, online EEG signal databases related to eye movement characteristics are available to date. Hence, in the present work, the non-invasive acquisition of EEG signals with different eye-movement positions is carried out at the institutional laboratory by placing six electrodes on the scalp of each subject. The adopted methodology initially uses a wavelet-based technique to extract two features from every lead. These features are then unified to form a single binary feature map. Finally, a single unique value derived from the binary feature map is used to identify the movement of the eye using a simple threshold-based classifier. As a whole, apart from the wavelet transform, a less complicated methodology is followed at each step of the entire algorithm to make it compatible for implementation in present-day assistive devices. The proposed algorithm comes with certain advantages, as summarized below.
1. A rigorous pre-processing of the acquired EEG signal is carried out by means of a robust wavelet-based technique. The chosen DWT-based method facilitates the reduction of a wide range of noises and artefacts from the signal before further processing.
2. A simple, unique slope-change and signal averaging technique is used to eliminate the eye blink-related artefacts from the acquired EEG signal.
3. The algorithm requires no feature selection method and the whole feature dimension is reduced to one single value by means of a unique binarization and feature mapping method.
4. Corresponding to each condition of the eye, the binary values in the binary feature map present distinctive characteristics, as shown in Table 3. These unique combinations themselves facilitate discrimination of the eye conditions by visual inspection. That means, prior to the automated classification of the eye conditions using the unique BWFV, the proposed work also offers a visual classification of the eye conditions at an earlier stage of the methodology.
5. The primary advantage of the proposed algorithm is that it requires no standard, complicated classifier. The computed single BWFV presents considerable discrimination among the six different classes of eye movements. Therefore, following the extracted unique values, only a simple, linear, threshold-based classifier is found to be sufficient to classify the eye movement characteristics with significant accuracy. The use of this classifier involves no training phase, which reduces the complexity further.
Apart from its many-fold applications in the domain of neurology and psychology, the use of EEG signals also has certain drawbacks [26]. The present study uses six different leads to acquire the EEG signal data. Clearly, the necessity of an EEG acquisition cap and the appropriate placement of six different EEG electrodes around the scalp often impose a certain level of discomfort on the patients. Depending on the physical and mental condition of the patients, such discomfort often imposes challenging barriers to the easy acquisition of EEG data. It is to be mentioned that some contemporary literature [22,23] reports significant accuracy for the detection of eye movement with only two EEG sensors. However, the accuracy reported in this literature is primarily based on the use of a higher feature dimension on a comparatively smaller dataset. In response to eye movements, the amplitude of the resulting electrical signal is also found to be affected by the angular velocity. Recent research [21] indicates that the use of a smaller visual angle for the detection of eye movement strongly affects the performance of the algorithm by causing a reduction in efficiency. In fact, less variation in the visual angle increases the likelihood of misclassification and thus affects the performance of the algorithm. Considering all these facts, a comparatively large visual angle of thirty degrees is chosen in this study to ensure higher accuracy and to avoid the possibility of misclassification of the eye movements. However, the use of large visual angles often induces exhaustion in the eye and thus needs to be optimized before application in daily life.
All experimentation with the algorithm is performed and justified at the software level; no real-time hardware platform is developed in the present study. The evaluation results are prepared using MATLAB on a personal computer. Execution of the entire algorithm, from wavelet transformation, denoising and wavelet-coefficient selection through feature extraction, binarization and, finally, classification, requires a total of nearly twenty seconds to classify the six different types of eye movements. The reported execution time is the average over all the EEG records. The promising performance of the proposed method establishes its suitability for use in state-of-the-art assistive devices serving a wide variety of populations with serious physiological barriers.

Conclusion
Human expressions conveyed by eye movement can be used to assist people with different forms of physical or mental impediments. In the present work, a robust and accurate algorithm is proposed that uses analysis of the EEG signal to identify six different types of eye movements. The adopted methodology follows a robust wavelet-based technique for signal denoising and feature extraction. The features extracted from the chosen wavelet coefficients are then unified into a single binary feature map by means of a unique binarization technique. A discriminating value derived from the binary feature map is then used to classify the six different types of eye movements via a simple threshold-based classification technique. Compared with related studies, the experimental results obtained in this work present high average detection accuracy and also exhibit high execution speed when evaluated over a number of real-time acquired EEG signal records.
The proposed robust methodology, high-speed execution and promising outcomes of the algorithm ensure its compatibility with cutting-edge assistive HMI devices. In the future, efforts will be made to implement the algorithm in multipurpose, real-time, personal assistive devices, including the facility of wireless and portable acquisition of EEG signals via a minimized lead set. Moreover, the algorithm will also be upgraded to incorporate other types of eye movements, such as eye blinks, to ease the issuing of commands.
Funding This study did not receive any grant from any funding agencies in the public, commercial, or not-for-profit sectors.

Availability of Data and Materials
The authors used data recorded in their own laboratory.

Code Availability
The authors used their own code.

Conflict of interest
The authors declare that they have no conflict of interest.

Ethics Approval
All procedures performed in this study involving human participants were in accordance with the ethical standards of the institutional and national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Avishek Paul received his B.Tech. and M.Tech. degrees in Instrumentation Engineering from the Department of Applied Physics, University of Calcutta, Kolkata, India, in 2009 and 2011 respectively. At present, he is pursuing his Ph.D. degree in the Department of Applied Physics, University of Calcutta, India. His present research interests include biomedical signal analysis and processing. Abhishek