- Research Article
- Open Access
Combination of EEG Complexity and Spectral Analysis for Epilepsy Diagnosis and Seizure Detection
- Sheng-Fu Liang^{1, 2},
- Hsu-Chuan Wang^{1} and
- Wan-Lin Chang^{1}
https://doi.org/10.1155/2010/853434
© Sheng-Fu Liang et al. 2010
- Received: 19 August 2009
- Accepted: 6 April 2010
- Published: 16 May 2010
Abstract
Approximately 1% of the world's population has epilepsy, and 25% of epilepsy patients cannot be treated sufficiently by any available therapy. If an automatic seizure-detection system were available, it could reduce the time required by a neurologist to perform an off-line diagnosis by reviewing electroencephalogram (EEG) data, and it could produce an on-line warning signal to alert healthcare professionals or to drive a treatment device such as an electrical stimulator, enhancing the patient's safety and quality of life. This paper describes a systematic evaluation of current approaches to seizure detection in the literature. This evaluation was then used to suggest a reliable, practical epilepsy detection method. The combination of complexity analysis and spectral analysis of an EEG can perform robust evaluations of the collected data. Principal component analysis (PCA) and genetic algorithms (GAs) were applied in conjunction with various linear and nonlinear methods. The best linear models resulted from using all of the features without further processing; for the nonlinear models, applying PCA for feature reduction provided better results than applying GAs. The feasibility of executing the proposed methods on a personal computer for on-line processing was also demonstrated.
Keywords
- Linear Discriminant Analysis
- Principal Component Analysis
- Probabilistic Neural Network
- Epileptogenic Zone
- Ictal EEGs
1. Introduction
Epilepsy is one of the most common neurological disorders. Approximately 1% of the world's population has epilepsy, and up to 5% of people may have at least one seizure during their lifetime [1]. Epilepsy is characterized by a sudden and recurrent malfunction of the brain, a "seizure" [2]. An electroencephalogram (EEG) is a record of the electrical potentials generated by the cerebral cortex's nerve cells, and it has been an especially valuable clinical tool for the evaluation and treatment of epilepsy [3].
A continuous EEG recording lasting as long as one week may be required to detect epilepsy. Examining the entire length of such recordings is both tedious and time-consuming for a well-trained neurologist. The time required to review extensive EEG data could be greatly reduced with the assistance of a reliable automated seizure detection system. In addition, if an online seizure detection method were available, the system could signal healthcare professionals to provide immediate care when a seizure occurs, or it could drive a treatment device (e.g., an electrical stimulator or a drug delivery device) to suppress the seizure and enhance the patient's quality of life [4, 5].
Several methods have been proposed to automatically detect epileptic seizures by analyzing EEG data. During seizures, the scalp EEG of patients with epilepsy is characterized by high-amplitude, synchronized EEG waveforms. Therefore, analysis of EEG data using chaotic nonlinear dynamics (e.g., Lyapunov exponents) [6, 7] and complexity analysis (e.g., entropy) [8–12] has been proposed to analyze seizure discharges. Time-frequency analysis approaches that analyze the fundamental frequencies and the harmonic frequencies of seizure events have also been proposed for seizure detection. These types of analyses include short-time Fourier transforms [13–15] and wavelet transforms [16]. Combinations of wavelet analysis with entropy analysis or combinations of wavelet analysis with Lyapunov exponents have also been proposed for seizure detection by analyzing the complexity of some specific EEG subbands [17, 18]. Because of their self-training capability, neural networks [12, 14, 17–20] and adaptive neurofuzzy inference systems [21] have also been utilized for classification of normal events and seizure events by analyzing the spectra and/or the complexity of the EEG recordings.
The performance of an EEG-based seizure detection model may be affected by at least four factors: the EEG features, the feature extraction/reduction methods, the classifiers, and the number of data classes to be classified. The objective of this study was to systematically evaluate the performance of current approaches described in the literature and to suggest a reliable, practical epilepsy detection method based on the findings of the evaluation. Because the dataset from the Department of Epileptology, University of Bonn, Germany [21, 22], has been widely used for performance demonstration in many studies, the evaluation presented here focused on analyzing methods with this dataset for fair comparisons [8, 12, 14, 18, 23–34].
For EEG features, Srinivasan et al. successfully combined approximate entropy (ApEn) analysis with neural networks to discriminate between normal and ictal EEG signals, with an overall accuracy as high as 100% [12]. The study presented here examined the capability of ApEn analysis to analyze multiclass EEGs, and its performance was enhanced by combining ApEn analysis with the power of EEG subbands or autoregressive models [30]. Genetic algorithms [27] and principal component analysis [18] were compared to examine their ability to select useful features while using various linear and nonlinear methods for classification.
For the number of data classes to be classified, there are five datasets in [21, 22]: normal eyes-open EEGs (Set A), normal eyes-closed EEGs (Set B), interictal EEGs recorded opposite to the epileptogenic zone (Set C) and within the epileptogenic zone (Set D), and ictal EEGs recorded at the epileptogenic zone (Set E). For two-class classification, the studies described in [8, 12, 14, 23–25] classified Set A versus Set E, with accuracies ranging from 92% to 100%. The studies described in [26, 27] classified Set E versus Sets A, B, C, and D, with accuracies ranging from 96% to 97%. For three-class classification, the studies described in [18, 28–30] classified Set A, Set D (Set C in [30]), and Set E, with accuracies ranging from 85% to 96%. Finally, the studies described in [31–34] classified all five datasets, with accuracies ranging from 89% to 99%. In our experiments, the seizure detection approaches were also used to classify two additional groupings: Set D versus Set E, and Sets A and D versus Set E.
The motivation of this study is to present a comprehensive evaluation of state-of-the-art methods for seizure detection and to propose a reliable, practical epilepsy detection method that balances computational complexity and detection accuracy. The combination of complexity analysis and spectral analysis of an EEG, which can perform robust evaluations of the collected data, is proposed. Applying principal component analysis (PCA) for feature reduction and utilizing a radial-basis-function support vector machine as the classifier are developed for multiclass EEG discrimination. For online seizure detection, the temporal and spectral features are integrated with the linear classifiers; this combination can be easily implemented on current processing platforms and performs with high accuracy and low computational cost. In addition, it is feasible for the system to responsively drive a treatment device, such as an electrical stimulator or a drug delivery device, to suppress seizures.
2. Dataset
3. Methods
A typical EEG-based seizure detection model comprises the EEG features, the feature extraction/reduction method, and the classifier. This study systematically evaluated the performance of current approaches with respect to these three elements, and the results of the evaluation were used to suggest methods for achieving a practical epilepsy detection system.
For EEG features, the approximate entropy (ApEn) analysis was evaluated for its ability to analyze multiclass EEGs, and the performance of ApEn analysis was enhanced by incorporating the power of EEG subbands or autoregressive models [30]. Genetic algorithms [27] and principal component analysis [18] were compared for their ability to select features while using various linear and nonlinear methods for classification.
3.1. Spectral and Entropy Analysis
3.1.1. Approximate Entropy (ApEn)
Approximate entropy (ApEn) is a measure that quantifies the regularity or the predictability of a time series [35]. ApEn accounts for the temporal order of points in a time sequence and is therefore a preferred measure of randomness or regularity. It has also been used recently for the detection of epilepsy [8, 12]. Smaller values of ApEn imply a greater likelihood that similar patterns of measurements will be followed by additional similar measurements [35].
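As a concrete sketch, the ApEn computation described above can be written in a few lines of NumPy. The embedding dimension m = 2 and tolerance r = 0.2 × SD used here are common conventions from the ApEn literature, not values stated in this paper:

```python
import numpy as np

def apen(x, m=2, r_factor=0.2):
    """Approximate entropy of a 1-D time series.

    m is the embedding dimension; the tolerance r = r_factor * std(x)
    follows the common r = 0.2*SD convention (an assumption here).
    """
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(m):
        n = len(x) - m + 1
        # All length-m template vectors, one per row.
        templates = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev (max-coordinate) distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance r of each template
        # (self-matches included, as in Pincus' original definition).
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)
```

Consistent with the text, a regular signal (e.g., a sine wave) yields a smaller ApEn than white noise of the same length.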
3.1.2. Spectral Analysis
where P(f) is the log power at frequency f. Combined with one time-domain feature (the approximate entropy), a total of 16 features were used.
where P_i(k) represents the power of the i-th band at the k-th sub-window [37].
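The subband log-power features can be sketched as follows. The sampling rate of 173.61 Hz matches the dataset's window timing reported later in the paper, but the band layout (15 contiguous 2-Hz bands starting at 0 Hz) is an illustrative assumption, not the paper's stated configuration:

```python
import numpy as np

def log_band_powers(window, fs=173.61, n_bands=15, band_width=2.0):
    """Log power of consecutive frequency subbands of one EEG window.

    The paper uses 15 subband powers plus ApEn (16 features in total);
    the band edges and widths here are placeholders for illustration.
    """
    window = np.asarray(window, dtype=float)
    spectrum = np.abs(np.fft.rfft(window)) ** 2           # periodogram
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    feats = []
    for i in range(n_bands):
        lo, hi = i * band_width, (i + 1) * band_width
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(spectrum[mask].sum() + 1e-12))  # log power of band i
    return np.array(feats)
```

For example, a pure 5-Hz tone concentrates its power in the 4–6 Hz band (index 2 under these assumed band edges).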
3.1.3. Autoregressive Model
The autoregressive (AR) model is a parametric model used to describe stationary time series, and it is a popular tool for EEG analysis [38]. An AR model represents the current signal value as the weighted sum of its previous values plus white noise, with the weights determined by the least-mean-square (LMS) criterion. For the analysis presented here, Akaike's information criterion [39] was used to determine the appropriate order of the AR model, which was 20. The weights of the AR model and the ApEn analysis (a total of 21 features) were combined as the set of features for use by the classifiers and were compared with the spectral features.
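A minimal AR fit under the LMS criterion can be obtained with an ordinary least-squares solve; this is one of several equivalent estimators (Yule-Walker and Burg are alternatives), and the order 20 follows the paper's AIC-based choice:

```python
import numpy as np

def ar_coefficients(x, order=20):
    """Least-squares fit of x[n] = sum_k a_k * x[n-k] + e[n].

    Returns the AR weights a_1..a_order; these weights (plus ApEn)
    form the 21-feature set described in the text.
    """
    x = np.asarray(x, dtype=float)
    # Row for target x[n] holds [x[n-1], x[n-2], ..., x[n-order]].
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs
```

Fitting a synthetic AR(2) process recovers its generating weights to within estimation error.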
3.2. Feature Reduction
Principal component analysis and genetic algorithms have been utilized to perform feature reduction in seizure detection methods [18, 27]. Here, these two methods were used in conjunction with the combination of ApEn analysis and spectral power analysis for feature reduction, and the results are compared to those obtained when using the original 16 features without any processing.
3.2.1. Principal Component Analysis
Principal component analysis transforms a set of correlated variables into a new set of uncorrelated variables that can use fewer dimensions to express the relevant information contained in the observation data. It has also been widely used in EEG analysis for dimension reduction or for feature extraction [18, 37]. For this analysis, PCA was applied to the 16 features (ApEn and the powers of the 15 subbands), and the resultant principal components (PCs) were fed to the classifiers for evaluation. The number of PCs was determined based on the best performance of each classifier [18].
3.2.2. Genetic Algorithm
A genetic algorithm (GA) is an adaptive heuristic search algorithm. It starts with an initial population of fixed-length individuals (chromosomes), and the evolution process is governed by selection, crossover, and mutation [40] of the parents to generate the children's generation. A fitness function is defined to evaluate how well a solution (i.e., an individual) solves the problem. Here, the settings and the procedure for the GA followed the approach proposed in [27] for seizure detection. At initialization, the population size was set at 20. Each individual consisted of 16 genes that represented the 16 features (i.e., the ApEn and the powers of the 15 subbands). Each gene was allowed to have a value of 0 or 1: a value of 1 implied that the corresponding feature was selected, and a value of 0 implied that it was excluded. The fitness value of an individual was defined as the inverse of the classification error. Two of the individuals (the elites) in the current generation were guaranteed to survive to the next generation without any modifications. In each generation, 80% of the individuals in the population, excluding the elites, were created through a crossover operation; the remaining 20% were generated through mutation. The algorithm was allowed to run for a maximum of 100 generations, and it would halt early if there were no improvements in the fitness values over 20 consecutive generations [27].
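The GA settings above can be sketched as follows. The fitness callable stands in for the paper's inverse classification error; the single-point crossover and single-bit mutation operators are one reasonable reading of the description, not necessarily the exact original operators:

```python
import numpy as np

def ga_select(fitness, n_genes=16, pop_size=20, generations=100,
              n_elite=2, patience=20, seed=0):
    """Binary GA for feature selection with the settings from the text.

    `fitness` maps a 0/1 mask over the features to a score to maximize
    (the paper uses 1 / classification error). Returns the best mask.
    """
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, size=(pop_size, n_genes))
    best, best_fit, stall = None, -np.inf, 0
    n_cross = int(round(0.8 * (pop_size - n_elite)))   # 80% of non-elites by crossover
    for _ in range(generations):
        fits = np.array([fitness(ind) for ind in pop])
        order = np.argsort(fits)[::-1]                 # best first
        if fits[order[0]] > best_fit:
            best, best_fit, stall = pop[order[0]].copy(), fits[order[0]], 0
        else:
            stall += 1
            if stall >= patience:                      # no gain over 20 generations
                break
        elites = pop[order[:n_elite]]                  # survive unchanged
        children = []
        for _ in range(n_cross):                       # single-point crossover
            p1, p2 = pop[rng.integers(pop_size, size=2)]
            cut = rng.integers(1, n_genes)
            children.append(np.concatenate([p1[:cut], p2[cut:]]))
        while len(children) < pop_size - n_elite:      # remaining 20% by mutation
            child = pop[rng.integers(pop_size)].copy()
            child[rng.integers(n_genes)] ^= 1          # flip one gene
            children.append(child)
        pop = np.vstack([elites, children])
    return best
```

Because the elites are never modified, the best mask found can never be worse than the best individual of the initial population.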
3.3. Classification
To evaluate and compare the performance of the analysis methods, four linear and nonlinear methods were utilized to classify the extracted features as a seizure or a nonseizure event. The four evaluated methods were linear least squares [41], linear discriminant analysis [42], a backpropagation (BP) neural network [43], and the support vector machine with either a linear kernel (LISVM) or a radial-basis-function kernel (RBFSVM) [44].
The linear least squares (LLS) method finds a best-fitting linear model that minimizes the mean square error between the system output and the desired output. Mathematically, it can be stated as finding an approximate solution to an overdetermined system of linear equations. Because the model output is only the weighted sum of the input features, it is suitable for implementation on processors without high computing power or for use in online processing. Linear discriminant analysis (LDA) uses a hyperplane to find the linear combination of features that best separates two or more classes of objects or events. Usually, the within-class, between-class, and mixture scatter matrices are used to formulate the criteria for searching for the hyperplane, so that the distance between the class means is maximized and the within-class variance is minimized [45, 46].
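The LLS classifier reduces to one least-squares solve plus a weighted sum at prediction time, which is what makes it cheap enough for online use. A minimal sketch, with ±1 labels for the two classes (an illustrative encoding, not specified in the paper):

```python
import numpy as np

def lls_train(X, y):
    """Fit the linear least-squares classifier.

    X: (n_samples, n_features) feature matrix; y: labels coded as
    +1 (seizure) / -1 (nonseizure). Solves the overdetermined system
    [X, 1] w = y in the least-squares sense.
    """
    A = np.column_stack([X, np.ones(len(X))])   # append a bias column
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def lls_predict(w, X):
    """Classify by thresholding the weighted sum of the features."""
    A = np.column_stack([X, np.ones(len(X))])
    return np.sign(A @ w)
```

Prediction costs only one dot product per window, so the classifier fits comfortably within the millisecond-scale budgets reported in the timing table later in the paper.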
Backpropagation (BP) neural networks are widely used nonlinear models for pattern recognition and classification problems. A BP network is a multilayer perceptron composed of several layers of neurons; the error between the desired output and the network output is backpropagated from the output layer to the hidden and input layers to update the weights based on the gradient-descent method. Here, a 3-layer feedforward neural network was utilized with 20 neurons in the hidden layer. The log-sigmoid function was used as the activation function of the hidden and output layers. For network training, the learning constant was 0.1 and the number of iterations was 2000.
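A minimal version of this 3-layer network (log-sigmoid activations, learning constant 0.1, 2000 iterations, squared-error loss) can be sketched as below; the weight initialization and full-batch update scheme are assumptions, since the paper does not describe them:

```python
import numpy as np

def logsig(z):
    """Log-sigmoid activation, as used in the hidden and output layers."""
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(X, y, n_hidden=20, lr=0.1, iters=2000, seed=0):
    """Train a 3-layer BP network by gradient descent on squared error.

    Returns a prediction function mapping inputs to outputs in (0, 1).
    A sketch of the described setup, not the original implementation.
    """
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1));    b2 = np.zeros(1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    for _ in range(iters):
        h = logsig(X @ W1 + b1)                 # hidden-layer activations
        out = logsig(h @ W2 + b2)               # network output
        err = out - y
        d_out = err * out * (1 - out)           # backprop through output sigmoid
        d_hid = (d_out @ W2.T) * h * (1 - h)    # backprop through hidden sigmoid
        W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_hid / len(X); b1 -= lr * d_hid.mean(axis=0)
    return lambda Xq: logsig(logsig(Xq @ W1 + b1) @ W2 + b2).ravel()
```

On a simple two-cluster problem, training reduces the squared error relative to the untrained network, illustrating the gradient-descent update.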
The support vector machine (SVM) also uses a hyperplane to separate classes, selecting the hyperplane that maximizes the margin (i.e., the distance to the nearest training points). Maximizing the margin is known to increase the method's generalization capability. The SVM performs structural risk minimization and creates a classifier with a minimized Vapnik-Chervonenkis (VC) dimension; when the VC dimension is low, the expected probability of error is low, which ensures good generalization. The SVM can also simultaneously minimize the empirical risk and the expected risk of pattern classification problems [47]. For the analysis presented here, two kinds of kernels, the linear kernel and the radial-basis function (RBF), were used. The RBF kernel nonlinearly maps samples into a higher-dimensional space to handle cases in which the relation between class labels and attributes is nonlinear. The parameters used were a penalty parameter and a kernel variance for the RBFSVM and a penalty parameter for the LISVM.
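The nonlinear mapping performed by the RBF kernel amounts to evaluating a Gaussian similarity between every pair of samples. A sketch of the kernel matrix computation, with a placeholder width parameter since the paper's penalty and variance values are not given in the text:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """RBF (Gaussian) kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2).

    gamma plays the role of the inverse kernel variance; the value 0.5
    here is illustrative, not the paper's setting.
    """
    # Squared distances via the expansion ||x - y||^2 = x.x - 2 x.y + y.y.
    sq = (X ** 2).sum(1)[:, None] - 2 * X @ Y.T + (Y ** 2).sum(1)[None, :]
    return np.exp(-gamma * np.maximum(sq, 0))   # clamp tiny negative round-off
```

Identical samples map to a kernel value of 1 and the matrix is symmetric with entries in (0, 1]; in practice such a Gram matrix would be handed to a standard SVM solver (e.g., a library SVM accepting a precomputed kernel).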
4. Results
The experiments consisted of two parts: epilepsy diagnosis, based on classification of three or five EEG datasets (normal, interictal, and ictal EEGs), and seizure detection, based on classifying windowed EEG trials as ictal or nonictal. The results demonstrated the feasibility of the seizure detection method for use in online seizure detection. The seizure detection experiment contained three subexperiments: the first distinguished ictal EEGs (Set E) from nonictal EEGs (Sets A and D); the second distinguished ictal EEGs (Set E) from interictal EEGs (Set D); and the third distinguished ictal EEGs (Set E) from nonictal EEGs (Sets A, B, C, and D) when all of the datasets provided in [22] were used.
4.1. Classification
Groupwise average accuracies of various feature extraction methods combined with various classifiers applied to classify sets A, D, and E (standard deviations are noted in parentheses).
Classifier | Feature selection | A | D | E | Accuracy |
---|---|---|---|---|---|
LLS | All features | 100.00 (0.0) | 95.00 (3.9) | 95.50 (1.6) | 96.83 (1.2) |
GA | 99.25 (1.2) | 94.50 (3.3) | 94.75 (3.6) | 96.17 (1.9) | |
ApEn + AR model | 98.50 (2.4) | 95.50 (2.6) | 95.50 (3.1) | 96.50 (1.3) | |
LDA | All features | 100.00 (0.0) | 94.50 (3.9) | 95.75 (1.7) | 96.75 (1.1) |
GA | 99.50 (1.1) | 95.00 (4.4) | 94.50 (2.3) | 96.33 (1.9) | |
ApEn + AR model | 97.50 (2.6) | 96.00 (2.9) | 95.75 (3.1) | 96.41 (1.6) | |
BP | All features | 98.75 (3.1) | 96.50 (2.9) | 97.00 (2.0) | 97.42 (1.4) |
PCA | 100.0 (0.0) | 97.75 (1.8) | 97.00 (2.6) | 98.25 (1.5) | |
GA | 98.50 (1.7) | 91.50 (5.2) | 97.50 (2.4) | 95.83 (2.0) | |
ApEn + AR model | 99.25 (1.2) | 93.00 (5.5) | 89.50 (6.0) | 93.92 (3.0) | |
LISVM | All features | 99.50 (1.1) | 97.00 (2.6) | 98.25 (1.2) | 98.25 (1.1) |
PCA | 99.75 (0.8) | 98.00 (2.0) | 97.25 (1.4) | 98.33 (0.6) | |
GA | 98.75 (2.1) | 93.50 (5.0) | 98.00 (1.1) | 96.75 (1.7) | |
ApEn + AR model | 99.75 (0.8) | 94.25 (3.1) | 94.50 (5.0) | 96.17 (2.2) | |
RBFSVM | All features | 99.75 (0.8) | 97.75 (1.8) | 97.75 (1.4) | 98.42 (0.8) |
PCA | 99.75 (0.8) | 98.25 (1.8) | 98.00 (1.6) | 98.67 (0.7) | |
GA | 99.50 (1.6) | 95.00 (3.7) | 96.50 (2.7) | 97.00 (1.7) | |
ApEn + AR model | 99.75 (0.8) | 92.25 (3.2) | 91.75 (3.9) | 94.58 (1.8) |
Groupwise average accuracies of various feature extraction methods combined with various classifiers applied to classify sets A, B, C, D, and E (standard deviations are noted in parentheses).
Classifier | Feature Selection | A | B | C | D | E | Accuracy |
---|---|---|---|---|---|---|---|
LLS | All features | 94.75 (3.6) | 88.75 (5.0) | 51.00 (8.6) | 60.00 (8.0) | 92.25 (3.6) | 77.95 (1.7) |
LDA | All features | 96.00 (2.9) | 88.25 (5.3) | 51.00 (8.6) | 62.00 (8.1) | 94.25 (4.1) | 78.30 (1.7) |
BP | All features | 95.75 (3.1) | 91.50 (3.8) | 59.25 (8.8) | 63.75 (9.3) | 94.00 (2.7) | 80.85 (1.5) |
PCA | 94.75 (4.9) | 92.50 (3.3) | 63.25 (7.0) | 70.50 (9.5) | 93.00 (2.8) | 82.80 (2.5) | |
LISVM | All features | 94.75 (6.5) | 92.00 (4.4) | 59.50 (5.8) | 64.50 (6.5) | 95.25 (1.4) | 81.20 (1.4) |
PCA | 94.50 (4.4) | 91.75 (4.7) | 62.50 (7.7) | 63.00 (9.1) | 94.50 (3.5) | 81.25 (2.2) | |
RBFSVM | All features | 97.00 (4.4) | 93.00 (3.5) | 71.75 (6.5) | 70.00 (8.7) | 96.00 (3.4) | 85.55 (2.0) |
PCA | 94.25 (7.3) | 92.50 (3.7) | 75.75 (7.4) | 72.00 (6.7) | 95.00 (3.3) | 85.90 (2.4) |
4.2. Toward an Online Seizure Detection System
Total number of windows corresponding to various window lengths used for training and testing the classifiers for seizure detection.
Sets | Window size | Number of windows for training | Number of windows for testing | ||
---|---|---|---|---|---|
Nonseizure | Seizure | Nonseizure | Seizure | ||
D-E | 173 | 1380 | 1380 | 920 | 920 |
256 | 960 | 960 | 640 | 640 | |
512 | 480 | 480 | 320 | 320 | |
AD-E | 173 | 2760 | 1380 | 1840 | 920 |
256 | 1920 | 960 | 1280 | 640 | |
512 | 960 | 480 | 640 | 320 | |
ABCD-E | 173 | 5520 | 1380 | 3680 | 920 |
256 | 3840 | 960 | 2560 | 640 | |
512 | 1920 | 480 | 1280 | 320 |
Sensitivity, specificity, and accuracy of combining various feature extraction methods with various classifiers to distinguish between Set E (ictal) and Set D (interictal).
Window size | Classifier | Feature selection | Sensitivity | Specificity | Accuracy |
---|---|---|---|---|---|
173 | LLS | All features | 95.98 (1.4) | 95.14 (1.0) | 95.56 (0.8) |
GA | 95.90 (1.3) | 94.66 (1.6) | 95.28 (0.9) | ||
ApEn + AR model | 95.88 (0.8) | 94.85 (1.6) | 95.36 (0.8) | ||
LDA | All features | 95.98 (1.4) | 95.14 (1.0) | 95.56 (0.8) | |
GA | 95.79 (0.9) | 94.82 (1.3) | 95.30 ( 0.7) | ||
ApEn + AR model | 95.88 (0.8) | 94.85 (1.6) | 95.36 (0.8) | ||
BP | All features | 95.90 (0.9) | 95.41 (1.2) | 95.66 (0.7) | |
PCA | 95.89 (1.0) | 94.65 (1.2) | 95.27 (0.6) | ||
GA | 95.67 (1.3) | 94.49 (1.9) | 95.08 (1.1) | ||
ApEn + AR model | 93.47 (2.0) | 94.95 (1.4) | 94.21 (1.2) | ||
LISVM | All features | 95.68 (1.0) | 95.62 (1.2) | 95.65 (0.8) | |
PCA | 95.46 (1.4) | 95.16 (1.1) | 95.31 (0.8) | ||
GA | 95.27 (1.5) | 95.29 (1.0) | 95.28 (0.9) | ||
ApEn + AR model | 94.68 (1.2) | 94.73 (1.6) | 94.71 (1.0) | ||
RBFSVM | All features | 96.04 (1.1) | 95.29 (1.2) | 95.67 (0.7) | |
PCA | 95.95 (1.5) | 95.09 (1.2) | 95.52 (0.9) | ||
GA | 95.86 (1.0) | 95.32 (1.8) | 95.59 (0.9) | ||
ApEn + AR model | 92.84 (2.9) | 93.54 (1.8) | 93.19 (1.6) | ||
256 | LLS | All features | 95.48 (1.8) | 96.20 (1.5) | 95.84 (1.2) |
GA | 95.27 (1.7) | 95.88 (1.3) | 95.57 (1.2) | ||
ApEn + AR model | 96.23 (0.6) | 95.08 (2.2) | 95.66 (1.2) | ||
LDA | All features | 95.48 (1.8) | 96.20 (1.5) | 95.84 (1.2) | |
GA | 94.64 (1.8) | 96.34 (1.2) | 95.49 (1.2) | ||
ApEn + AR model | 96.23 (0.6) | 95.08 (2.2) | 95.66 (1.2) | ||
BP | All features | 96.03 (1.2) | 95.70 (1.3) | 95.87 (0.9) | |
PCA | 95.72 (1.1) | 96.03 (1.3) | 95.88 (0.6) | ||
GA | 95.75 (1.4) | 95.03 (1.5) | 95.39 (0.9) | ||
ApEn + AR model | 93.34 (2.5) | 95.58 (1.6) | 94.46 (1.3) | ||
LISVM | All features | 95.61 (1.3) | 96.41 (1.4) | 96.01 (0.9) | |
PCA | 95.19 (1.2) | 96.11 (1.6) | 95.65 (1.0) | ||
GA | 95.19 (1.2) | 95.91 (1.4) | 95.55 (0.7) | ||
ApEn + AR model | 94.67 (1.7) | 95.06 (1.9) | 94.87 (1.4) | |
RBFSVM | All features | 96.34 (1.4) | 96.53 (1.3) | 96.44 (0.9) | |
PCA | 96.28 (1.3) | 96.38 (1.4) | 96.33 (1.0) | ||
GA | 95.67 (1.2) | 95.81 (1.3) | 95.74 (0.9) | ||
ApEn + AR model | 92.39 (3.8) | 93.88 (2.1) | 93.13 (2.1) | ||
512 | LLS | All features | 96.03 (2.0) | 96.56 (1.8) | 96.30 (1.4) |
GA | 95.91 (2.0) | 96.47 (1.8) | 96.19 (1.4) | ||
ApEn + AR model | 96.44 (0.8) | 95.38 (2.1) | 95.91 (1.1) | ||
LDA | All features | 96.30 (2.0) | 96.56 (1.8) | 96.30 (1.4) | |
GA | 96.03 (1.9) | 96.03 (1.7) | 96.03 (1.4) | ||
ApEn + AR model | 96.44 (0.8) | 95.38 (2.1) | 95.91 (1.1) | ||
BP | All features | 96.63 (1.3) | 95.88 (1.7) | 96.25 (1.0) | |
PCA | 96.44 (1.5) | 95.78 (1.3) | 96.11 (0.8) | ||
GA | 96.41 (1.9) | 95.47 (1.8) | 95.94 (0.7) | ||
ApEn + AR model | 94.25 (1.4) | 95.56 (2.0) | 94.91 (1.1) | ||
LISVM | All features | 96.91 (1.2) | 96.59 (1.6) | 96.75 (1.0) | |
PCA | 94.47 (3.6) | 98.57 (0.8) | 97.75 (0.6) | ||
GA | 96.88 (1.4) | 96.16 (1.7) | 96.52 (1.0) | ||
ApEn + AR model | 93.94 (2.4) | 94.88 (2.6) | 94.41 (1.7) | ||
RBFSVM | All features | 97.22 (1.3) | 96.63 (1.6) | 96.92 (1.1) | |
PCA | 97.00 (2.0) | 99.18 (0.6) | 98.74 (0.5) | ||
GA | 96.44 (1.5) | 96.13 (1.4) | 96.28 (0.9) | ||
ApEn + AR model | 92.00 (4.1) | 94.53 (2.6) | 93.27 (2.3) |
Sensitivity, specificity, and accuracy of combining various feature extraction methods with various classifiers to distinguish between Set E (ictal windows) and Sets A and D (normal and interictal EEGs).
Window size | Classifier | Feature Selection | Sensitivity | Specificity | Accuracy |
---|---|---|---|---|---|
173 | LLS | All features | 93.61 (2.1) | 97.79 (0.8) | 96.40 (0.6) |
GA | 92.85 (2.5) | 97.69 (0.8) | 96.08 (0.7) | ||
ApEn + AR model | 92.53 (1.8) | 98.10 (0.7) | 96.25 (0.7) | ||
LDA | All features | 95.60 (1.4) | 97.32 (0.9) | 96.74 (0.5) | |
GA | 95.67 (1.7) | 96.99 (1.2) | 96.55 (0.7) | ||
ApEn + AR model | 94.03 (1.7) | 97.55 (0.8) | 96.38 (0.7) | ||
BP | All features | 94.59 (2.5) | 97.42 (1.2) | 96.47 (0.8) | |
PCA | 94.26 (2.3) | 97.91 (0.8) | 96.70 (0.8) | ||
GA | 89.34 (3.9) | 98.45 (0.5) | 96.63 (0.5) | ||
ApEn + AR model | 93.97 (2.4) | 97.38 (1.1) | 96.24 (0.7) | ||
LISVM | All features | 94.75 (1.8) | 97.84 (0.9) | 96.81 (0.5) | |
PCA | 93.91 (2.2) | 97.64 (0.8) | 96.40 (0.6) | ||
GA | 93.97 (2.6) | 97.62 (1.0) | 96.40 (0.9) | ||
ApEn + AR model | 94.36 (2.1) | 97.42 (1.0) | 96.40 (0.7) | ||
RBFSVM | All features | 95.22 (1.9) | 97.98 (0.9) | 97.06 (0.6) | |
PCA | 98.85 (1.9) | 97.90 (0.8) | 96.88 (0.6) | ||
GA | 94.75 (2.2) | 97.72 (1.1) | 96.73 (0.8) | ||
ApEn + AR model | 94.07 (1.8) | 97.23 (1.1) | 96.18 (0.6) | |
256 | LLS | All features | 94.92 (1.5) | 98.37 (0.8) | 97.22 (0.5) |
GA | 94.33 (1.7) | 98.15 (0.7) | 96.88 (0.4) | ||
ApEn + AR model | 93.39 (1.8) | 98.08 (0.8) | 96.49 (0.7) | ||
LDA | All features | 96.06 (1.2) | 97.77 (1.0) | 97.20 (0.5) | |
GA | 95.92 (1.8) | 97.24 (1.3) | 96.80 (0.6) | ||
ApEn + AR model | 96.36 (1.7) | 97.51 (0.9) | 96.46 (0.7) | ||
BP | All features | 95.89 (2.1) | 97.79 (1.4) | 97.16 (0.9) | |
PCA | 95.25 (1.5) | 98.58 (0.7) | 97.47 (0.5) | ||
GA | 95.75 (3.0) | 97.35 (1.4) | 96.82 (1.1) | ||
ApEn + AR model | 94.16 (2.6) | 97.80 (1.2) | 96.58 (0.6) | ||
LISVM | All features | 96.19 (1.8) | 98.23 (1.0) | 97.55 (0.6) | |
PCA | 95.08 (1.9) | 98.20 (0.9) | 97.16 (0.5) | ||
GA | 95.45 (2.6) | 97.91 (1.2) | 97.09 (0.9) | ||
ApEn + AR model | 94.83 (2.0) | 97.62 (1.0) | 96.69 (0.6) | ||
RBFSVM | All features | 96.88 (1.8) | 98.41 (1.0) | 97.90 (0.7) | |
PCA | 96.67 (1.7) | 98.43 (1.0) | 97.83 (0.6) | ||
GA | 96.06 (2.5) | 98.17 (0.9) | 97.47 (0.7) | ||
ApEn + AR model | 93.80 (2.2) | 97.09 (1.2) | 95.99 (1.7) | ||
512 | LLS | All features | 95.59 (1.3) | 98.58 (0.7) | 97.58 (0.3) |
GA | 95.50 (2.0) | 98.39 (1.0) | 97.43 (0.9) | ||
ApEn + AR model | 93.94 (2.0) | 98.13 (0.8) | 96.73 (0.7) | ||
LDA | All features | 97.03 (1.0) | 98.13 (0.9) | 97.76 (0.4) | |
GA | 96.81 (1.7) | 97.78 (0.9) | 97.46 (0.9) | ||
ApEn + AR model | 94.59 (1.9) | 97.77 (0.9) | 96.71 (0.7) | ||
BP | All features | 97.44 (1.5) | 97.81 (1.4) | 97.69 (0.9) | |
PCA | 96.78 (1.6) | 98.47 (0.9) | 97.90 (0.5) | ||
GA | 95.91 (2.7) | 98.05 (1.3) | 97.33 (1.0) | ||
ApEn + AR model | 95.00 (1.8) | 97.88 (1.0) | 96.92 (0.6) | ||
LISVM | All features | 97.28 (1.4) | 98.66 (0.8) | 98.20 (0.5) | |
PCA | 96.38 (1.3) | 98.47 (0.9) | 97.77 (0.5) | ||
GA | 96.88 (2.0) | 98.33 (1.1) | 97.84 (0.7) | ||
ApEn + AR model | 94.69 (2.5) | 97.70 (0.9) | 96.70 (0.8) | ||
RBFSVM | All features | 97.53 (1.4) | 98.73 (1.0) | 98.33 (0.7) | |
PCA | 97.44 (1.5) | 98.77 (0.9) | 98.32 (0.6) | ||
GA | 97.37 (1.9) | 98.53 (0.9) | 98.15 (0.8) | ||
ApEn + AR model | 93.63 (2.4) | 97.28 (1.3) | 96.06 (1.0) |
Sensitivity, specificity, and accuracy of combining various feature extraction methods with various classifiers to distinguish between Set E (ictal windows) and all nonictal windows (Sets A, B, C, and D).
Window size | Classifier | Feature selection | Sensitivity | Specificity | Accuracy |
---|---|---|---|---|---|
173 | LLS | All features | 88.40 (2.9) | 98.67 (0.3) | 96.62 (0.4) |
GA | 89.03 (2.6) | 98.67 (0.3) | 96.74 (0.4) | ||
ApEn + AR model | 86.13 (3.5) | 99.17 (0.2) | 96.56 (0.6) | ||
LDA | All features | 93.71 (1.8) | 97.90 (0.5) | 97.07 (0.3) | |
GA | 93.84 (1.6) | 97.82 (0.5) | 97.03 (0.3) | ||
ApEn + AR model | 89.87 (3.1) | 98.53 (0.3) | 96.80 (0.5) | ||
BP | All features | 89.46 (3.5) | 98.57 (0.4) | 96.74 (0.5) | |
PCA | 92.26 (2.2) | 98.56 (0.3) | 97.30 (0.4) | ||
GA | 89.34 (3.9) | 98.45 (0.5) | 96.63 (0.5) | ||
ApEn + AR model | 91.83 (3.1) | 98.67 (0.5) | 97.30 (0.5) | ||
LISVM | All features | 92.75 (1.9) | 98.43 (0.5) | 97.29 (0.3) | |
PCA | 91.89 (2.2) | 98.34 (0.5) | 97.05 (0.3) | ||
GA | 92.42 (2.1) | 98.41 (0.5) | 97.22 (0.3) | ||
ApEn + AR model | 98.67 (0.4) | 91.79 (2.7) | 97.29 (0.5) | ||
RBFSVM | All features | 93.09 (1.9) | 98.61 (0.5) | 97.50 (0.3) | |
PCA | 92.49 (2.5) | 98.53 (0.4) | 97.46 (0.3) | ||
GA | 92.86 (2.2) | 98.40 (0.4) | 97.30 (0.3) | ||
ApEn + AR model | 92.09 (2.6) | 98.35 (0.4) | 97.10 (0.4) | ||
256 | LLS | All features | 89.23 (3.3) | 99.35 (0.3) | 97.33 (0.6) |
GA | 89.33 (3.1) | 99.30 (0.3) | 97.31 (0.5) | ||
ApEn + AR model | 86.72 (3.8) | 99.24 (0.3) | 96.74 (0.7) | ||
LDA | All features | 93.45 (2.2) | 98.73 (0.4) | 97.67 (0.3) | |
GA | 94.03 (2.2) | 98.66 (0.5) | 97.73 (0.4) | ||
ApEn + AR model | 90.14 (3.5) | 98.67 (0.3) | 96.96 (0.7) | ||
BP | All features | 92.27 (3.2) | 98.68 (0.6) | 97.40 (0.4) | |
PCA | 92.55 (2.3) | 99.09 (0.5) | 97.78 (0.4) | ||
GA | 89.84 (3.9) | 98.83 (0.6) | 97.03 (0.7) | ||
ApEn + AR model | 92.39 (2.7) | 98.91 (0.5) | 97.61 (0.5) | ||
LISVM | All features | 93.30 (2.2) | 98.97 (0.6) | 97.84 (0.3) | |
PCA | 93.00 (2.4) | 98.96 (0.5) | 97.77 (0.3) | ||
GA | 92.97 (2.4) | 98.84 (0.6) | 97.67 (0.4) | ||
ApEn + AR model | 92.47 (2.6) | 98.77 (0.4) | 97.51 (0.5) | ||
RBFSVM | All features | 94.69 (1.9) | 99.10 (0.6) | 98.21 (0.5) | |
PCA | 94.48 (1.9) | 99.12 (0.5) | 98.19 (0.5) | ||
GA | 94.34 (2.1) | 99.03 (0.5) | 98.09 (0.4) | ||
ApEn + AR model | 92.48 (2.6) | 98.48 (0.5) | 97.28 (0.5) | ||
512 | LLS | All features | 91.40 (2.7) | 99.42 (0.2) | 97.82 (0.5) |
GA | 90.91 (4.2) | 99.32 (0.3) | 97.64 (0.7) | ||
ApEn + AR model | 87.31 (0.4) | 99.32 (0.3) | 96.92 (0.8) | ||
LDA | All features | 94.25 (2.1) | 99.04 (0.4) | 98.08 (0.3) | |
GA | 94.66 (2.2) | 98.88 (0.5) | 98.04 (0.4) | ||
ApEn + AR model | 90.59 (3.5) | 98.79 (0.4) | 97.15 (0.7) | ||
BP | All features | 92.16 (3.6) | 98.72 (0.5) | 97.41 (0.7) | |
PCA | 94.94 (2.0) | 99.09 (0.5) | 98.26 (0.4) | ||
GA | 93.63 (2.9) | 98.95 (0.3) | 97.88 (0.4) | ||
ApEn + AR model | 93.75 (2.4) | 98.88 (0.6) | 97.85 (0.5) | ||
LISVM | All features | 95.28 (1.8) | 99.13 (0.5) | 98.36 (0.4) | |
PCA | 94.72 (2.1) | 99.07 (0.5) | 98.20 (0.3) | ||
GA | 94.44 (3.4) | 98.98 (0.5) | 98.08 (0.7) | ||
ApEn + AR model | 93.47 (2.8) | 98.86 (0.4) | 97.78 (0.6) | ||
RBFSVM | All features | 95.78 (1.8) | 99.19 (0.6) | 98.51 (0.5) | |
PCA | 95.78 (1.8) | 99.19 (0.6) | 98.51 (0.5) | ||
GA | 95.47 (1.4) | 99.07 (0.5) | 98.35 (0.4) | ||
ApEn + AR model | 92.43 (3.4) | 98.58 (0.6) | 97.35 (0.7) |
Table 4 shows that the performance of the linear classifiers was similar to that of the nonlinear models, and the average accuracy of seizure detection could reach roughly 96% for the 173-point EEG windows (a data window of approximately 1 second). The detection accuracy increased to 98% for the 512-point EEG windows (a data window of approximately 3 seconds). For the linear models, utilizing all of the features yielded the best accuracy; for the nonlinear models, the best accuracy was obtained with features extracted by PCA. The RBFSVM combined with PCA had the highest average accuracy.
Time consumed by the proposed method, implemented on a personal computer (PC), to analyze EEG data of different window lengths. The program was coded in C and run on an Intel Core 2 6600 processor at 2.4 GHz with 2 GB of RAM.
Window size | Execution time (ms) | FFT | Entropy | LLS/LDA | Total |
---|---|---|---|---|---|
173 (1 s) | 1000 cycles/one cycle | 109/0.109 ms | 422/0.422 ms | 16/0.016 ms | 547/0.547 ms |
256 (1.48 s) | 1000 cycles/one cycle | 125/0.125 ms | 453/0.453 ms | 16/0.016 ms | 594/0.594 ms |
512 (2.95 s) | 1000 cycles/one cycle | 172/0.172 ms | 844/0.844 ms | 16/0.016 ms | 1032/1.032 ms |
5. Discussion
Classes | Authors (year) | Method | Dataset | Accuracy |
---|---|---|---|---|
2 | Nigam et al. [23] (2004) | Nonlinear preprocessing filter, diagnostic artificial neural network (LAMSTAR) | A, E | 97.2 |
Srinivasan et al. [14] (2005) | Time & frequency domain features, recurrent neural network (RNN) | A, E | 99.6 | |
Kannathal et al. [8] (2005) | Entropy measures, adaptive neurofuzzy inference system (ANFIS) | A, E | 92.22 | |
Polat et al. [24] (2006) | Fast Fourier transform (FFT), decision tree (DT) | A, E | 98.72 | 
Subasi [25] (2007) | Discrete wavelet transform (DWT), mixture of expert model | A, E | 95 | |
Srinivasan et al. [12] (2007) | Approximate entropy, artificial neural network | A, E | 100 | |
Tzallas et al. [26] (2007) | Time frequency (TF) analysis, artificial neural network (ANN) | (A, B, C, D), E | 97.73 | |
Ocak [27] (2008) | Approximate entropy & discrete wavelet transform (DWT), genetic algorithm (GA) | (A, B, C, D), E | 96.15 | 
This paper | Time frequency & approximate entropy analysis, linear or nonlinear classifiers | (A, B, C, D), E | 97.82–98.51 | |
3 | Guler et al. [28] (2005) | Lyapunov exponents, recurrent neural network (RNN) | A, D, E | 96.79 |
Sadati et al. [29] (2006) | Discrete wavelet transform (DWT), adaptive neural fuzzy network (ANFN) | A, D, E | 85.9 | |
Ghosh-Dastidat et al. [18] (2008) | Chaos theory and wavelet analysis, PCA, radical basis function neural network | A, D, E | 96.73 | |
Mousavi et al. [30] (2008) | AR model, wavelet decomposition, MLP classifier | A, C, E | 96 | |
This paper | Time frequency & approximate entropy analysis, linear or nonlinear classifiers | A, D, E | 96.83–98.67 | |
5 | Güler et al. [32] (2005) | Wavelet transform, adaptive neurofuzzy inference system | A, B, C, D, E | 98.68 |
Güler et al. [33] (2007) | Wavelet transform, Lyapunov exponents, support vector machine | A, B, C, D, E | 99.28 | |
Übeyli et al. [31] (2007) | Eigenvector methods, Mixture of expert models | A, B, C, D, E | 98.60 | |
Tzallas et al. [34] (2009) | Time frequency (TF) analysis, artificial neural network (ANN) | A, B, C, D, E | 89 | |
This paper | Time frequency & approximate entropy analysis, RBFSVM | A, B, C, D, E | 85.9 |
For the three-class discrimination, combining the median-filtered ApEn data with the multiband EEG power spectra yielded average accuracies ranging from 96.83% to 98.67% across the linear and nonlinear classification methods. These accuracies were superior to those of related methods that use recurrent neural networks [28], adaptive neural fuzzy networks [29], or radial basis-function neural networks [18].
For the five-class discrimination, the average accuracy of the proposed features combined with the RBFSVM was 85.9%. Our best result among the ten tests reached 90%, close to the result in [34], but this is not satisfactory; the best result in the literature is the 99.28% reported in [33]. Most of our errors were misclassifications between Sets C and D, both of which are interictal EEGs of epileptic patients recorded with intracranial electrodes. These misclassifications therefore do not greatly affect the applicability of the developed methods to epilepsy diagnosis (discriminating between the EEGs of healthy people and the epileptic-seizure EEGs of patients) or to seizure detection (discriminating between the ictal and nonictal EEGs of epileptic patients).
For the two-class classification that distinguishes Set E from Sets (A, B, C, D) [26, 27], the approach presented here achieved the best average accuracies, 1–2% higher than those of the methods described in the literature. These comparisons show that integrating temporal and spectral features with linear classifiers can achieve epilepsy diagnosis or seizure detection with high accuracy and low computational cost.
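The spectral half of that feature set can be illustrated as FFT-based band powers. The sketch below is hedged: the Hann window, the classical band edges, and the 173.61 Hz sampling rate follow common conventions for this kind of scalp-EEG data and may differ from the paper's exact settings, and the test signal is synthetic.

```python
import numpy as np

FS = 173.61  # assumed sampling rate (Hz); adjust to the actual recordings

def band_powers(window, fs=FS):
    """Relative spectral power in the classical EEG bands via an FFT."""
    window = np.asarray(window, dtype=float)
    spec = np.abs(np.fft.rfft(window * np.hanning(len(window)))) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 60)}
    total = spec[(freqs >= 0.5) & (freqs < 60)].sum()
    return {name: spec[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in bands.items()}

# A 512-point window dominated by a 10 Hz (alpha-band) oscillation
t = np.arange(512) / FS
x = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.default_rng(1).normal(size=512)
p = band_powers(x)
print(p)
```

Concatenating these five relative powers with the ApEn value gives a compact per-window feature vector of the kind a linear classifier can handle cheaply.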
Real-time operation is also an issue for an online seizure-warning or seizure-control system. The required operations (FFT analysis, approximate entropy analysis, and the LLS/LDA classification method) can be readily implemented on current processing platforms designed for various online applications. The results described here and in the literature (Table 8) were obtained on a database selected and cut from continuous, multichannel EEG recordings after visual inspection for artifacts such as muscle activity and eye movement. Further study evaluating the performance of the seizure-detection methods on continuous EEG recordings that encompass various behaviors and physiological states is required before clinical application. In addition, these methods may require modification if they are applied to other types of seizures, such as absence seizures, which produce rhythmic oscillations in the fundamental and harmonic frequency bands of the scalp EEG [48, 49].
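An online detector along these lines reduces to a simple sliding-window loop. The sketch below is purely hypothetical: a window-energy score against an adaptive baseline stands in for the FFT-plus-entropy features and classifier described in this paper, and the 4× threshold ratio, window sizes, and synthetic burst signal are all invented for illustration.

```python
import numpy as np

def detect_stream(signal, fs, win_sec=1.0, step_sec=0.5, ratio=4.0):
    """Slide a window over a continuous recording and flag suspect windows.

    Returns alarm onset times in seconds. The energy score is a stand-in
    for a real per-window classifier output.
    """
    win, step = int(win_sec * fs), int(step_sec * fs)
    baseline = None
    alarms = []
    for start in range(0, len(signal) - win + 1, step):
        energy = float(np.mean(np.square(signal[start:start + win])))
        if baseline is None:
            baseline = energy                           # seed from first window
        if energy > ratio * baseline:
            alarms.append(start / fs)                   # flag this window
        else:
            baseline = 0.95 * baseline + 0.05 * energy  # track quiet background only
    return alarms

# 30 s of synthetic "EEG" at 100 Hz with a high-amplitude burst at 10-12 s
rng = np.random.default_rng(0)
fs = 100
sig = rng.normal(0.0, 1.0, 30 * fs)
sig[10 * fs:12 * fs] *= 5.0
alarms = detect_stream(sig, fs)
print(alarms)
```

Updating the baseline only on non-flagged windows keeps a long event from contaminating the background estimate, a common precaution in online change detectors.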
Declarations
Acknowledgment
This work was supported by the National Science Council of Taiwan under Grants Nos. NSC 97-2627-B-006-005 and NSC 98-2627-B-006-001.
References
- [1] Engel J: Seizure and Epilepsy. Davis, Philadelphia, Pa, USA; 1989.
- [2] Lehnertz K, Mormann F, Kreuz T, et al.: Seizure prediction by nonlinear EEG analysis. IEEE Engineering in Medicine and Biology Magazine 2003, 22(1):57-63. doi:10.1109/MEMB.2003.1191451
- [3] Subasi A: Epileptic seizure detection using dynamic wavelet network. Expert Systems with Applications 2005, 29(2):343-355. doi:10.1016/j.eswa.2005.04.007
- [4] Iasemidis LD: Epileptic seizure prediction and control. IEEE Transactions on Biomedical Engineering 2003, 50(5):549-558. doi:10.1109/TBME.2003.810705
- [5] Stacey WC, Litt B: Technology insight: neuroengineering and epilepsy—designing devices for seizure control. Nature Clinical Practice Neurology 2008, 4(4):190-201.
- [6] Iasemidis LD, Zaveri HP, Sackellares JC, Williams WJ, Hood TW: Nonlinear dynamics of electrocorticographic data. Journal of Clinical Neurophysiology 1988, 5:339.
- [7] Iasemidis LD, Sackellares JC, Zaveri HP, Williams WJ: Phase space topography and the Lyapunov exponent of electrocorticograms in partial seizures. Brain Topography 1990, 2(3):187-201. doi:10.1007/BF01140588
- [8] Kannathal N, Choo ML, Acharya UR, Sadasivan PK: Entropies for detection of epilepsy in EEG. Computer Methods and Programs in Biomedicine 2005, 80(3):187-194. doi:10.1016/j.cmpb.2005.06.012
- [9] Li X, Ouyang G, Richards DA: Predictability analysis of absence seizures with permutation entropy. Epilepsy Research 2007, 77(1):70-74. doi:10.1016/j.eplepsyres.2007.08.002
- [10] Rosso OA, Blanco S, Yordanova J, et al.: Wavelet entropy: a new tool for analysis of short duration brain electrical signals. Journal of Neuroscience Methods 2001, 105(1):65-75. doi:10.1016/S0165-0270(00)00356-3
- [11] Rosso OA: Entropy changes in brain function. International Journal of Psychophysiology 2007, 64(1):75-80. doi:10.1016/j.ijpsycho.2006.07.010
- [12] Srinivasan V, Eswaran C, Sriraam N: Approximate entropy-based epileptic EEG detection using artificial neural networks. IEEE Transactions on Information Technology in Biomedicine 2007, 11(3):288-295.
- [13] Van Hese P, Martens J-P, Boon P, Dedeurwaerdere S, Lemahieu I, Van de Walle R: Detection of spike and wave discharges in the cortical EEG of genetic absence epilepsy rats from Strasbourg. Physics in Medicine and Biology 2003, 48(12):1685-1700. doi:10.1088/0031-9155/48/12/302
- [14] Srinivasan V, Eswaran C, Sriraam N: Artificial neural network based epileptic detection using time-domain and frequency-domain features. Journal of Medical Systems 2005, 29(6):647-660. doi:10.1007/s10916-005-6133-1
- [15] Schuyler R, White A, Staley K, Cios KJ: Epileptic seizure detection. IEEE Engineering in Medicine and Biology Magazine 2007, 26(2):74-81.
- [16] Bosnyakova D, Gabova A, Zharikova A, Gnezditski V, Kuznetsova G, van Luijtelaar G: Some peculiarities of time-frequency dynamics of spike-wave discharges in humans and rats. Clinical Neurophysiology 2007, 118(8):1736-1743. doi:10.1016/j.clinph.2007.04.013
- [17] Ghosh-Dastidar S, Adeli H, Dadmehr N: Mixed-band wavelet-chaos-neural network methodology for epilepsy and epileptic seizure detection. IEEE Transactions on Biomedical Engineering 2007, 54(9):1545-1551.
- [18] Ghosh-Dastidar S, Adeli H, Dadmehr N: Principal component analysis-enhanced cosine radial basis function neural network for robust epilepsy and seizure detection. IEEE Transactions on Biomedical Engineering 2008, 55(2):512-518.
- [19] Alkan A, Koklukaya E, Subasi A: Automatic seizure detection in EEG using logistic regression and artificial neural network. Journal of Neuroscience Methods 2005, 148(2):167-176. doi:10.1016/j.jneumeth.2005.04.009
- [20] Acır N, Öztura İ, Kuntalp M, Baklan B, Güzeliş C: Automatic detection of epileptiform events in EEG by a three-stage procedure based on artificial neural networks. IEEE Transactions on Biomedical Engineering 2005, 52(1):30-40. doi:10.1109/TBME.2004.839630
- [21] Andrzejak RG, Lehnertz K, Mormann F, Rieke C, David P, Elger CE: Indications of nonlinear deterministic and finite-dimensional structures in time series of brain electrical activity: dependence on recording region and brain state. Physical Review E 2001, 64(6):061907.
- [22] Andrzejak RG: EEG time series download page. http://www.meb.uni-bonn.de/epileptologie/cms/upload/workgroup/lehnertz/eegdata.html
- [23] Nigam VP, Graupe D: A neural-network-based detection of epilepsy. Neurological Research 2004, 26(1):55-60. doi:10.1179/016164104773026534
- [24] Polat K, Günes S: Classification of epileptiform EEG using a hybrid system based on decision tree classifier and fast Fourier transform. Applied Mathematics and Computation 2007, 187(2):1017-1026. doi:10.1016/j.amc.2006.09.022
- [25] Subasi A: EEG signal classification using wavelet feature extraction and a mixture of expert model. Expert Systems with Applications 2007, 32(4):1084-1093. doi:10.1016/j.eswa.2006.02.005
- [26] Tzallas AT, Tsipouras MG, Fotiadis DI: Automatic seizure detection based on time-frequency analysis and artificial neural networks. Computational Intelligence and Neuroscience 2007, 13 pages.
- [27] Ocak H: Optimal classification of epileptic seizures in EEG using wavelet analysis and genetic algorithm. Signal Processing 2008, 88(7):1858-1867. doi:10.1016/j.sigpro.2008.01.026
- [28] Güler NF, Übeyli ED, Güler I: Recurrent neural networks employing Lyapunov exponents for EEG signals classification. Expert Systems with Applications 2005, 29(3):506-514. doi:10.1016/j.eswa.2005.04.011
- [29] Sadati N, Mohseni HR, Maghsoudi A: Epileptic seizure detection using neural fuzzy networks. Proceedings of the IEEE International Conference on Fuzzy Systems, July 2006, Vancouver, Canada, 596-600.
- [30] Mousavi SR, Niknazar M, Vahdat BV: Epileptic seizure detection using AR model on EEG signals. Proceedings of the Cairo International Biomedical Engineering Conference (CIBEC '08), December 2008, Cairo, Egypt, 1-4.
- [31] Übeyli ED, Güler I: Features extracted by eigenvector methods for detecting variability of EEG signals. Pattern Recognition Letters 2007, 28(5):592-603. doi:10.1016/j.patrec.2006.10.004
- [32] Güler I, Übeyli ED: Adaptive neuro-fuzzy inference system for classification of EEG signals using wavelet coefficients. Journal of Neuroscience Methods 2005, 148(2):113-121. doi:10.1016/j.jneumeth.2005.04.013
- [33] Güler I, Übeyli ED: Multiclass support vector machines for EEG-signals classification. IEEE Transactions on Information Technology in Biomedicine 2007, 11(2):117-126.
- [34] Tzallas AT, Tsipouras MG, Fotiadis DI: Epileptic seizure detection in EEGs using time-frequency analysis. IEEE Transactions on Information Technology in Biomedicine 2009, 13(5):703-710.
- [35] Pincus SM: Approximate entropy as a measure of system complexity. Proceedings of the National Academy of Sciences of the United States of America 1991, 88(6):2297-2301. doi:10.1073/pnas.88.6.2297
- [36] Richman JS, Moorman JR: Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology 2000, 278(6):H2039-H2049.
- [37] Lin C-T, Wu R-C, Jung T-P, Liang S-F, Huang T-Y: Estimating driving performance based on EEG spectrum analysis. EURASIP Journal on Applied Signal Processing 2005, 2005(19):3165-3174. doi:10.1155/ASP.2005.3165
- [38] Thakor NV, Tong S: Advances in quantitative electroencephalogram analysis methods. Annual Review of Biomedical Engineering 2004, 6:453-495. doi:10.1146/annurev.bioeng.5.040202.121601
- [39] Kay SM: Modern Spectral Estimation: Theory and Application. Prentice Hall, Englewood Cliffs, NJ, USA; 1988.
- [40] Mitchell M: An Introduction to Genetic Algorithms. MIT Press, Cambridge, Mass, USA; 1996.
- [41] Golub G: Numerical methods for solving linear least squares problems. Numerische Mathematik 1965, 7(3):206-216. doi:10.1007/BF01436075
- [42] Friedman JH: Regularized discriminant analysis. Journal of the American Statistical Association 1989, 84(405):165-175. doi:10.2307/2289860
- [43] Rumelhart DE, Hinton GE, Williams RJ: Learning representations by back-propagating errors. Nature 1986, 323(6088):533-536. doi:10.1038/323533a0
- [44] Cortes C, Vapnik V: Support-vector networks. Machine Learning 1995, 20(3):273-297.
- [45] Kuo B-C, Landgrebe DA: Nonparametric weighted feature extraction for classification. IEEE Transactions on Geoscience and Remote Sensing 2004, 42(5):1096-1105.
- [46] Lin C-T, Lin K-L, Ko L-W, Liang S-F, Kuo B-C, Chung I-F: Nonparametric single-trial EEG feature extraction and classification of driver's cognitive responses. EURASIP Journal on Applied Signal Processing 2008, 10 pages.
- [47] Lin C-T, Yeh C-M, Liang S-F, Chung J-F, Kumar N: Support-vector-based fuzzy neural network for pattern classification. IEEE Transactions on Fuzzy Systems 2006, 14(1):31-41.
- [48] Li X, Ouyang G, Richards DA: Predictability analysis of absence seizures with permutation entropy. Epilepsy Research 2007, 77(1):70-74. doi:10.1016/j.eplepsyres.2007.08.002
- [49] Shaw F-Z: Is spontaneous high-voltage rhythmic spike discharge in Long Evans rats an absence-like seizure activity? Journal of Neurophysiology 2004, 91(1):63-77.
Copyright
This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.