**3. Epileptic seizure prediction and detection**

There is a close link between data analysis and soft computing. Data may be qualitative or quantitative; quantitative data can give an exact solution to a problem. Once collected, the data are pre-processed: in this stage the raw data are transformed into a form suitable for analysis. Any type of data has to be pre-processed before analysis. The main principle of data pre-processing is to eliminate irrelevant and redundant data (noise) in order to improve the detection accuracy of the system. In signal processing, such errors are referred to as artifacts or noise, and unwanted information can be removed from the raw data by noise reduction. Different types of algorithms are available for data pre-processing. In EEG signal processing for epileptic seizure detection, for example, artifacts can arise from physiological or mechanical sources; respiratory, cardiac/pulse, eye-movement, and electromyographic signals are biological artifacts [4]. These artifacts should be recognized and eliminated for proper diagnosis, and more than one variety of artifact can appear in a recorded EEG. Pre-processing is therefore the first step in classification and diagnostics, where the artifacts have to be removed. After pre-processing, the signals are filtered and free from noise, and these filtered signals are used for the feature extraction process in the next step.
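One common noise-reduction step is suppressing narrow-band interference such as mains hum. The sketch below illustrates the idea with a minimal FFT-based notch filter in NumPy; the function name, frequencies, and thresholds are illustrative choices, not part of any clinical pipeline, which would use dedicated filters and artifact-rejection methods.

```python
import numpy as np

def notch_filter_fft(signal, fs, line_freq=50.0, width=1.0):
    """Suppress narrow-band line noise by zeroing FFT bins near line_freq.

    A simple illustration of noise reduction, not a clinical-grade filter.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = np.abs(freqs - line_freq) <= width
    spectrum[mask] = 0.0                      # remove the contaminated bins
    return np.fft.irfft(spectrum, n=len(signal))

# Synthetic example: a 4 Hz rhythm contaminated by 50 Hz mains interference
fs = 256.0
t = np.arange(0, 2, 1 / fs)
clean = np.sin(2 * np.pi * 4 * t)
noisy = clean + 0.8 * np.sin(2 * np.pi * 50 * t)

filtered = notch_filter_fft(noisy, fs)        # recovers the 4 Hz component
```

Because both tones fall exactly on FFT bins in this synthetic example, the filtered signal matches the clean one almost perfectly; on real EEG the attenuation is only approximate.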

### **3.1 Feature extraction/selection and classification**

Feature extraction is the process that converts a huge number of samples into a set of features, while feature selection filters out redundant or irrelevant features. Both methods are used to reduce the dimensionality of the given data.
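As a minimal sketch of feature extraction, the helper below reduces each window of raw samples to four summary features (the function name and the particular features are illustrative; line length is one measure that has been used in seizure-detection work):

```python
import numpy as np

def extract_features(window):
    """Map a window of raw samples to a small feature vector."""
    return np.array([
        window.mean(),                  # DC level
        window.var(),                   # signal power (variance)
        np.abs(np.diff(window)).sum(),  # line length (total variation)
        np.abs(window).max(),           # peak amplitude
    ])

rng = np.random.default_rng(0)
eeg = rng.standard_normal(1024)         # stand-in for one EEG channel
windows = eeg.reshape(4, 256)           # four 1-second windows at 256 Hz
features = np.array([extract_features(w) for w in windows])
print(features.shape)                   # (4, 4): 1024 samples reduced to 16 numbers
```

Here 1024 raw samples become a 4x4 feature matrix; feature selection would then discard whichever of the four columns carries no class information.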

Data are central to building a machine learning model: the performance of the classifier depends on the data it is given, and a classifier cannot by itself separate noise from signal, so noise must be removed beforehand during pre-processing. Analysis of EEG signals is important for diagnosing epilepsy in clinical practice [3]. Fourier transform-based analysis is suitable for stationary signals, but studies have shown that the frequency content of EEG signals changes over time. Several time-frequency domain methods, such as the short-time Fourier transform, the discrete wavelet transform (DWT), and the multiwavelet transform, can therefore be used to decompose EEG signals [5]. Removing artifacts from the signal is especially challenging in biomedical applications, because artifacts can mimic genuine signal components and disturb the epilepsy diagnosis. Artifacts are removed during pre-processing and can be extracted well by a method called independent component analysis (ICA) [6]. In order to reduce the dimension of the raw data and to find an optimal solution, feature extraction with the kernel trick is frequently used [7]. **Figure 2** illustrates the EEG signal classification process.
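In practice the DWT is usually computed with a library such as PyWavelets, but a single level of the Haar wavelet transform is small enough to write out directly, and it shows how a signal splits into slow (approximation) and fast (detail) components:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients; len(x) must be even.
    """
    x = np.asarray(x, dtype=float)
    pairs = x.reshape(-1, 2)
    approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)   # low-pass: slow trends
    detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)   # high-pass: fast changes
    return approx, detail

sig = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_dwt(sig)   # 8 samples -> 4 approximation + 4 detail coefficients
```

The transform is orthonormal, so signal energy is preserved: the squared coefficients of `a` and `d` sum to the energy of `sig`. Deeper decompositions repeat the step on the approximation coefficients.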

In earlier days, reading and interpreting EEG signals was very difficult for a neurophysiologist. This drawback has been overcome by modern computer technology. The EEG is a non-stationary signal and is very difficult for an untrained person to understand. For EEG signal analysis, features are extracted from the EEG vectors and appropriate features are selected for classification. Feature selection can be viewed as a special case of feature extraction: irrelevant and redundant features are eliminated for better performance of the system. Feature selection algorithms can be used to select appropriate features. The genetic algorithm is an effective tool for feature selection; it can reduce the computing time and space required to run the algorithms. Filter methods, Pearson's correlation coefficient, mutual information, wrapper methods, and greedy forward search are some of the methods used to select features for classification. In machine learning, classification is the process of categorizing data by training the machine with class labels; for example, labels such as "Seizure" or "Normal" are used in supervised learning. Clustering, also known as grouping, relies on the inherent structure of unlabeled data and belongs to unsupervised learning. An algorithm that maps data into a particular group is called a classifier.
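The Pearson-correlation filter method mentioned above can be sketched in a few lines: score each feature by the absolute value of its correlation with the class label and rank them. The function name and synthetic data are illustrative; wrapper methods and genetic algorithms instead search over feature subsets rather than scoring features one at a time.

```python
import numpy as np

def rank_features_pearson(X, y):
    """Rank features by |Pearson correlation| with the class label.

    X: (n_samples, n_features); y: binary labels (0 = normal, 1 = seizure).
    """
    scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                       for j in range(X.shape[1])])
    return np.argsort(scores)[::-1], scores     # best feature first

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 50)                       # 50 "normal" + 50 "seizure" windows
informative = y + 0.1 * rng.standard_normal(100)  # tracks the label
noise = rng.standard_normal(100)                  # unrelated to the label
X = np.column_stack([noise, informative])

order, scores = rank_features_pearson(X, y)
print(order[0])   # 1: the informative feature is ranked first
```

Filter methods like this are cheap because they ignore the classifier; their weakness is that they score features independently and can miss features that are useful only in combination.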

**Figure 2.**
*EEG signal classification.*

### **3.2 Warning system in epilepsy**

ECG and EEG data are used in seizure detection. Several mobile applications have been developed to track seizure information from the patient electronically, including the type of seizure, its frequency, and its duration. Such applications provide useful data that help the epileptologist treat epilepsy accurately, and many are already available on the market. **Figure 3** represents the closed-loop warning system for epilepsy.

**Figure 3.**
*Warning system in epilepsy.*

A new high-tech bracelet developed by scientists in the Netherlands can detect 85% of all severe night-time epileptic seizures. Automated seizure detection methods can overcome some of the difficulties that arise in data collection, patient monitoring, and prediction modeling. A closed-loop system monitors seizures and can detect, anticipate, and even respond to real-time information from the patient. Such systems have been used in emergency and intensive care settings of medical diagnosis [8].

**4. Review of EEG signal analyses**

B. Suguna Nanthini [3] carried out six different analyses for detecting seizures from EEG signals under a supervised learning method. In all the analyses, the performance of the system was measured with a confusion matrix. An online EEG database (Bonn University database) and real-time data from the EEG center, Coimbatore, India, were used for the EEG signal classification analyses. EEG tests taken from 10 normal and seizure subjects for epileptic seizure detection are
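The closed-loop warning system described in Section 3.2 can be sketched, under the simplifying assumption of an energy-threshold detector, as a detect-and-alert loop. The function name, window length, and threshold below are illustrative values, not clinical ones; a deployed system would trigger an intervention (caregiver alarm, stimulation) where this sketch merely records an alert time.

```python
import numpy as np

def warning_loop(stream, fs=256, threshold=5.0):
    """Scan an EEG stream window by window and collect alert times.

    Flags a window whenever its mean power exceeds `threshold` times a
    slowly adapting baseline estimated from non-alert windows.
    """
    win = fs                          # 1-second windows
    baseline = None
    alerts = []
    for start in range(0, len(stream) - win + 1, win):
        power = float(np.mean(stream[start:start + win] ** 2))
        if baseline is None:
            baseline = power          # first window initializes the baseline
        elif power > threshold * baseline:
            alerts.append(start / fs)             # alert time in seconds
        else:
            baseline = 0.9 * baseline + 0.1 * power  # slow adaptation
    return alerts

rng = np.random.default_rng(2)
sig = rng.standard_normal(10 * 256)   # 10 s of background activity
sig[5 * 256:6 * 256] *= 10            # simulated high-amplitude event at t = 5 s
print(warning_loop(sig))              # [5.0]
```

Real detectors replace the raw-power test with features and a trained classifier, but the loop structure (monitor, detect, respond) is the same closed-loop idea.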

*Components of Soft Computing for Epileptic Seizure Prediction and Detection DOI: http://dx.doi.org/10.5772/intechopen.83413*

*Epilepsy - Advances in Diagnosis and Therapy*

