**2.7 Model building**

The SVM method was used to train per-person EEG models in two ways: with 10-fold cross-validation on the training set, and by training a model on the whole training set and evaluating it on a separate test set.
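This training protocol can be sketched as follows. The paper does not name a software library, so scikit-learn is assumed here; the feature matrix `X` and labels `y` are hypothetical placeholders standing in for the per-person EEG feature vectors.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))       # placeholder for EEG feature vectors
y = rng.integers(0, 2, size=200)     # placeholder binary labels

# 10-fold cross-validation, as in the first evaluation setting
clf = SVC(kernel="linear")
scores = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=10))
print("10-fold CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))

# Train on the whole training set, evaluate on a separate held-out test set
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf.fit(X_train, y_train)
print("held-out test accuracy: %.3f" % clf.score(X_test, y_test))
```

The split ratio and kernel above are illustrative choices, not values reported in the paper.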

SVM, developed by Cortes and Vapnik [24], is a practical implementation of statistical learning theory capable of handling difficult supervised learning problems. SVM is a nonprobabilistic classifier; its two main limitations are that it is inherently linear and binary [25].

During training, the SVM classifier finds a decision boundary (a hyperplane) that separates the feature vectors into two classes. The problem is to find the linear hyperplane with the maximum separation (margin) between the two classes. The margin of a hyperplane is the distance between two parallel hyperplanes equidistant from it on either side, such that the gap between them contains no data objects. The optimization during training finds the hyperplane with the maximum margin. The SVM then uses that hyperplane to predict the class of a new data object when presented with its feature vector. See **Figure 4** [26].
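The maximum-margin idea can be made concrete with a small sketch (again assuming scikit-learn, with synthetic two-dimensional data rather than EEG features): a linear SVM is fit on two separable point clouds, and the learned hyperplane w·x + b = 0 and its margin width 2/||w|| are recovered from the model.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Two synthetic, well-separated classes in 2-D feature space
X = np.vstack([rng.normal(-2.0, 0.5, size=(50, 2)),
               rng.normal(+2.0, 0.5, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="linear", C=1e3).fit(X, y)   # large C approximates a hard margin
w, b = clf.coef_[0], clf.intercept_[0]        # hyperplane: w . x + b = 0
margin = 2.0 / np.linalg.norm(w)              # distance between the two parallel margin hyperplanes
print("hyperplane normal w:", w, "bias b:", b)
print("margin width:", margin)

# The trained hyperplane predicts the class of a new feature vector
print("predicted class:", clf.predict([[1.5, 1.8]])[0])
```

The margin width follows directly from the geometry described above: the two parallel hyperplanes w·x + b = ±1 lie a distance 1/||w|| on either side of the decision boundary, so the gap between them is 2/||w||.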
