*2.1.2.7 Support vector machines (SVMs)*

One of the most popular supervised learning algorithms, the Support Vector Machine (SVM) [17] is used to solve both classification and regression problems.

In practice, however, it is mostly employed for classification. The objective of the SVM algorithm is to establish the best line or decision boundary that divides n-dimensional space into classes, so that new data points can be classified quickly in the future. This optimal decision boundary is called a hyperplane.

SVM selects the extreme points and vectors that aid in the creation of the hyperplane. These extreme instances are called support vectors, and they form the basis of the SVM method. Consider **Figures 12** and **13**, in which two distinct categories are separated by a decision boundary:

In n-dimensional space there may be several lines or decision boundaries that divide the classes, but the optimal boundary for classifying the data points must be identified; this optimal boundary is the SVM hyperplane. The dimensions of the hyperplane depend on the features of the dataset: if there are only two features, as shown in **Figure 12** [17], the hyperplane is a straight line, and if there are three features, the hyperplane is a two-dimensional plane.
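The geometry above can be sketched in a few lines: a hyperplane is the set of points x satisfying w·x + b = 0, and a point is classified by the sign of that expression. The weight vector and offset below are illustrative values chosen by hand, not learned from data.

```python
import numpy as np

# A hyperplane in n-dimensional space is the set of points x with w.x + b = 0.
# With two features it is a straight line. The values of w and b here are
# assumed for illustration, not learned.
w = np.array([1.0, -1.0])   # normal vector of the hyperplane (assumed)
b = 0.0                     # offset (assumed)

def classify(x):
    """Return +1 or -1 depending on which side of the hyperplane x lies."""
    return 1 if np.dot(w, x) + b >= 0 else -1

print(classify(np.array([2.0, 1.0])))   # lies on the +1 side of the line
print(classify(np.array([0.0, 3.0])))   # lies on the -1 side
```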

**Support vectors**: the data points or vectors that lie closest to the hyperplane and influence its position and orientation. The SVM method identifies the ideal decision boundary, or hyperplane, by finding the points of each class that lie closest to the other class; these points are called support vectors. The distance between the hyperplane and the support vectors is called the margin, and the aim of SVM is to maximise this margin: the ideal hyperplane is the one with the largest margin.

**Figure 12.** *Support Vector Machine (SVM).*
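A minimal sketch of these ideas using scikit-learn's `SVC` with a linear kernel on a small hand-made dataset (the data points and the choice of `C` are illustrative): after fitting, the model exposes the support vectors directly, and the margin width can be computed as 2/‖w‖ from the learned weight vector.

```python
import numpy as np
from sklearn.svm import SVC

# Toy, linearly separable data: two well-separated clusters (illustrative).
X = np.array([[1, 2], [2, 3], [3, 3],    # class 0
              [6, 5], [7, 8], [8, 6]])   # class 1
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# The support vectors are the training points closest to the hyperplane.
print(clf.support_vectors_)

# Margin width is 2 / ||w|| for the learned weight vector w.
w = clf.coef_[0]
print(2.0 / np.linalg.norm(w))
```

Only the support vectors determine the boundary; moving any other training point (without crossing the margin) leaves the fitted hyperplane unchanged.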

SVM can be categorised in two different ways:

**Linear SVM**: Linear SVM is used for linearly separable data, meaning that if a dataset can be divided into two classes by a single straight line, such data is called linearly separable data, and the classifier used is called a linear SVM classifier.

**Non-linear SVM**: Non-linear SVM is used for non-linearly separable data, meaning that if a dataset cannot be classified by a straight line, such data is termed non-linear data, and the classifier used is called a non-linear SVM classifier.
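The distinction between the two classifiers can be demonstrated on data no straight line can separate. The sketch below (dataset and `gamma` value are illustrative) fits both a linear and an RBF-kernel SVM to two concentric circles; the kernel implicitly maps the points into a space where they become linearly separable.

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric circles: not separable by any straight line in 2-D.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf", gamma=2.0).fit(X, y)

# The linear classifier struggles; the RBF kernel separates the rings.
print("linear accuracy:", linear.score(X, y))
print("rbf accuracy:   ", rbf.score(X, y))
```

The RBF kernel corresponds to a similarity measure that decays with distance, so points on the inner ring end up grouped together in the implicit feature space.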

Face identification, image classification, text categorisation, image clustering, bioinformatics, handwriting recognition, and similar tasks can all be addressed with the SVM method.
