
Study of Support Vector Machines

Introduction to support vectors

In machine learning, support vector machines (SVMs, also called support vector networks) are supervised learning models with associated learning algorithms that analyze data for classification and regression analysis.

What are support vectors

Support vectors are the data points that lie closest to the decision surface (or hyperplane).
• They are the data points that are most difficult to classify.
• They have a direct bearing on the optimum location of the decision surface.
• We can show that the optimal hyperplane stems from the function class with the lowest "capacity", i.e., the smallest number of independent features/parameters.

Theoretical concept

SVMs maximize the margin (in Winston's terminology, the "street") around the separating hyperplane.
• The decision function is fully specified by a (usually very small) subset of the training samples: the support vectors.
• Finding this maximum-margin hyperplane becomes a quadratic programming problem that is easy to solve by standard methods. A small sketch follows below.

Separation by Hyperplanes

• Assume linear separability: the two classes can be divided by a hyperplane.
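
To make the margin-maximization idea concrete, here is a minimal sketch using scikit-learn (an assumed library choice; the post does not name one). A linear SVC is fit to a toy two-class dataset, and the support vectors, the points closest to the separating hyperplane, are read off the fitted model. The data and variable names are illustrative, not from the post.

```python
# A minimal sketch of a linear SVM, assuming scikit-learn and NumPy
# are available; the dataset below is illustrative.
import numpy as np
from sklearn.svm import SVC

# Toy 2-D data: two linearly separable clusters (class 0 and class 1).
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=[-2, -2], scale=0.5, size=(20, 2)),  # class 0
    rng.normal(loc=[2, 2], scale=0.5, size=(20, 2)),    # class 1
])
y = np.array([0] * 20 + [1] * 20)

# A linear kernel with a large C approximates the hard-margin SVM:
# minimize (1/2)||w||^2 subject to y_i (w . x_i + b) >= 1,
# which is the quadratic programming problem mentioned above.
clf = SVC(kernel="linear", C=1e6)
clf.fit(X, y)

# The decision function is fully specified by the support vectors.
print("Support vectors per class:", clf.n_support_)
print("Support vectors (points closest to the hyperplane):")
print(clf.support_vectors_)

# The margin width (Winston's "street") is 2 / ||w||.
w = clf.coef_[0]
print("Margin width:", 2 / np.linalg.norm(w))
```

Because only the points on the margin receive nonzero multipliers in the underlying quadratic program, removing any non-support vector from the training set leaves the fitted hyperplane unchanged.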