Plotting the SVM separating hyperplane, and what SVC.decision_function() does

In the support vector machine chapter of Li Hang's Statistical Learning Methods (《统计学习方法》) there is a worked example:

Given the sample points x1=(3,3), x2=(4,3), x3=(1,1) with labels=(1,1,−1), find the separating hyperplane.
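For reference, this is the standard hard-margin primal problem from the book, written out for the three points above; the solution is easy to verify by hand and should agree with what the code further down learns:

% hard-margin SVM primal for x1=(3,3), x2=(4,3), x3=(1,1), labels (1, 1, -1)
\begin{aligned}
\min_{w,b}\ & \tfrac{1}{2}\lVert w\rVert^{2} \\
\text{s.t.}\ & 3w_1 + 3w_2 + b \ge 1 \\
             & 4w_1 + 3w_2 + b \ge 1 \\
             & -(w_1 + w_2 + b) \ge 1
\end{aligned}
% The first and third constraints are active (those points sit on the margin),
% which gives w = (1/2, 1/2)^T and b = -2, i.e. the separating hyperplane
% (1/2)x1 + (1/2)x2 - 2 = 0.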

  • First, what decision_function() does: it computes the functional distance from each sample point to the separating hyperplane.
    • That's right, the functional distance, not the geometric one (divide it by ||w|| to get the geometric distance; see the book for details).
    • Substituting x1=(3,3), x2=(4,3), x3=(1,1), labels=(1,1,−1) into the decision function, i.e. the separating hyperplane (1/2)x1 + (1/2)x2 − 2, gives the functional distances 1, 1.5 and −1 (for example, x1: (1/2)·3 + (1/2)·3 − 2 = 1). Since 1 and −1 lie exactly on the margin, x1 and x3 are the support vectors; a small numerical check follows the plotting code below.
  • The following code computes and plots the separating hyperplane:
"""
=========================================
SVM: Maximum margin separating hyperplane
=========================================

Plot the maximum margin separating hyperplane within a two-class
separable dataset using a Support Vector Machine classifier with
linear kernel.
"""
print(__doc__)

import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm

# the three sample points from the textbook example
X = np.array([[3, 3], [4, 3], [1, 1]])
Y = np.array([1, 1, -1])

# fit the model
clf = svm.SVC(kernel='linear')
clf.fit(X, Y)

# get the separating hyperplane
w = clf.coef_[0]
a = -w[0] / w[1]
xx = np.linspace(-5, 5)
yy = a * xx - (clf.intercept_[0]) / w[1]

# plot the parallels to the separating hyperplane that pass through the
# support vectors
b = clf.support_vectors_[0]
yy_down = a * xx + (b[1] - a * b[0])
b = clf.support_vectors_[-1]
yy_up = a * xx + (b[1] - a * b[0])

# plot the line, the points, and the nearest vectors to the plane
plt.plot(xx, yy, 'k-')
plt.plot(xx, yy_down, 'k--')
plt.plot(xx, yy_up, 'k--')

plt.scatter(clf.support_vectors_[:, 0], clf.support_vectors_[:, 1],
            s=80, facecolors='none')
plt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired)

plt.axis('tight')
plt.show()

print(clf.decision_function(X))
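To tie the two points together, here is a minimal verification sketch (it assumes clf and X from the script above are still in scope): decision_function() should equal X·w + b with w = clf.coef_[0] and b = clf.intercept_[0], and the fitted coefficients should come out close to the textbook answer (1/2, 1/2) and −2.

# minimal check, assuming clf and X from the script above
w = clf.coef_[0]            # expected to be approximately [0.5, 0.5]
b = clf.intercept_[0]       # expected to be approximately -2.0
print(w, b)

manual = X.dot(w) + b       # functional distance computed by hand: w.x + b
print(manual)               # expected: [ 1.   1.5 -1. ]
print(np.allclose(manual, clf.decision_function(X)))  # should print True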