
Plotting the SVM separating hyperplane, and what SVC.decision_function() does

In the support vector machine chapter of Li Hang's Statistical Learning Methods (《統計學習方法》), there is a worked example:

Given sample points x1 = (3, 3), x2 = (4, 3), x3 = (1, 1) with labels = (1, 1, −1), find the separating hyperplane.
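
For reference, the hard-margin primal problem behind this example (a sketch of the standard formulation; the numbers below are exactly what the code later reproduces):

$$
\min_{w,b}\ \frac{1}{2}\lVert w\rVert^2
\quad \text{s.t.}\quad y_i\,(w \cdot x_i + b) \ge 1,\quad i = 1, 2, 3
$$

With the constraints for x1 and x3 active, solving gives w = (1/2, 1/2) and b = −2, i.e. the separating hyperplane (1/2)x^(1) + (1/2)x^(2) − 2 = 0, with x1 and x3 as the support vectors.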

  • First, what decision_function() does: it computes each sample point's functional distance to the separating hyperplane.
    • That's right, the functional distance (dividing it by ‖w‖ gives the geometric distance; see the book for details).
    • Substituting x1 = (3, 3), x2 = (4, 3), x3 = (1, 1) (labels = (1, 1, −1)) into the decision function decision_function(), which here is the separating hyperplane (1/2)x^(1) + (1/2)x^(2) − 2, gives the functional distances 1, 1.5, and −1. The values 1 and −1 lie exactly on the margin, so x1 and x3 are the support vectors (a quick numerical check follows the code below).
  • The following code computes and plots the separating hyperplane:
"""
=========================================
SVM: Maximum margin separating hyperplane
=========================================

Plot the maximum margin separating hyperplane within a two-class
separable dataset using a Support Vector Machine classifier with
linear kernel.
"""
print(__doc__)

import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm

# the three sample points and labels from the book's example
X = np.array([[3, 3], [4, 3], [1, 1]])
Y = np.array([1, 1, -1])

# fit the model
clf = svm.SVC(kernel='linear')
clf.fit(X, Y)

# get the separating hyperplane: w[0]*x + w[1]*y + intercept = 0
w = clf.coef_[0]
a = -w[0] / w[1]
xx = np.linspace(-5, 5)
yy = a * xx - (clf.intercept_[0]) / w[1]

# plot the parallels to the separating hyperplane that pass through the
# support vectors
b = clf.support_vectors_[0]
yy_down = a * xx + (b[1] - a * b[0])
b = clf.support_vectors_[-1]
yy_up = a * xx + (b[1] - a * b[0])

# plot the line, the points, and the nearest vectors to the plane
plt.plot(xx, yy, 'k-')
plt.plot(xx, yy_down, 'k--')
plt.plot(xx, yy_up, 'k--')

plt.scatter(clf.support_vectors_[:, 0], clf.support_vectors_[:, 1],
            s=80, facecolors='none')
plt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired)

plt.axis('tight')
plt.show()

print(clf.decision_function(X))  # functional distances of the three samples
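
To confirm that decision_function() really returns the functional distance w·x + b (and that dividing by ‖w‖ gives the geometric distance), the values can be recomputed by hand from coef_ and intercept_. A minimal sketch, reusing clf and X from the code above:

import numpy as np

# recompute the decision values by hand from the fitted parameters
w = clf.coef_[0]                # should be close to [0.5, 0.5]
b0 = clf.intercept_[0]          # should be close to -2

functional = X.dot(w) + b0      # functional distances: 1, 1.5, -1
print(functional)
print(np.allclose(functional, clf.decision_function(X)))  # True

geometric = functional / np.linalg.norm(w)  # geometric distances
print(geometric)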
           