Plot decision surface of multinomial and One-vs-Rest Logistic Regression.
The hyperplanes corresponding to the three One-vs-Rest (OVR) classifiers
are represented by the dashed lines.

Out:

training score : 0.995 (multinomial)
training score : 0.976 (ovr)

print(__doc__)

# Authors: Tom Dupre la Tour <tom.dupre-la-tour@m4x.org>
# License: BSD 3 clause

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression

# make 3-class dataset for classification
centers = [[-5, 0], [0, 1.5], [5, -1]]
X, y = make_blobs(n_samples=1000, centers=centers, random_state=40)

transformation = [[0.4, 0.2], [-0.4, 1.2]]
X = np.dot(X, transformation)

for multi_class in ('multinomial', 'ovr'):
    clf = LogisticRegression(solver='sag', max_iter=100, random_state=42,
                             multi_class=multi_class).fit(X, y)

    # print the training scores
    print("training score : %.3f (%s)" % (clf.score(X, y), multi_class))

    # create a mesh to plot in
    h = .02  # step size in the mesh
    x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                         np.arange(y_min, y_max, h))

    # Plot the decision boundary. For that, we will assign a color to each
    # point in the mesh [x_min, x_max]x[y_min, y_max].
    Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])
    # Put the result into a color plot
    Z = Z.reshape(xx.shape)
    plt.figure()
    plt.contourf(xx, yy, Z, cmap=plt.cm.Paired)
    plt.title("Decision surface of LogisticRegression (%s)" % multi_class)
    plt.axis('tight')

    # Plot also the training points
    colors = "bry"
    for i, color in zip(clf.classes_, colors):
        idx = np.where(y == i)
        plt.scatter(X[idx, 0], X[idx, 1], c=color, cmap=plt.cm.Paired,
                    edgecolor='black', s=20)

    # Plot the three one-against-all classifiers
    xmin, xmax = plt.xlim()
    ymin, ymax = plt.ylim()
    coef = clf.coef_
    intercept = clf.intercept_

    def plot_hyperplane(c, color):
        def line(x0):
            return (-(x0 * coef[c, 0]) - intercept[c]) / coef[c, 1]

        plt.plot([xmin, xmax], [line(xmin), line(xmax)],
                 ls="--", color=color)

    for i, color in zip(clf.classes_, colors):
        plot_hyperplane(i, color)

plt.show()
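The dashed lines above come from solving ``coef_[c] . x + intercept_[c] == 0``
for the second coordinate, i.e. ``x1 = -(w0 * x0 + b) / w1``. This can be
checked numerically: a point constructed on that line should give a (near) zero
``decision_function`` value for class ``c``. The sketch below rebuilds the same
dataset as the example; note it fits a plain ``LogisticRegression`` (recent
scikit-learn versions have deprecated the ``multi_class`` parameter, and
multinomial is the default), so it illustrates the geometry rather than
reproducing the example's OVR variant exactly.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression

# Rebuild the same 3-class dataset as in the example above.
centers = [[-5, 0], [0, 1.5], [5, -1]]
X, y = make_blobs(n_samples=1000, centers=centers, random_state=40)
X = np.dot(X, [[0.4, 0.2], [-0.4, 1.2]])

clf = LogisticRegression().fit(X, y)

# For class c, the plotted hyperplane is the set of points where
# coef_[c] . x + intercept_[c] == 0, i.e. x1 = -(w0 * x0 + b) / w1.
c = 0
w, b = clf.coef_[c], clf.intercept_[c]
x0 = 3.0
x1 = -(w[0] * x0 + b) / w[1]
point = np.array([[x0, x1]])

# The per-class decision function should vanish on that hyperplane.
score = clf.decision_function(point)[0, c]
print(abs(score) < 1e-8)
```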