In this example we will use Optunity to optimize hyperparameters for a
support vector machine classifier (SVC) in scikit-learn. We will
construct a simple toy data set to illustrate that the response surface
can be highly irregular even for simple learning tasks (i.e., the
response surface is non-smooth, non-convex, and has many local minima).

    import optunity
    import optunity.metrics

    # comment this line if you are running the notebook
    import sklearn.svm
    import numpy as np
    import math
    import pandas

    %matplotlib inline
    from matplotlib import pylab as plt
    from mpl_toolkits.mplot3d import Axes3D

In order to use Optunity to optimize hyperparameters, we start by
defining the objective function. We will use 5-fold cross-validated area
under the ROC curve. We will regenerate folds for every iteration, which
helps to minimize systematic bias originating from the fold
partitioning.

We start by defining the objective function svm_rbf_tuned_auroc(),
which accepts \(C\) and \(\gamma\) as arguments.

The plot above shows that the particles converge directly towards the
optimum. At this granularity, the response surface appears smooth.

However, a more detailed analysis reveals this is not the case, as shown
subsequently:

- showing the sub trace with score up to 90% of the optimum
- showing the sub trace with score up to 95% of the optimum
- showing the sub trace with score up to 99% of the optimum
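One plausible way to extract such sub traces is to keep only the evaluations whose score reaches a given fraction of the best observed value. The sketch below assumes the solver trace is available as a pandas DataFrame with `C`, `logGamma`, and `value` columns (here filled with synthetic numbers purely for illustration).

```python
import numpy as np
import pandas as pd

# hypothetical solver trace: one row per evaluation; in the real example
# this would be built from the optimizer's call log, not random numbers
rng = np.random.default_rng(0)
df = pd.DataFrame({'C': rng.uniform(0.1, 10, 200),
                   'logGamma': rng.uniform(-5, 0, 200),
                   'value': rng.uniform(0.5, 1.0, 200)})

def sub_trace(df, fraction):
    """Evaluations scoring at least `fraction` of the best observed value."""
    return df[df['value'] >= fraction * df['value'].max()]

# the tighter the cutoff, the fewer evaluations survive
for frac in (0.90, 0.95, 0.99):
    subset = sub_trace(df, frac)
```

Plotting each subset over the \((C, \log\gamma)\) plane is what reveals the scattered, multimodal structure described next.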

Clearly, this response surface is filled with local minima. This is a
general observation in automated hyperparameter optimization, and is one
of the key reasons we need robust solvers. If there were no local
minima, a simple gradient-descent-like solver would do the trick.
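The failure mode of gradient-like methods on such surfaces is easy to demonstrate on a small stand-in problem. The 1-D function below is an illustrative multimodal objective (not the SVM response surface itself): plain gradient descent ends up in a different local minimum depending on where it starts.

```python
import math

# an illustrative multimodal objective: a quadratic bowl with a
# superimposed oscillation, giving many local minima
def f(x):
    return x * x + 3.0 * math.sin(5.0 * x) + 3.0

def grad(x, h=1e-6):
    # central-difference numerical derivative of f
    return (f(x + h) - f(x - h)) / (2.0 * h)

def gradient_descent(x0, lr=0.01, steps=500):
    # plain fixed-step gradient descent from starting point x0
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# starting points in different basins converge to different local minima,
# so the final result depends entirely on initialization
ends = [gradient_descent(x0) for x0 in (-2.5, -0.5, 2.5)]
```

Each endpoint is a stationary point (the gradient vanishes there), yet the endpoints differ substantially, which is exactly why robust, global solvers are needed on multimodal response surfaces.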