k-Fold Cross-Validating Neural Networks

20 Dec 2017

When we have a small dataset, k-fold cross-validation lets us make the most of it when evaluating a neural network’s performance. This is possible in Keras because we can “wrap” any neural network so that it can use the evaluation features available in scikit-learn, including k-fold cross-validation. To accomplish this, we first create a function that returns a compiled neural network. Next we use KerasClassifier (for a classifier; for a regressor we would use KerasRegressor) to wrap the model so it can be used by scikit-learn. After this, we can use our neural network like any other scikit-learn learning algorithm (e.g. random forests, logistic regression). In our solution, we used cross_val_score to run a 3-fold cross-validation on our neural network.

Create Feature And Target Data

# Load library
from sklearn.datasets import make_classification

# Number of features
number_of_features = 100

# Generate features matrix and target vector
features, target = make_classification(n_samples=10000,
                                       n_features=number_of_features,
                                       n_informative=3,
                                       n_redundant=0,
                                       n_classes=2,
                                       weights=[.5, .5],
                                       random_state=0)
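The remaining steps described above can be sketched as follows. The network architecture (two hidden layers of 16 units), the number of epochs, and the batch size are illustrative choices, not prescriptions; the data generation is repeated so the example runs on its own. Note that in recent TensorFlow/Keras releases the scikit-learn wrappers have moved out of Keras into the separate SciKeras package (`from scikeras.wrappers import KerasClassifier`, with `model=` in place of `build_fn=`); the import below matches older standalone Keras.

```python
# Load libraries
from keras import models, layers
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

# Generate features matrix and target vector (same as above)
features, target = make_classification(n_samples=10000,
                                       n_features=100,
                                       n_informative=3,
                                       n_redundant=0,
                                       n_classes=2,
                                       weights=[.5, .5],
                                       random_state=0)

# Function returning a compiled network, as required by the wrapper
def create_network():
    network = models.Sequential()
    network.add(layers.Dense(units=16, activation='relu', input_shape=(100,)))
    network.add(layers.Dense(units=16, activation='relu'))
    network.add(layers.Dense(units=1, activation='sigmoid'))
    network.compile(loss='binary_crossentropy',  # binary classification loss
                    optimizer='rmsprop',
                    metrics=['accuracy'])
    return network

# Wrap the Keras model so scikit-learn can treat it as an estimator
neural_network = KerasClassifier(build_fn=create_network,
                                 epochs=10,
                                 batch_size=100,
                                 verbose=0)

# Evaluate with 3-fold cross-validation; returns one accuracy per fold
scores = cross_val_score(neural_network, features, target, cv=3)
```

After this runs, `scores` holds three accuracy values, one per fold; their mean is a reasonable single-number summary of the network’s performance.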