Perstando et Praestando

A Fake Convolutional Neural Network

This is an early version of my CNN. At the time, I incorrectly thought I could simply use randomly chosen Gabor filters for the convolution, so I wrote this. The test results are actually not bad on simple datasets such as MNIST. I consider it a fake CNN, but still a decent deep network: it convolves the images with randomly chosen Gabor filters, applies pooling, and then trains a regular deep network on the result. The convolution and pooling stages can be seen as a form of pre-processing.
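As a rough illustration of that pre-processing idea (not the code in this repo), the sketch below builds a small bank of randomly oriented Gabor filters, convolves an image with each one, max-pools the responses, and flattens them into a feature vector that a regular network could then be trained on. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def gabor_kernel(size, theta, sigma=2.0, lambd=4.0, gamma=0.5):
    """Real part of a Gabor filter with orientation theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * xr / lambd))

def conv2d_valid(image, kernel):
    """Plain 'valid' 2-D correlation; the real Gabor kernel above is
    symmetric under 180-degree rotation, so this equals convolution."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, k=2):
    """Non-overlapping k x k max pooling (edges that don't fit are dropped)."""
    h, w = x.shape
    h, w = h - h % k, w - w % k
    return x[:h, :w].reshape(h // k, k, w // k, k).max(axis=(1, 3))

def gabor_features(image, n_filters=4, size=5, rng=None):
    """Pooled responses to randomly oriented Gabor filters, flattened
    into one feature vector for a downstream fully connected network."""
    rng = np.random.default_rng(rng)
    thetas = rng.uniform(0.0, np.pi, n_filters)  # random orientations
    maps = [max_pool(conv2d_valid(image, gabor_kernel(size, t)))
            for t in thetas]
    return np.concatenate([m.ravel() for m in maps])
```

For a 28x28 MNIST digit with four 5x5 filters, each response map is 24x24 before pooling and 12x12 after, giving a 4 x 144 = 576-dimensional feature vector. Since the filters are fixed, only the downstream network has trainable weights, which is what makes this "fake" as a CNN.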

Thank you for your great work! It is very impressive.
But I was wondering why your accuracy falls short of the 99% reported by Yann LeCun. Is it just because you did not think it was necessary to keep fine-tuning your code? Or is your algorithm essentially different? (If so, there must be some benefit, e.g. lower cost or shorter runtime.)