Contact

Dushyant Mehta

Research Interests

Computer Vision

Machine Learning and Neural Networks

Body Pose Estimation

Projects and Publications

On Implicit Filter Level Sparsity in Convolutional Neural Networks
D. Mehta, K.I. Kim, C. Theobalt
Computer Vision and Pattern Recognition (CVPR) 2019
Features/filters are implicitly pruned in ReLU (and Leaky ReLU) convolutional neural networks due to a disproportionate influence of L2 or weight decay regularization, particularly with adaptive gradient descent methods. Various hyperparameters affect the extent of pruning/sparsity through direct or indirect interaction with the regularizer. Selective features are particularly susceptible to pruning, which makes implicit sparsification a viable alternative to explicit neural network sparsification approaches, many of which rely on a similar heuristic. These insights are also useful to practitioners when comparing the performance and generalization of adaptive versus non-adaptive gradient descent approaches.
[paper]
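The implicit sparsity described in the abstract can be measured directly from a trained layer's weights. The sketch below (a hypothetical illustration, not code from the paper) computes the fraction of convolutional filters whose L2 norm has collapsed toward zero, which is one simple proxy for filters that weight decay has effectively pruned:

```python
import numpy as np

def filter_sparsity(weights, eps=1e-3):
    """Fraction of conv filters whose L2 norm falls below eps.

    weights: array of shape (out_channels, in_channels, kH, kW).
    Filters whose magnitude is driven to near-zero by weight decay
    contribute essentially nothing after a ReLU, so they are
    effectively pruned even though no explicit pruning step ran.
    """
    # One L2 norm per output filter.
    norms = np.linalg.norm(weights.reshape(weights.shape[0], -1), axis=1)
    return float(np.mean(norms < eps))

# Synthetic example: a 64-filter conv layer in which regularization
# has collapsed 16 filters to near-zero magnitude.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(64, 32, 3, 3))
w[:16] *= 1e-6  # simulate implicitly pruned filters

print(filter_sparsity(w))  # → 0.25
```

The threshold `eps` and the synthetic weights are assumptions for illustration; in practice one would inspect the learned weights (or feature activations) of an actual trained network.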