UBC Theses and Dissertations

Recklessly approximate sparse coding
Denil, Misha

Abstract

The introduction of the so-called "K-means" or "triangle" features in Coates, Lee, and Ng (2011) caused significant discussion in the deep learning community. These simple, inexpensive features achieve state-of-the-art performance on standard image classification benchmarks, outperforming much more sophisticated methods, including deep belief networks, convolutional nets, factored RBMs, mcRBMs, convolutional RBMs, sparse autoencoders, and several others. Several intuitive arguments have been put forward to explain this remarkable performance, yet no mathematical justification has been offered. In Coates and Ng (2011), the authors improve on the triangle features with "soft threshold" features, adding a hyperparameter to tune performance, and compare these features to sparse coding. Soft thresholding and sparse coding are found to yield similar classification results in many settings, though soft threshold features are much faster to compute. The main result of this thesis is to show that the soft threshold features can be realized as a single step of proximal gradient descent on a non-negative sparse coding objective. This result is important because it provides an explanation for the success of the soft threshold features and shows that even very approximate solutions to the sparse coding problem are sufficient to build effective classifiers.
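The stated equivalence can be illustrated with a minimal NumPy sketch. The dictionary D, patch x, and threshold lam below are hypothetical placeholders, not values from the thesis; the snippet only checks that one proximal gradient step on the non-negative sparse coding objective, started from a zero code with unit step size, reproduces the soft threshold features max(0, D^T x - lam).

import numpy as np

def soft_threshold_features(D, x, lam):
    # Soft threshold features of Coates and Ng (2011): max(0, D^T x - lam).
    return np.maximum(0.0, D.T @ x - lam)

def proximal_gradient_step(D, x, lam, z0, step=1.0):
    # One step of proximal gradient descent on the non-negative sparse
    # coding objective (1/2)||x - D z||^2 + lam ||z||_1 with z >= 0.
    # Gradient of the smooth term at z0 is D^T (D z0 - x); the proximal map
    # of the non-negative l1 penalty is v -> max(0, v - step * lam).
    grad = D.T @ (D @ z0 - x)
    return np.maximum(0.0, z0 - step * grad - step * lam)

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))   # hypothetical: 64-dim patches, 128 atoms
x = rng.standard_normal(64)
lam = 0.5

# Starting from the zero code with unit step size, the gradient at zero is
# -D^T x, so the proximal step collapses to max(0, D^T x - lam).
z0 = np.zeros(D.shape[1])
assert np.allclose(soft_threshold_features(D, x, lam),
                   proximal_gradient_step(D, x, lam, z0))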

Rights

Attribution-NonCommercial-NoDerivatives 4.0 International