Stochastic Subset Selection for Efficient Training and Inference of Neural Networks

Bruno Andreis, A. Tuan Nguyen, Seanie Lee, Juho Lee, Eunho Yang, Sung Ju Hwang

Current machine learning algorithms are designed to work with huge volumes of high-dimensional data such as images. However, these algorithms are increasingly deployed to resource-constrained systems such as mobile devices and embedded systems. Even when large computing infrastructure is available, the size of individual data instances, as well as of entire datasets, can be a bottleneck for data transfer across communication channels. Moreover, there is a strong incentive, in terms of both energy and monetary cost, to reduce the computational and memory requirements of these algorithms. For nonparametric models that need to leverage the stored training data at inference time, the increased cost in memory and computation can be even more problematic. In this work, we aim to reduce the volume of data these algorithms must process through an end-to-end two-stage neural subset selection model. We first efficiently obtain a subset of candidate elements by sampling a mask from a conditionally independent Bernoulli distribution, and then autoregressively construct a subset of the most task-relevant elements by sampling from a conditional Categorical distribution. We validate our method on set reconstruction and classification tasks with feature selection, as well as on the selection of representative samples from a given dataset, and show that it outperforms relevant baselines. Our experiments also show that our method enhances the scalability of nonparametric models such as Neural Processes.
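The abstract describes a two-stage sampling procedure: an independent Bernoulli mask to prune the input set down to candidates, followed by autoregressive Categorical sampling to pick the most task-relevant elements. The sketch below illustrates that structure under several assumptions; the network architectures, dimensions, and the use of a running mean of previously selected elements as the autoregressive context are illustrative choices, not the authors' implementation, and gradient estimation for the discrete samples (e.g., via continuous relaxations) is omitted.

```python
# Minimal sketch of a two-stage stochastic subset selector (illustrative only).
import torch
import torch.nn as nn


class TwoStageSubsetSelector(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        # Stage 1: per-element logits for a conditionally independent Bernoulli mask.
        self.candidate_net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))
        # Stage 2: scores each candidate given a summary of elements chosen so far.
        self.select_net = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x, k):
        # x: (n, dim) set of elements; k: size of the final subset.
        # Stage 1: sample a candidate mask from independent Bernoullis.
        probs = torch.sigmoid(self.candidate_net(x)).squeeze(-1)   # (n,)
        mask = torch.bernoulli(probs).bool()                       # (n,)
        candidates = x[mask]                                       # (m, dim)

        # Stage 2: autoregressively draw k elements without replacement,
        # re-scoring the remaining candidates after each selection.
        chosen = []
        available = torch.ones(candidates.size(0), dtype=torch.bool)
        ctx = torch.zeros(1, candidates.size(-1))                  # summary of chosen elements
        for _ in range(min(k, candidates.size(0))):
            inp = torch.cat([candidates, ctx.expand_as(candidates)], dim=-1)
            scores = self.select_net(inp).squeeze(-1)              # (m,)
            scores = scores.masked_fill(~available, float('-inf'))
            idx = torch.distributions.Categorical(logits=scores).sample()
            chosen.append(candidates[idx])
            available[idx] = False
            ctx = torch.stack(chosen).mean(0, keepdim=True)
        return torch.stack(chosen) if chosen else candidates
```

A usage sketch: `TwoStageSubsetSelector(dim=32)(torch.randn(100, 32), k=10)` returns at most 10 selected rows. In practice both stages would be trained end-to-end on the downstream task, which requires a differentiable treatment of the Bernoulli and Categorical samples that this sketch does not show.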
