
Learning from Small Data Through Sampling an Implicit Conditional Generative Latent Optimization Model

2020-03-31 15:38:45
Idan Azuri, Daphna Weinshall

Abstract

We revisit the long-standing problem of \emph{learning from a small sample}. In recent years, major effort has been invested in generating new samples from a small set of training data points. Some methods apply classical transformations; others synthesize entirely new examples. Our approach belongs to the latter category. We propose a new model based on conditional Generative Latent Optimization (cGLO). Our model learns to synthesize completely new samples for every class simply by interpolating between samples in the latent space. The proposed method samples the learned latent space using spherical interpolation (\emph{slerp}) and generates a new sample with the trained generator. Our empirical results show that the sampled set is diverse enough to improve image classification over the state of the art when training on small samples of CIFAR-100 and CUB-200.
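The spherical interpolation (slerp) mentioned in the abstract can be sketched as follows. This is a generic NumPy implementation of slerp between two latent vectors, not the authors' code; the function name and the fallback to linear interpolation for near-parallel vectors are illustrative choices. In the paper's setting, the interpolated latent code would then be passed through the trained cGLO generator to produce a new sample.

```python
import numpy as np

def slerp(z1, z2, t):
    """Spherical linear interpolation between latent vectors z1 and z2.

    t in [0, 1] moves along the great-circle arc between the two
    (direction-normalized) vectors; t=0 returns z1, t=1 returns z2.
    """
    z1_n = z1 / np.linalg.norm(z1)
    z2_n = z2 / np.linalg.norm(z2)
    # Angle between the two directions; clip guards against rounding error.
    omega = np.arccos(np.clip(np.dot(z1_n, z2_n), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1.0 - t) * z1 + t * z2
    return (np.sin((1.0 - t) * omega) * z1 + np.sin(t * omega) * z2) / np.sin(omega)

# Hypothetical usage: interpolate between two same-class latent codes,
# then decode with a trained generator (not shown here).
z_a = np.array([1.0, 0.0])
z_b = np.array([0.0, 1.0])
z_new = slerp(z_a, z_b, 0.5)  # lies on the unit arc between z_a and z_b
```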

URL

https://arxiv.org/abs/2003.14297

PDF

https://arxiv.org/pdf/2003.14297