A Studious Approach to Semi-Supervised Learning

Published in the ICBINB Workshop, NeurIPS 2021

Sahil Khose, Shruti Jain, V Manushree

This paper is an ablation study of distillation in a semi-supervised setting, which not only reduces the number of model parameters but also improves performance over the supervised baseline and generalizes better. We find that the fewer the labels, the more this approach benefits from a smaller student network. This highlights the potential of distillation as an effective way to enhance performance in semi-supervised computer vision tasks while keeping models deployable.
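The paper itself does not include code in this abstract; as a rough illustration of the technique it studies, a standard knowledge-distillation objective (Hinton-style soft targets) combines a temperature-softened cross-entropy against the teacher with the usual hard-label cross-entropy. The temperature `T` and mixing weight `alpha` below are illustrative hyperparameters, not values from the paper:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Numerically stable softmax at temperature T."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of a soft-target term (teacher) and a hard-label term.

    In a semi-supervised setting the hard-label term applies only to the
    labelled subset; the teacher's soft targets supervise unlabelled data too.
    """
    # Soft-target term: cross-entropy between teacher and student
    # distributions at temperature T, scaled by T^2 so gradients keep
    # comparable magnitude as T varies (per Hinton et al., 2015).
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft = -(p_teacher * np.log(p_student + 1e-12)).sum(axis=-1).mean() * T * T

    # Hard-label term: standard cross-entropy against the true labels.
    p = softmax(student_logits)
    labels = np.asarray(labels)
    hard = -np.log(p[np.arange(len(p)), labels] + 1e-12).mean()

    return alpha * soft + (1.0 - alpha) * hard
```

A student whose logits already agree with the teacher and the true label incurs a lower loss than one that disagrees, which is the signal that lets a smaller student match or exceed the supervised baseline.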