[Research] A research paper from Professor Usaiman's lab is accepted by IJCAI 2023
- College of Software and Engineering
The paper "IMF: Integrating Matched Features Using Attentive Logit in Knowledge Distillation," by Kim Jung-ho (Master's degree, 2023) and Lee Han-bin (Master's degree, 2022) of DASH Laboratory (Advisor: Professor Usaiman), will be published at the International Joint Conference on Artificial Intelligence (IJCAI) in August 2023.
Knowledge distillation (KD) is an effective method for transferring the knowledge of a teacher model to a student model in order to improve the student's performance efficiently. Although generic knowledge distillation methods such as softmax representation distillation and intermediate feature matching have demonstrated improvements on various tasks, student networks achieve only marginal improvements due to their limited model capacity. In this work, to address the student model's limitation, we propose a novel, flexible KD framework: Integrating Matched Features using Attentive Logit in Knowledge Distillation (IMF). Our approach introduces an intermediate feature distiller (IFD) to improve the overall performance of the student model by directly distilling the teacher's knowledge into branches of the student model. The outputs of the IFD, which is trained by the teacher model, are effectively combined via attentive logit. During inference, we use only a few blocks of the student and the trained IFD, requiring an equal or smaller number of parameters. Through extensive experiments, we demonstrate that IMF consistently outperforms other state-of-the-art methods by a large margin across various datasets and tasks without extra computation.
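For readers unfamiliar with logit-based distillation, the generic softmax-representation objective that methods like IMF build on can be sketched as follows. This is a minimal illustration of the classic temperature-scaled KD loss (Hinton et al.), not the IMF implementation itself; the function names and temperature value are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature softens the
    # distribution, exposing the teacher's "dark knowledge" about
    # relative class similarities.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    # KL divergence between the softened teacher and student
    # distributions, scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2
```

The loss is zero when the student's logits already match the teacher's and grows as the softened distributions diverge; in practice it is combined with the ordinary cross-entropy loss on ground-truth labels.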
- [Research] A research paper from Professor Woo Hong-wook's laboratory (CSI Laboratory) is accepted by ICML 2023