What is Knowledge Distillation?
Knowledge Distillation is a technique in which a larger, more complex model (the teacher) transfers its knowledge to a smaller, more efficient model (the student). The goal is to compress the teacher's knowledge into a student that is cheaper to run while retaining as much of the teacher's performance as possible. In the classic formulation, the student is trained not only on the ground-truth labels but also on the teacher's softened output probabilities, which carry richer information about how the teacher generalizes.
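To make this concrete, below is a minimal sketch of the common logit-distillation loss (temperature-scaled softmax plus KL divergence, blended with the usual cross-entropy), written in PyTorch. The function name `distillation_loss` and the `temperature` and `alpha` hyperparameters are illustrative choices, not prescribed values.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature: float = 2.0, alpha: float = 0.5):
    """Illustrative distillation objective: blend of soft-target KL and hard-label CE."""
    # Soften both distributions with the temperature so the teacher's
    # relative class probabilities ("dark knowledge") become more visible.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)

    # KL divergence between the softened student and teacher distributions.
    # Scaling by T^2 keeps gradient magnitudes comparable to the hard-label loss.
    kd_loss = F.kl_div(soft_student, soft_targets, reduction="batchmean") * (temperature ** 2)

    # Standard cross-entropy on the ground-truth labels.
    ce_loss = F.cross_entropy(student_logits, labels)

    # Weighted combination of the two objectives.
    return alpha * kd_loss + (1.0 - alpha) * ce_loss


# Example usage with random tensors standing in for real model outputs.
if __name__ == "__main__":
    batch_size, num_classes = 4, 10
    student_logits = torch.randn(batch_size, num_classes)
    teacher_logits = torch.randn(batch_size, num_classes)
    labels = torch.randint(0, num_classes, (batch_size,))
    print(distillation_loss(student_logits, teacher_logits, labels))
```

In practice, the teacher's logits are computed in a `torch.no_grad()` block during the student's training loop, and the temperature and blending weight are tuned per task.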