
Distillation, also known as model or knowledge distillation, is a process in which knowledge is transferred from a large, complex AI 'teacher' model to a smaller, more efficient 'student' model.

The result is a much smaller model that retains much of the teacher's quality while significantly reducing compute and memory requirements.
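To make the idea concrete, below is a minimal sketch of a common distillation objective in PyTorch: the student is trained to match the teacher's temperature-softened output distribution while still fitting the ground-truth labels. The function name, the temperature `T`, and the weighting factor `alpha` are illustrative assumptions, not values prescribed by this article.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft-target loss (teacher guidance) with the usual hard-label loss.

    T (temperature) softens both probability distributions so the student can
    learn from the relative probabilities the teacher assigns across classes;
    alpha balances the two terms. Both defaults are illustrative.
    """
    # Soft targets: KL divergence between temperature-scaled distributions.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # scale by T^2 to keep gradient magnitudes comparable

    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    return alpha * soft_loss + (1 - alpha) * hard_loss
```

In a training loop, the teacher runs in inference mode to produce `teacher_logits` for each batch, and only the student's parameters are updated using this combined loss.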

