Knowledge distillation (KD) has recently emerged as a powerful strategy to transfer knowledge from a pre-trained teacher model to a lightweight student, and has demonstrated unprecedented success over a wide spectrum of applications. In spite of the encouraging results, the KD process \emph{per se} poses a potential threat to network ownership protection, since the knowledge contained in a network can be effortlessly distilled and hence expos...