
Diverse Knowledge Distillation (DKD): A Solution for Improving the Robustness of Ensemble Models Against Adversarial Attacks

Document pages: 6

Abstract: This paper proposes an ensemble learning model that is resistant to adversarial attacks. To build resilience, we introduce a training process in which each member learns a radically distinct latent space. Member models are added to the ensemble one at a time; as each new member is trained, the loss function is regularized by a reverse knowledge distillation term, forcing it to learn different features and map inputs to a latent space safely distanced from those of the existing members. We assessed the security and performance of the proposed solution on image classification tasks using the CIFAR-10 and MNIST datasets and showed improvements in both security and performance over state-of-the-art defense methods.
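
The abstract does not give the exact form of the loss, but the core idea of penalizing similarity to existing members' latent spaces can be sketched in a few lines. Below is a minimal PyTorch sketch, not the paper's actual formulation: the function name dkd_loss, the use of cosine similarity as the latent-distance measure, and the weight alpha are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def dkd_loss(logits, labels, new_feats, member_feats, alpha=0.5):
    """Hypothetical DKD-style training loss for the newest ensemble member.

    Combines standard cross-entropy with a reverse-distillation penalty
    that rewards latent features far from those of existing members.
    `new_feats` and each tensor in `member_feats` are (batch, dim) latent
    vectors computed on the same inputs; `alpha` weights the diversity term.
    """
    task_loss = F.cross_entropy(logits, labels)
    if not member_feats:
        # First member of the ensemble: no diversity constraint yet.
        return task_loss
    # Reverse of ordinary distillation: penalize *similarity* to each
    # frozen member's latent representation instead of matching it.
    sims = [F.cosine_similarity(new_feats, mf.detach(), dim=1).mean()
            for mf in member_feats]
    diversity_penalty = torch.stack(sims).mean()
    return task_loss + alpha * diversity_penalty
```

In a scheme like this, previously trained members stay frozen (hence the detach()), so each new member is pushed away from their representations while still fitting the labels.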
