Get Your Foes Fooled: Proximal Gradient Split Learning for Defense Against Model Inversion Attacks on IoMT Data

Khowaja, S. A., Lee, I. H., Dev, K., Jarwar, M. A. and Qureshi, N. M. F. (2022). Get Your Foes Fooled: Proximal Gradient Split Learning for Defense Against Model Inversion Attacks on IoMT Data. IEEE Transactions on Network Science and Engineering.

PDF: Accepted Version. All rights reserved.
Official URL: https://ieeexplore.ieee.org/document/9817817
Link to published version: https://doi.org/10.1109/TNSE.2022.3188575

Abstract

The past decade has seen rapid adoption of Artificial Intelligence (AI), and deep learning networks in particular, in the Internet of Medical Things (IoMT) ecosystem. However, it has recently been shown that deep learning networks can be exploited by adversarial attacks that make IoMT vulnerable not only to data theft but also to manipulation of medical diagnoses. Existing studies add noise to the raw IoMT data or to the model parameters, which not only degrades overall performance on medical inference tasks but is also ineffective against methods such as deep leakage from gradients. In this work, we propose the proximal gradient split learning (PGSL) method as a defense against model inversion attacks. The proposed method intentionally attacks the IoMT data during the deep neural network training process on the client side. We propose the use of the proximal gradient method to recover gradient maps and a decision-level fusion strategy to improve recognition performance. Extensive analysis shows that PGSL not only provides an effective defense mechanism against model inversion attacks but also helps improve recognition performance on publicly available datasets. We report 14.0%, 17.9%, and 36.9% gains in accuracy over reconstructed and adversarially attacked images, respectively.
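As a rough illustration of the optimization primitive the abstract names (not the paper's implementation), a proximal gradient method alternates a gradient step on a smooth loss with the proximal operator of a nonsmooth regularizer. The minimal sketch below assumes an L1 regularizer, whose prox is soft-thresholding, and recovers a sparse parameter map from noisy linear measurements; all names (soft_threshold, proximal_gradient_step, A, y) and the toy problem are illustrative assumptions, not taken from the paper.

    import numpy as np

    def soft_threshold(x, tau):
        # Proximal operator of tau * ||x||_1 (soft-thresholding).
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def proximal_gradient_step(w, grad_f, step, lam):
        # One update: gradient descent on the smooth loss f, then the
        # prox of the nonsmooth regularizer lam * ||.||_1.
        return soft_threshold(w - step * grad_f(w), step * lam)

    # Toy recovery problem (illustrative, not the paper's setup):
    # estimate a sparse map w from linear measurements y = A @ w + noise.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(64, 128))
    w_true = np.zeros(128)
    w_true[:8] = 1.0
    y = A @ w_true + 0.01 * rng.normal(size=64)

    grad = lambda w: A.T @ (A @ w - y)      # gradient of 0.5 * ||A @ w - y||^2
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    w = np.zeros(128)
    for _ in range(200):
        w = proximal_gradient_step(w, grad, step, lam=0.05)

After a few hundred iterations, w closely matches the sparse w_true; the same alternation of gradient step and prox is the general pattern behind proximal gradient recovery, whatever loss and regularizer the paper actually uses.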

Item Type: Article
Identification Number: https://doi.org/10.1109/TNSE.2022.3188575
SWORD Depositor: Symplectic Elements
Depositing User: Symplectic Elements
Date Deposited: 07 Sep 2022 11:17
Last Modified: 12 Oct 2023 10:30
URI: https://shura.shu.ac.uk/id/eprint/30679
