Deep neural network (DNN) models are widely used for inference in many application scenarios. DNN accelerators are not designed with security in mind, but for higher performance and lower energy consumption; hence, they suffer from the risk of being attacked. The insecure design flaws of existing DNN accelerators can be exploited to recover the structure of a DNN model from its plain instructions, after which the runtime environment can be controlled to obtain the model's weights. Furthermore, the structure of the DNN model running on the accelerator can be acquired through side-channel information and the interrupt status register. To protect general DNN accelerators from model inversion attacks, this paper proposes a secure and general architecture called NPUFort, which guarantees the confidentiality of DNN model parameters and mitigates side-channel information leakage. The experimental results demonstrate the feasibility and effectiveness of this secure DNN accelerator architecture with negligible performance overhead.

Deep learning accelerators have domain-specific architectures, and this special memory hierarchy and working mode can bring about new, crucial security vulnerabilities. A neural network reuses PE resources layer by layer: after a layer finishes, the accelerator raises an interrupt to inform the host processor to dispatch the next layer. By snooping on these interrupt signal patterns, an attacker can capture specific deep neural network (DNN) models and launch hardware trojan attacks. In this paper, we propose Int-Monitor, a novel neural-network-model-triggered hardware trojan in DNN accelerators. By implanting a well-designed interrupt monitor between the host processor and the DNN accelerator, this backdoor can capture specific DNN models and trigger the trojan to attack the DNN bias buffers. By attacking the global bias buffer, the trojan can prevent the activation of neurons in the DNN model; as a result, the network's forward propagation becomes invalid and the accelerator denies service. Runtime experiments on the LeNet, ResNet, YOLOv2 and YOLOv4-tiny DNN models show that Int-Monitor can successfully attack FPGA-based DNN accelerator SoCs.
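The model-recognition step of such a trojan can be pictured as matching the observed sequence of per-layer "done" interrupts against stored model signatures. The sketch below is purely illustrative and not taken from the paper: the signature format (cycle counts between successive interrupts), the model names, and all numbers are assumptions.

```python
# Hypothetical sketch of interrupt-pattern matching: the monitor
# records, for each layer-done interrupt, the number of cycles since
# the previous interrupt, and compares the resulting sequence against
# stored per-model signatures (illustrative numbers only).
SIGNATURES = {
    "lenet":      [1200, 3400, 900, 450, 210],
    "yolov2-ish": [8000, 8000, 16000, 16000, 32000, 4000],
}

def matches(observed, signature, tolerance=0.1):
    """True if every observed interval is within a relative
    `tolerance` of the corresponding signature interval."""
    if len(observed) != len(signature):
        return False
    return all(abs(o - s) <= tolerance * s
               for o, s in zip(observed, signature))

def identify(observed):
    """Return the name of the first matching model, or None."""
    for name, sig in SIGNATURES.items():
        if matches(observed, sig):
            return name          # this is where the trojan would fire
    return None

trace = [1210, 3350, 910, 440, 205]   # snooped interrupt intervals
print(identify(trace))                # prints: lenet
```

A tolerance band is used rather than exact matching because interrupt timing would jitter with bus contention and memory traffic on a real SoC.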
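Why corrupting the global bias buffer silences the network can be seen with a minimal NumPy sketch: under a ReLU activation, overwriting the biases with large negative values drives every pre-activation below zero, so no neuron fires. The layer sizes and attack value here are assumptions for illustration, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_relu(x, w, b):
    """One fully connected layer with ReLU activation."""
    return np.maximum(0.0, x @ w + b)

x = rng.normal(size=(1, 64))          # input activations
w = rng.normal(size=(64, 32)) * 0.1   # layer weights
b = rng.normal(size=(32,)) * 0.1      # legitimate biases

clean = dense_relu(x, w, b)

# A trojan that overwrites the bias buffer with large negative values
# pushes every pre-activation below zero, so ReLU outputs nothing and
# forward propagation is effectively invalidated.
b_attacked = np.full_like(b, -1e6)
attacked = dense_relu(x, w, b_attacked)

print(clean.max() > 0)       # True: some neurons fire normally
print(attacked.max() == 0)   # True: no neuron activates
```

Because every subsequent layer consumes these zeroed activations, a single corrupted bias buffer is enough to invalidate the whole forward pass, which matches the denial-of-service effect described above.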