Ptolemy: Architecture Support for Robust Deep Learning

23 Aug 2020  ·  Yiming Gan, Yuxian Qiu, Jingwen Leng, Minyi Guo, Yuhao Zhu ·

Deep learning is vulnerable to adversarial attacks, where carefully-crafted input perturbations can mislead a well-trained Deep Neural Network into producing incorrect results. Today's countermeasures to adversarial attacks either cannot detect adversarial samples at inference time, or introduce overhead too high to be practical at inference time. We propose Ptolemy, an algorithm-architecture co-designed system that detects adversarial attacks at inference time with low overhead and high accuracy. We exploit the synergies between DNN inference and imperative program execution: an input to a DNN uniquely activates a set of neurons that contribute significantly to the inference output, analogous to the sequence of basic blocks exercised by an input in a conventional program. Critically, we observe that adversarial samples tend to activate paths distinct from those of benign inputs. Leveraging this insight, we propose an adversarial sample detection framework, which uses canary paths generated from offline profiling to detect adversarial samples at runtime. The Ptolemy compiler, along with the co-designed hardware, enables efficient execution by exploiting the unique algorithmic characteristics. Extensive evaluations show that Ptolemy achieves higher or similar adversarial example detection accuracy than today's mechanisms with much lower runtime overhead (as low as 2%).
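To make the path analogy concrete, the detection idea sketched in the abstract could look roughly like the following: extract the set of "important" neurons activated by an input, then compare that set against class canary paths built offline, flagging inputs whose paths resemble no canary. This is a minimal illustrative sketch, not Ptolemy's actual algorithm; the cumulative-contribution extraction rule, the Jaccard similarity measure, and the thresholds `theta` and `tau` are all assumptions for illustration.

```python
import numpy as np

def extract_path(activations, theta=0.5):
    """Return the indices of the highest-activation neurons whose
    cumulative contribution reaches a fraction theta of the layer's
    total activation (an assumed extraction rule, for illustration)."""
    order = np.argsort(activations)[::-1]          # neurons, largest first
    cum = np.cumsum(activations[order])
    k = int(np.searchsorted(cum, theta * activations.sum())) + 1
    return set(order[:k].tolist())

def detect(path, canary_paths, tau=0.3):
    """Flag an input as adversarial if its activation path is
    dissimilar (Jaccard similarity below tau) to every canary path
    profiled offline for the known classes."""
    def jaccard(a, b):
        return len(a & b) / len(a | b) if (a | b) else 1.0
    best = max(jaccard(path, c) for c in canary_paths)
    return best < tau

# A benign-looking input whose path overlaps a canary passes;
# a path disjoint from all canaries is flagged as adversarial.
path = extract_path(np.array([0.5, 0.3, 0.1, 0.1]))
print(detect(path, canary_paths=[{0, 1}]))
print(detect({2, 3}, canary_paths=[{0, 1}]))
```

In the real system, per-layer path extraction and comparison would be what the co-designed hardware accelerates; this sketch only conveys the control flow of offline canary profiling versus runtime comparison.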


Categories


Hardware Architecture Signal Processing