PSP-Mal: Evading Malware Detection via Prioritized Experience-based Reinforcement Learning with Shapley Prior
ACSAC '23: Proceedings of the 39th Annual Computer Security Applications Conference, Pages 580–593. https://doi.org/10.1145/3627106.3627178
With the widespread application of machine learning techniques in malware detection, researchers have proposed various adversarial attack methods to generate adversarial examples (AEs) of malware, thereby evading detection. Previous studies have shown ...
- Research article, November 2023
MalPatch: Evading DNN-Based Malware Detection With Adversarial Patches
IEEE Transactions on Information Forensics and Security (TIFS), Volume 19, Pages 1183–1198. https://doi.org/10.1109/TIFS.2023.3333567
Static analysis is a crucial protection layer that enables modern antivirus systems to address the rampant proliferation of malware. These systems are increasingly relying on deep neural networks (DNNs) to automatically extract reliable features and ...
- Research article, April 2023
Towards robust CNN-based malware classifiers using adversarial examples generated based on two saliency similarities
Neural Computing and Applications (NCAA), Volume 35, Issue 23, Pages 17129–17146. https://doi.org/10.1007/s00521-023-08590-1
Targeted malware attacks are usually more purposeful and harmful than untargeted attacks, so malware family classification is important. In classification tasks, convolutional neural networks (CNNs) have shown superior ...
- Research article, April 2023
AMGmal: Adaptive mask-guided adversarial attack against malware detection with minimal perturbation
Computers and Security (CSEC), Volume 127, Issue C. https://doi.org/10.1016/j.cose.2023.103103
Modern cyber security systems increasingly rely on deep neural networks (DNNs) to tackle the surge in security threats. However, it is well known that DNNs are vulnerable to adversarial examples. The state-of-the-art adversarial malware attacks ...
- Research article, November 2022
Adversarial attack via dual-stage network erosion
Computers and Security (CSEC), Volume 122, Issue C. https://doi.org/10.1016/j.cose.2022.102888
Deep neural networks are vulnerable to adversarial examples, which can fool deep models by adding subtle perturbations. Although existing attacks have achieved promising results, there is still a long way to go in generating ...