Specifically, to accelerate convergence, we propose a first poisoning attack algorithm that employs the momentum algorithm. We also propose a second poisoning attack algorithm that uses the Adam algorithm, which can escape some local optima while simultaneously converging faster.
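The momentum-accelerated poisoning update described above can be sketched as follows. This is a minimal illustration, not the paper's exact method: `grad_fn` (the gradient of the attacker's objective with respect to a fake data point) and all hyperparameters are hypothetical stand-ins.

```python
import numpy as np

def momentum_poison_step(x, grad_fn, velocity, lr=0.05, beta=0.9):
    """One momentum-accelerated ascent step on a poisoning point.

    x: current fake data point; grad_fn: gradient of the attacker's
    objective w.r.t. x (hypothetical -- the exact objective is not
    given in this excerpt).
    """
    g = grad_fn(x)
    velocity = beta * velocity + g   # accumulate gradient history
    x = x + lr * velocity            # ascend the attack objective
    return x, velocity

# Toy demo: maximize -(x - 3)^2; its gradient is -2(x - 3).
grad = lambda x: -2.0 * (x - 3.0)
x, v = 0.0, 0.0
for _ in range(200):
    x, v = momentum_poison_step(x, grad, v)
print(round(x, 2))  # converges to the maximizer, 3.0
```

In a real attack, `x` would be a fake user profile or training example and `grad_fn` would differentiate the surrogate model's loss through the (re)training procedure; the momentum term only changes how that gradient is applied.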
Oct 8, 2024 · Current attack methods involve iteratively retraining a surrogate recommender on the poisoned data with the latest fake users to optimize the ...
This work proposes a novel poisoning algorithm based on the idea of back-gradient optimization, able to target a wider class of learning algorithms, ...
Sep 15, 2019 · Momentum, or SGD with momentum, is a method that helps accelerate gradient vectors in the right directions, thus leading to faster convergence.
From adversarial examples to data poisoning instances: utilizing an adversarial attack method to poison a transfer learning model · Computer Science. ICC 2022 - ...
Jul 3, 2024 · NAI-FGM (Chen et al., 2022) is a gradient-based attack algorithm, which applies Nesterov momentum and Adam to iterative attacks to improve its ...
[PDF] Accelerated Federated Learning with Decoupled Adaptive Optimization
Tesseract: Gradient flip score to secure federated learning against model poisoning attacks. ... The idea is to treat P as the momentum of the global Adam ...
We present batch-order poisoning (BOP) and batch-order backdooring (BOB) – the first poison and backdoor strategies that do not rely on adding adversarial ...
May 22, 2018 · ... by machine learning algorithms. In this paper, we perform the first systematic study of poisoning attacks and their countermeasures for ...
Oct 10, 2022 · These adaptive algorithms, e.g., Adam and AdamW, often offer faster convergence speed than SGD across many DNN frameworks. Previous research has ...
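For reference, the Adam update rule those results refer to can be sketched as below. This is a generic implementation of Kingma & Ba's update; the toy objective and hyperparameter values are illustrative only.

```python
import numpy as np

def adam_step(x, g, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam step: adapt the step size per coordinate using running
    first- and second-moment estimates of the gradient."""
    m = b1 * m + (1 - b1) * g        # biased first-moment estimate
    v = b2 * v + (1 - b2) * g * g    # biased second-moment estimate
    m_hat = m / (1 - b1 ** t)        # bias-corrected moments
    v_hat = v / (1 - b2 ** t)
    return x - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy demo: minimize (x - 1)^2 starting from x = 5.
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 5001):
    g = 2.0 * (x - 1.0)
    x, m, v = adam_step(x, g, m, v, t)
print(round(x, 1))
```

The per-coordinate normalization by `sqrt(v_hat)` is what gives Adam its faster practical convergence relative to plain SGD in the comparisons cited above.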