Mar 5, 2020 · The idea of this paper is to reduce the confusion between categories so as to extract discriminative features and enlarge the inter-class variance. To this end, the paper proposes a loss function termed Dynamic Attention Loss (DAL), which introduces a confusion rate-weighted soft label (target) as the controller of similarity.
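As a rough illustration, here is a minimal PyTorch-style sketch of a confusion rate-weighted soft-label loss. The way the confusion matrix is estimated, the mixing weight `alpha`, and the helper name `dal_loss` are assumptions made for this sketch, not the paper's exact DAL formulation.

```python
import torch
import torch.nn.functional as F

def dal_loss(logits, targets, confusion, alpha=0.1):
    """Confusion rate-weighted soft-label loss (illustrative sketch).

    logits:    (N, C) classifier outputs.
    targets:   (N,) integer class labels.
    confusion: (C, C) confusion rates estimated on a held-out set
               (row i = how often class i is predicted as each class).
    alpha:     fraction of target mass moved onto confused classes.
    """
    num_classes = logits.size(1)
    one_hot = F.one_hot(targets, num_classes).float()

    # Move alpha of the target mass onto the classes the true class is
    # most often confused with; keep the remaining mass on the true class.
    conf_rows = confusion[targets].clone()
    conf_rows.scatter_(1, targets.unsqueeze(1), 0.0)   # drop the diagonal
    conf_rows = conf_rows / conf_rows.sum(dim=1, keepdim=True).clamp(min=1e-8)
    soft_target = (1.0 - alpha) * one_hot + alpha * conf_rows

    log_probs = F.log_softmax(logits, dim=1)
    return -(soft_target * log_probs).sum(dim=1).mean()
```

In such a scheme, `confusion` could be re-estimated after each epoch so the soft targets track the model's current confusions, which is presumably the "dynamic" aspect suggested by the name; that schedule is likewise an assumption here.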
Cao et al. (2019) introduce an attention loss that supervises the network with ground-truth attentions for small-sample image classification tasks.
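One common way to realize such a term, assuming the ground-truth attentions are given as spatial masks, is to penalize the distance between predicted and ground-truth attention maps. The mean-squared-error form below is an assumed choice for illustration, not necessarily the formulation of Cao et al. (2019).

```python
import torch
import torch.nn.functional as F

def attention_loss(pred_attn, gt_attn):
    """Penalize deviation of predicted attention maps from ground-truth
    attention masks, both shaped (N, H, W).

    Maps are normalized to sum to 1 per image so the penalty compares
    where attention is placed rather than its absolute scale.
    """
    pred = pred_attn.flatten(1)
    gt = gt_attn.flatten(1)
    pred = pred / pred.sum(dim=1, keepdim=True).clamp(min=1e-8)
    gt = gt / gt.sum(dim=1, keepdim=True).clamp(min=1e-8)
    return F.mse_loss(pred, gt)

# Assumed total objective: cross-entropy plus the attention term,
# weighted by a hyperparameter lam.
# loss = F.cross_entropy(logits, labels) + lam * attention_loss(attn, gt_mask)
```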
To solve the small-sample classification problem, a deep contrastive learning network (DCLN) method is proposed in this paper.
This article proposes a new loss function called dynamic-recall focal loss (DRFL), which can address the problem of imbalanced data categories in image classification (see the sketch below).
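The snippet does not give the exact DRFL formula; a plausible reading is a focal loss whose class weights are driven by per-class recall, so that poorly recalled classes are emphasized. The sketch below assumes that reading, and the names `recall_focal_loss` and `recall` are illustrative.

```python
import torch
import torch.nn.functional as F

def recall_focal_loss(logits, targets, recall, gamma=2.0):
    """Focal loss with a per-class weight driven by running recall
    (illustrative sketch: classes with low recall get larger weights).

    logits:  (N, C) classifier outputs.
    targets: (N,) integer labels.
    recall:  (C,) per-class recall estimated on recent batches or epochs.
    gamma:   standard focal-loss focusing parameter.
    """
    log_probs = F.log_softmax(logits, dim=1)
    probs = log_probs.exp()
    pt = probs.gather(1, targets.unsqueeze(1)).squeeze(1)       # p of true class
    log_pt = log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)

    # Dynamic class weight: down-weight classes the model already recalls well.
    class_weight = (1.0 - recall).clamp(min=1e-3)[targets]

    return (-class_weight * (1.0 - pt) ** gamma * log_pt).mean()
```

Recomputing the `recall` vector periodically during training would make the weighting adapt as the model improves, which matches the "dynamic" part of the name; the update schedule is an assumption.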
Jun 14, 2023 · This paper proposes an incremental learning method for small-sample malicious traffic classification. The method uses a pruning strategy to find the ...
For small images (32×32), Attention-56 works very well, achieving 3.27% training error and 5.28% validation error in 337 minutes of training.
This attention-driven approach enhances the model's ability to learn from imbalanced data, leading to improved classification accuracy for all classes.
The proposed method uses a neural network as the classifier, which can dynamically classify different samples and better learn the similarity between features ...