In this work, we propose a novel pyramid self-attention (PySA) mechanism which can collect global context information far more efficiently.
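The snippets above describe PySA only at a high level. As a rough illustration of the general idea of pyramid attention (queries attending to keys/values pooled at several scales, which shrinks the attention matrix), here is a minimal NumPy sketch; the pooling scales, identity projections, and function names are illustrative assumptions, not the paper's actual design:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def avg_pool_tokens(x, k):
    # x: (N, C) token matrix; average-pool the token axis
    # with window and stride k (drops any remainder tokens).
    n, c = x.shape
    n_out = n // k
    return x[:n_out * k].reshape(n_out, k, c).mean(axis=1)

def pyramid_self_attention(x, scales=(1, 2, 4)):
    """Toy pyramid self-attention: each query attends to a
    concatenation of keys/values pooled at multiple scales,
    reducing the attention matrix from N x N to N x sum(N/s)."""
    q = x  # identity projections for simplicity (an assumption)
    pooled = [x if s == 1 else avg_pool_tokens(x, s) for s in scales]
    k = np.concatenate(pooled, axis=0)
    v = np.concatenate(pooled, axis=0)
    attn = softmax(q @ k.T / np.sqrt(x.shape[1]), axis=-1)
    return attn @ v
```

With 8 tokens and scales (1, 2, 4), each query attends to 8 + 4 + 2 = 14 pooled keys instead of 8 x 8 full pairs; at image resolution the savings from coarse scales dominate, which is the efficiency argument the snippets allude to.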
May 25, 2018 · A Pyramid Attention Network (PAN) is proposed to exploit the impact of global contextual information in semantic segmentation.
Pyramid Convolutional Attention Network is proposed to efficiently capture long-range dependency and fuse features from different levels for benefiting ...
We adapt an attention module, termed efficient pyramid transformer, to fully exploit context modeling for semantic segmentation. • We introduce a spatial ...
Pyramid Self-attention for Semantic Segmentation · Jiyang Qi, Xinggang Wang, +2 authors, Wenyu Liu · Published in Chinese Conference on Pattern… 2021
To improve the segmentation accuracy of the network for small objects, a feature pyramid module combined with an attention structure is introduced. This ...
This paper proposes a lightweight attention-guided semantic segmentation network (LAGNet) that adopts a joint group convolution pyramid strategy.
This paper firstly introduces the effectiveness of multi-scale context features and attention mechanisms in segmentation tasks. We find that multi-scale and ...