In this paper, we present a low-latency sparse-Winograd CNN accelerator (LSW-CNN) for pruned Winograd CNN models. The ReLU-modified algorithm is employed ...
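The excerpt only names the ReLU-modified algorithm; in the sparse-Winograd literature this refers to moving the ReLU into the Winograd domain, so the rectified (and therefore sparse) transformed activations are multiplied element-wise against the pruned transformed weights. Below is a minimal NumPy sketch of one F(2x2, 3x3) tile under that reading, using the standard Winograd transform matrices; the function name and tile handling are illustrative assumptions, not the paper's design.

```python
import numpy as np

# Standard transform matrices for F(2x2, 3x3) Winograd convolution.
B_T = np.array([[1,  0, -1,  0],
                [0,  1,  1,  0],
                [0, -1,  1,  0],
                [0,  1,  0, -1]], dtype=np.float32)
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]], dtype=np.float32)
A_T = np.array([[1, 1,  1,  0],
                [0, 1, -1, -1]], dtype=np.float32)

def winograd_relu_tile(d, g):
    """One F(2x2,3x3) output tile with ReLU moved into the
    Winograd domain: transformed activations are rectified
    before the element-wise product, so their zeros (plus the
    zeros of the pruned transformed kernel) let an accelerator
    skip multiplications entirely. Illustrative sketch."""
    U = np.maximum(B_T @ d @ B_T.T, 0.0)  # 4x4 sparse transformed input
    V = G @ g @ G.T                       # 4x4 transformed (pruned) kernel
    M = U * V                             # only nonzero pairs need MACs
    return A_T @ M @ A_T.T                # 2x2 output tile

# d: one 4x4 input tile, g: one 3x3 kernel.
out = winograd_relu_tile(np.random.randn(4, 4).astype(np.float32),
                         np.random.randn(3, 3).astype(np.float32))
```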
A novel fast mask indexing algorithm for sparse data compression is also proposed ...
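The fast mask indexing scheme itself is not detailed in these excerpts. As a generic sketch of what mask-based sparse compression typically looks like (all names below are assumptions, not the paper's design): nonzero values are stored densely alongside a bitmask, a prefix popcount recovers any element's slot, and ANDing the activation and weight masks yields exactly the multiplications a processing element must perform.

```python
import numpy as np

def mask_compress(vec):
    """Bitmask compression: nonzero values stored densely,
    plus a 0/1 mask recording their original positions.
    (Illustrative format only.)"""
    mask = (vec != 0)
    return mask, vec[mask]

def mask_lookup(mask, vals, i):
    """Random access: a popcount over the mask prefix gives
    element i's slot in the dense value array."""
    if not mask[i]:
        return 0.0
    return vals[int(np.count_nonzero(mask[:i]))]

def paired_indices(mask_a, mask_w):
    """AND of activation and weight masks: the only positions
    where an element-wise product is actually needed."""
    return np.flatnonzero(mask_a & mask_w)
```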
Haonan Wang, Wenjian Liu, Tianyi Xu, Jun Lin, and Zhongfeng Wang, "A Low-latency Sparse-Winograd Accelerator for Convolutional Neural Networks," in Proc. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), May 2019.
The Winols accelerator outperforms a dense accelerator by a factor of 31.7× in inference latency. When compared with prevailing sparse-Winograd accelerators, ...
For a pruned network, the standard Winograd algorithm actually increases the number of multiplies: the loss of sparsity more than offsets the reduced operation count ...
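This loss of sparsity is easy to reproduce: transforming a pruned 3x3 kernel into the F(2x2, 3x3) Winograd domain fills in most of its zeros. A small NumPy check with a made-up kernel:

```python
import numpy as np

G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]], dtype=np.float32)

# A heavily pruned (67% sparse) 3x3 kernel; values are made up.
g = np.array([[0.0, 0.7,  0.0],
              [0.0, 0.0, -1.2],
              [0.4, 0.0,  0.0]], dtype=np.float32)

V = G @ g @ G.T  # the same kernel in the F(2x2,3x3) Winograd domain
print(np.count_nonzero(g), "of 9 spatial weights are nonzero")
print(np.count_nonzero(V), "of 16 transformed weights are nonzero")
```

With this particular kernel, only 3 of the 16 transformed weights remain zero even though 6 of the 9 spatial weights were pruned, which is consistent with the accelerator targeting "pruned Winograd CNN models", i.e., models pruned directly in the transform domain rather than pruned spatially and transformed afterwards.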