
You only search once: on lightweight differentiable architecture search for resource-constrained embedded platforms

Published: 23 August 2022

Abstract

Benefiting from its search efficiency, differentiable neural architecture search (NAS) has become the dominant approach for automatically designing competitive deep neural networks (DNNs). In real-world scenarios, DNNs must execute under hard performance constraints, for example, runtime latency bounds on autonomous vehicles. However, to obtain an architecture that meets a given performance constraint, previous hardware-aware differentiable NAS methods must repeat a plethora of search runs, manually tuning hyper-parameters by trial and error, so the total design cost grows proportionally. To resolve this, we introduce a lightweight hardware-aware differentiable NAS framework dubbed LightNAS, which strives to find an architecture satisfying various performance constraints through a one-time search (i.e., you only search once). Extensive experiments demonstrate the superiority of LightNAS over previous state-of-the-art methods. Code will be released at https://github.com/stepbuystep/LightNAS.
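The abstract does not spell out the paper's exact formulation, but the hardware-aware differentiable NAS setup it builds on can be sketched roughly: each searchable layer holds architecture parameters, the layer's expected latency is a softmax-weighted sum of per-operator latencies, and a penalty term pushes the search toward a latency budget. The operator names, latency values, and function names below are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over architecture parameters."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Hypothetical per-operator latencies (ms) for one searchable layer,
# e.g. [skip-connect, conv3x3, conv5x5]; in practice these would come
# from profiling the target embedded platform.
op_latency = np.array([1.0, 2.5, 4.0])

def expected_latency(alpha):
    """Differentiable latency estimate for one layer:
    the softmax-weighted sum of per-operator latencies."""
    return float(softmax(alpha) @ op_latency)

def constrained_loss(task_loss, alpha, target_ms, lam=1.0):
    """Task loss plus a latency penalty that vanishes when the
    expected latency is within the budget. The trade-off weight
    `lam` is the hyper-parameter that earlier hardware-aware NAS
    methods tune by repeated trial-and-error search runs."""
    return task_loss + lam * max(0.0, expected_latency(alpha) - target_ms)
```

With uniform architecture parameters the expected latency is simply the mean operator latency, and the penalty term only activates once that estimate exceeds the target budget.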



Published In

DAC '22: Proceedings of the 59th ACM/IEEE Design Automation Conference
July 2022
1462 pages
ISBN:9781450391429
DOI:10.1145/3489517
Publisher

Association for Computing Machinery

New York, NY, United States


Qualifiers

  • Research-article

Conference

DAC '22: 59th ACM/IEEE Design Automation Conference
July 10 - 14, 2022
San Francisco, California

Acceptance Rates

Overall Acceptance Rate 1,770 of 5,499 submissions, 32%

