Felix applies continuous relaxation to the space of programs and creates a differentiable estimator of program latency, allowing efficient search over program candidates using gradient descent, in contrast to conventional approaches that search a non-differentiable objective function over a discrete search space.
Apr 27, 2024
Felix is a gradient-based compiler optimization framework for tensor-based programs. It is designed to find an optimization schedule in a short amount of time.
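To make the idea concrete, here is a minimal, hypothetical sketch of the approach described above; it is not the Felix implementation. A discrete schedule parameter (a power-of-two tile size) is relaxed into a continuous variable in log2 space, a smooth surrogate stands in for measured latency, and plain gradient descent searches the relaxed space before projecting back to a valid discrete choice. The surrogate model and its optimum are invented for illustration only.

```python
# Hypothetical sketch of continuous relaxation + gradient descent over a
# schedule parameter. Felix derives a differentiable latency estimator;
# here we substitute a toy analytic model for illustration.

def surrogate_latency(t):
    # Toy differentiable latency model over t = log2(tile_size):
    # small tiles underuse parallelism, large tiles spill the cache.
    # (Invented quadratic; minimized at t = 4.0.)
    return (t - 4.0) ** 2 + 2.0

def d_surrogate(t):
    # Analytic gradient of the toy model.
    return 2.0 * (t - 4.0)

def optimize(t0=1.0, lr=0.1, steps=200):
    # Plain gradient descent in the relaxed (continuous) space.
    t = t0
    for _ in range(steps):
        t -= lr * d_surrogate(t)
    return t

t_star = optimize()
tile = 2 ** round(t_star)  # project back to the discrete search space
print(tile)                # -> 16
```

The projection step (rounding back to a power of two) is the usual price of continuous relaxation: the continuous optimum need not be feasible, so the result is snapped to the nearest valid configuration.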
Proceedings of the 29th ACM International Conference on Architectural Support for Programming Languages and Operating Systems, Volume 3 (ASPLOS 2024).
Our work “Felix: Optimizing Tensor Programs with Gradient Descent” was accepted by ASPLOS 2024. Now you may just be able to optimize your DNN in minutes, ...