ICYMI: ZDNET highlights how we're tackling the challenge of running AI models on energy-constrained devices. With an approach that combines analog and digital computing, our chip has massive potential for dramatically improving the energy efficiency of generative AI. Learn how we're paving the way for AI deployment at scale in resource-constrained environments. https://lnkd.in/eDKvUxYW #AIChips #AI #GenAI
EnCharge AI
Embedded Software Products
Santa Clara, California 2,129 followers
Where the future of AI compute is being defined and built, to unlock new levels of machine intelligence.
About us
EnCharge AI is a leader in advanced hardware and software systems for AI computing. EnCharge's robust and scalable next-generation in-memory computing technology provides orders-of-magnitude higher compute efficiency and density than today's best-in-class solutions, at a fraction of the cost. The high-performance architecture is coupled with seamless software, enabling the immense potential of AI to become accessible in power-, energy-, and space-constrained applications. EnCharge AI launched in 2022 and is led by veteran technologists with backgrounds in semiconductor design and AI systems.
- Website: https://enchargeai.com/
- Industry: Embedded Software Products
- Company size: 11-50 employees
- Headquarters: Santa Clara, California
- Type: Privately Held
- Founded: 2022
Locations
Primary
4500 Great America Parkway
Suite 230
Santa Clara, California 95054, US
Updates
-
Our AI chip is differentiated by exclusive technologies, with scalability demonstrated across five generations of silicon. At EnCharge AI, we've built the world's most efficient AI processor, achieving 150 TOPS/W 8-bit MAC energy efficiency today (16 nm chip) and projected to reach 375 TOPS/W 8-bit MAC energy efficiency in 2025+ (5 nm chip). Learn how we're leading the way in advanced hardware and software to drive forward the next step in AI computing. https://lnkd.in/ezXC_anb #AI #computing #AIChips
EnCharge AI
enchargeai.com
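As a rough sanity check (our own unit arithmetic, not from the post), a TOPS/W figure converts directly into energy per operation: 1 TOPS/W is 10^12 operations per joule. The sketch below applies that conversion to the 150 and 375 TOPS/W figures quoted above.

```python
# Convert TOPS/W (tera-operations per second per watt) into
# energy per operation. 1 TOPS/W = 1e12 operations per joule.

def tops_per_watt_to_joules_per_op(tops_per_watt: float) -> float:
    ops_per_joule = tops_per_watt * 1e12
    return 1.0 / ops_per_joule

today = tops_per_watt_to_joules_per_op(150)      # 16 nm chip (quoted figure)
projected = tops_per_watt_to_joules_per_op(375)  # 5 nm projection (quoted figure)

print(f"150 TOPS/W -> {today * 1e15:.2f} fJ per 8-bit MAC")      # ~6.67 fJ
print(f"375 TOPS/W -> {projected * 1e15:.2f} fJ per 8-bit MAC")  # ~2.67 fJ
```

At these levels, a single 8-bit multiply-accumulate costs only a few femtojoules, which is why such figures matter for battery- and thermally-constrained devices.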
-
Hear from EnCharge AI CEO Naveen Verma in this Deep Tech Musings podcast episode hosted by Pronojit Saha. He discusses the fundamental computing challenges driving the need for new technologies, which will pave the way for major advances across the AI industry. https://lnkd.in/gQdnaR4n #AI #computing #AIChips
DTM E58. The Future of AI Chips - Dr. Naveen Verma, EnCharge.ai
pronojits.substack.com
-
How will the growth of AI create unprecedented compute demand? Join EnCharge AI Co-Founder and CTO Kailash Gopalakrishnan at Collision Conf in Toronto as he delves into this question, discussing the founding of EnCharge AI and the collaboration needed between the public and private sectors to solve the compute challenge. https://bit.ly/3RsB0Sv #CollisionConf #AI #compute
-
EnCharge AI CEO Naveen Verma dives into building complex, powerful chips and developing a complete software stack that provides organizations with a comprehensive, collaborative process for pushing the boundaries of computing. Learn more from AIM Research. https://bit.ly/42KUJBx #EnChargeAI #AIChips #software #hardware #computing
-
With the highest efficiency reported for AI computing to date, EnCharge AI's test chips and hardware achieve over 150 TOPS/W for 8-bit computing (20x the efficiency of the top AI chips on the market). Explore how we're delivering breakthrough advances in performance, cost, and sustainability with fully validated hardware and flexible software. https://bit.ly/49MhEyc
EnCharge AI
enchargeai.com
-
In Axios, EnCharge AI CEO Naveen Verma sheds light on our use of capacitors instead of semiconductor devices "to make analog very precise, robust and scalable." Learn how our industry is taking on the unique challenges of applying AI in space and how we can help drive these capabilities. https://bit.ly/4atw8Dn #AIChips #space #AI #SpaceTechnology
Space industry races to put AI in orbit
axios.com
-
Delve into the latest Deep Tech Musings podcast episode hosted by Pronojit Saha and featuring EnCharge AI CEO Naveen Verma as he discusses his journey from researcher to founder, why current computing technologies fall short for extreme workloads, and how our technology is revolutionizing AI. https://spoti.fi/3ykFVOO #AI #DeepTech #computing #podcast
-
“A lot of the challenges that we see in the data center will be overcome,” says EnCharge AI CEO Naveen Verma. “I expect to see a big focus on the edge. I think it’s going to be critical to getting AI at scale.” Learn more about how our technology is contributing to the growing interest in edge computing for AI from MIT Technology Review's James O'Donnell: https://bit.ly/3UI319u #AI #EdgeComputing #chips
What’s next in chips
technologyreview.com
-
DARPA's OPTIMA program aims to develop ultra-efficient AI chips that use less electricity per computation than current chips on the market. Instead of using current to transmit an analog signal, we have found a way to use charge via capacitors, which overcomes the signal-to-noise limitations that have been a barrier to dramatic analog computing efficiency. "We've known for years analog can be a hundred times more efficient than digital, [but] analog is noisy, it's not reliable, it's not scalable," said our CEO Naveen Verma. "That's the big problem that we solved." Read the full article. https://bit.ly/3y6SRHR #AIChips #capacitors #AI
DARPA's OPTIMA program seeks ultra-efficient AI chips - Breaking Defense
breakingdefense.com
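The charge-domain idea described in the post above can be sketched with a toy numerical model (purely illustrative and our own construction, not EnCharge's actual circuit): each weight-input product deposits a packet of charge onto a shared line, the accumulated charge encodes the dot product, and an optional Gaussian term stands in for analog noise.

```python
import random

def charge_domain_mac(inputs, weights, noise_sigma=0.0, seed=0):
    """Toy model of an analog in-memory multiply-accumulate:
    each weight*input product deposits a charge packet, and charge
    accumulates capacitively, yielding the dot product.
    noise_sigma models per-packet analog non-ideality (0 = ideal)."""
    rng = random.Random(seed)
    total_charge = 0.0
    for x, w in zip(inputs, weights):
        packet = x * w                         # charge proportional to product
        packet += rng.gauss(0.0, noise_sigma)  # analog noise on each packet
        total_charge += packet                 # summation on the shared line
    return total_charge

xs = [1, 0, 3, 2]
ws = [2, 5, 1, 4]
print(charge_domain_mac(xs, ws))                     # ideal readout: 13.0
print(charge_domain_mac(xs, ws, noise_sigma=0.05))   # noisy analog readout
```

The model makes the noise trade-off concrete: the summation itself is nearly free, but every packet carries uncertainty, which is the signal-to-noise barrier that charge-based (rather than current-based) computation is claimed to overcome.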