🎉 LM Studio v0.3.3 for Mac, Linux, and Windows is available now! Read the release notes: https://lnkd.in/eAv3Qg-C Download LM Studio: https://lnkd.in/ewr6Jt7p
About us
New in LM Studio v0.3.0: Chat with local documents! Download from https://lmstudio.ai
- Website
- https://lmstudio.ai
- Industry
- Desktop Computing Software Products
- Company size
- 2-10 employees
- Headquarters
- Brooklyn
- Type
- Privately Held
Locations
- Primary
- Brooklyn, US
Updates
LM Studio reposted this
Day 1 success! Llama-3.2 3B ran smoothly out of the box on an #AMD Ryzen AI #laptop processor with integrated #Radeon GPU. 👀 Watch as it solves a #physics problem in real time (in less than 7 seconds). #Endpoint #AI just leveled up! Thanks to our partner LM Studio for turning around this demo within 4 hours of the Llama-3.2 launch!! 🚀
We just published a new documentation website 📚. Learn how to download LLMs, run a Local Server, and more. Check it out and let us know what you think: https://lmstudio.ai/docs
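The local server described in the docs speaks the OpenAI-compatible chat-completions protocol. As a minimal sketch (assuming the default base URL `http://localhost:1234/v1` and an illustrative model name — substitute whichever model you have loaded), a client might build a request for it like this:

```python
import json

# Assumed default address of LM Studio's local server (configurable in-app).
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, user_message: str, temperature: float = 0.7) -> str:
    """Build the JSON body for POST {BASE_URL}/chat/completions."""
    payload = {
        "model": model,  # name of a model loaded in LM Studio (illustrative)
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "temperature": temperature,
    }
    return json.dumps(payload)

body = build_chat_request("llama-3.1-8b-instruct", "Hello!")
```

You would then POST `body` to `{BASE_URL}/chat/completions` with any HTTP client; existing OpenAI SDKs also work by pointing their base URL at the local server.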
Introducing LM Studio 0.3.0 - discover, download, and run local LLMs! LM Studio is the easiest way to run LLMs locally on your computer. Within minutes you can be chatting with the leading open-source models like Llama 3.1, Gemma 2, and Phi 3. And there’s never been a better time to get started. From an entirely refreshed interface, to chatting with documents using RAG, to support for new languages, this is a whole new LM Studio.

What’s included:
- Built-in Chat with Documents (aka RAG) 📑
- Total UI overhaul (with all the same customization features you know & love) 🎨
- Automatic GPU detection + offload 🎛️
- UI now also available in Spanish, German, French, Norwegian, Turkish, & Russian 🗺️
- Conversation management (folders, notes, chat cloning + branching) 📁
- OpenAI-compatible Structured Output API 🛠️
- Load & serve multiple LLMs on the network 👯

.. and tons more! 👾
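The Structured Output API mentioned above follows the OpenAI-compatible convention of constraining the reply with a `response_format` of type `json_schema`. A hedged sketch of such a request body (the book schema, model name, and `strict` flag here are illustrative assumptions, not details from the announcement):

```python
import json

# Illustrative JSON-schema constraint: force the model to reply with a
# {"title": ..., "year": ...} object. The schema itself is an assumption.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "book",
        "strict": True,  # mirrors the OpenAI-style strict-schema flag (assumed)
        "schema": {
            "type": "object",
            "properties": {
                "title": {"type": "string"},
                "year": {"type": "integer"},
            },
            "required": ["title", "year"],
        },
    },
}

request_body = {
    "model": "llama-3.1-8b-instruct",  # any model loaded in LM Studio (illustrative)
    "messages": [{"role": "user", "content": "Name a classic sci-fi novel."}],
    "response_format": response_format,
}
encoded = json.dumps(request_body)
```

With a constraint like this, the server's reply content parses as JSON matching the schema instead of free-form prose.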
Every day is ARM day 🦾 (on Windows) 🎉 LM Studio 0.2.29 is now updated with Llama 3.1 support! We’ve been working with our friends at Qualcomm to ensure speedy tokens per second on Snapdragon X Elite CPUs. 🐉🔥 Llama 3.1 is Meta’s latest openly available LLM, supporting up to 128K tokens of context, multiple languages, and advanced reasoning. Get LM Studio for Windows (ARM64): https://lnkd.in/eb-vkv5K
LM Studio reposted this
Here’s what happened This Week in #AI:
🔵 Qualcomm announced that LM Studio now supports Microsoft Windows on #Snapdragon, enabling users to run the latest LLMs entirely on-device: https://bit.ly/464GGsh
🔵 CCS Insight Executive Chairman Shaun Collins shared his conversation with Qualcomm CFO & COO Akash Palkhiwala about the impact AI is making across industries: https://bit.ly/3XNf34G
🔵 ZDNET shared three takeaways from Qualcomm’s AI Media & Analyst Workshop: https://zd.net/45Q1YJJ
LM Studio reposted this
On-device #AI milestone: This week at Qualcomm’s headquarters in San Diego, LM Studio Founder & CEO Yagil Burowski ran Llama 3 on a #Snapdragon X Elite powered laptop, achieving 20 tokens per second. With LM Studio now supporting Microsoft Windows on Snapdragon, developers can create apps using various LLMs on #SnapdragonXSeries. Details: https://bit.ly/45J4hye