Low-code framework for building custom LLMs, neural networks, and other AI models
This project shares the technical principles behind large language models along with hands-on experience (LLM engineering and real-world LLM application deployment).
SkyPilot: Run AI and batch jobs on any infra (Kubernetes or 12+ clouds). Get unified execution, cost savings, and high GPU availability via a simple interface.
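For a sense of that interface, here is a minimal sketch using SkyPilot's Python API (the YAML + `sky launch` CLI is equivalent); the setup/run commands, accelerator spec, and cluster name are illustrative:

```python
import sky

# Define a task: setup runs once per node, run is the job itself.
task = sky.Task(
    setup="pip install torch",
    run="python train.py",
)
# Request one A100; SkyPilot picks an infra where it is available.
task.set_resources(sky.Resources(accelerators="A100:1"))

# Provision the cluster (if needed) and run the task on it.
sky.launch(task, cluster_name="train-cluster")
```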
H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://docs.h2o.ai/h2o-llmstudio/
An efficient, flexible and full-featured toolkit for fine-tuning LLM (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
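Toolkits like this typically wrap parameter-efficient methods such as LoRA/QLoRA. As a point of reference, here is a minimal LoRA setup using the Hugging Face peft library rather than XTuner's own API; the model id and hyperparameters are illustrative:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Illustrative base model; any causal LM with named attention projections works.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B", torch_dtype=torch.bfloat16
)
config = LoraConfig(
    r=16,                                 # rank of the low-rank adapters
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # Llama attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()        # only adapter weights require grad
```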
Efficient Triton Kernels for LLM Training
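For readers unfamiliar with Triton: kernels are written in Python and JIT-compiled to GPU code. The sketch below is a toy fused SiLU kernel showing the programming model, not one of this repo's kernels (requires a CUDA GPU):

```python
import torch
import triton
import triton.language as tl

@triton.jit
def silu_kernel(x_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one BLOCK_SIZE-wide chunk of the tensor.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements            # guard the ragged last block
    x = tl.load(x_ptr + offsets, mask=mask)
    y = x * tl.sigmoid(x)                  # SiLU, fused in a single pass
    tl.store(out_ptr + offsets, y, mask=mask)

def silu(x: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = x.numel()
    grid = (triton.cdiv(n, 1024),)         # one program per 1024 elements
    silu_kernel[grid](x, out, n, BLOCK_SIZE=1024)
    return out

x = torch.randn(4096, device="cuda")
assert torch.allclose(silu(x), torch.nn.functional.silu(x), atol=1e-6)
```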
Code examples and resources for DBRX, a large language model developed by Databricks
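Assuming the databricks/dbrx-instruct checkpoint on the Hugging Face Hub, a sketch of loading and prompting the model with transformers (this mirrors the usual chat-template flow; details may differ from the repo's own examples):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("databricks/dbrx-instruct", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "databricks/dbrx-instruct",
    device_map="auto",              # shard across available GPUs
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
messages = [{"role": "user", "content": "Briefly explain mixture-of-experts."}]
input_ids = tok.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
print(tok.decode(output[0], skip_special_tokens=True))
```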
dstack is an open-source alternative to Kubernetes, designed to simplify development, training, and deployment of AI across any cloud or on-prem. It supports NVIDIA and AMD GPUs as well as TPUs.
DLRover: An Automatic Distributed Deep Learning System
Nvidia GPU exporter for prometheus using nvidia-smi binary
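A minimal sketch of the underlying idea, polling nvidia-smi and exposing gauges with prometheus_client; the metric names here are illustrative, not the exporter's actual ones:

```python
import subprocess
import time
from prometheus_client import Gauge, start_http_server

GPU_UTIL = Gauge("gpu_utilization_percent", "GPU utilization", ["gpu"])
GPU_MEM = Gauge("gpu_memory_used_mib", "GPU memory used", ["gpu"])

def scrape():
    # Query one CSV line per GPU: index, utilization %, memory used (MiB).
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=index,utilization.gpu,memory.used",
        "--format=csv,noheader,nounits",
    ], text=True)
    for line in out.strip().splitlines():
        idx, util, mem = [f.strip() for f in line.split(",")]
        GPU_UTIL.labels(gpu=idx).set(float(util))
        GPU_MEM.labels(gpu=idx).set(float(mem))

if __name__ == "__main__":
    start_http_server(9100)   # serve Prometheus metrics on :9100/metrics
    while True:
        scrape()
        time.sleep(15)
```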
Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
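For reference, a single-tensor sketch of the Adan update as described in the paper (bias corrections omitted for brevity; this is not the repo's optimized implementation, and the function and state names are illustrative):

```python
import torch

def adan_step(p, g, state, lr=1e-3, betas=(0.98, 0.92, 0.99), eps=1e-8, wd=0.0):
    b1, b2, b3 = betas
    if not state:                          # lazy state init on first step
        state["g_prev"] = torch.zeros_like(g)
        state["m"] = torch.zeros_like(p)   # EMA of gradients
        state["v"] = torch.zeros_like(p)   # EMA of gradient differences
        state["n"] = torch.zeros_like(p)   # EMA of squared corrected gradients
    diff = g - state["g_prev"]
    state["m"].mul_(b1).add_(g, alpha=1 - b1)
    state["v"].mul_(b2).add_(diff, alpha=1 - b2)
    corrected = g + b2 * diff              # Nesterov-style gradient correction
    state["n"].mul_(b3).addcmul_(corrected, corrected, value=1 - b3)
    update = (state["m"] + b2 * state["v"]) / (state["n"].sqrt() + eps)
    p.sub_(lr * update).div_(1 + lr * wd)  # step plus decoupled weight decay
    state["g_prev"] = g.clone()
```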
LLM-PowerHouse: Unleash LLMs' potential through curated tutorials, best practices, and ready-to-use code for custom training and inferencing.
irresponsible innovation. Try now at https://chat.dev/
LLM (Large Language Model) FineTuning
The official repo of the Aquila2 series proposed by BAAI, including pretrained and chat large language models.
Open Source LLM toolkit to build trustworthy LLM applications. TigerArmor (AI safety), TigerRAG (embedding, RAG), TigerTune (fine-tuning)
USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long-Context Transformer Model Training and Inference
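The observation enabling such schemes is that softmax attention can be computed blockwise over keys/values with a running log-sum-exp, so K/V blocks can live on different devices. Below is a single-device sketch of that accumulation (not USP's implementation):

```python
import torch

def blockwise_attention(q, k, v, block_size=128):
    # Computes softmax(q @ k.T / sqrt(d)) @ v one K/V block at a time,
    # keeping a running max and sum so earlier partial results can be
    # rescaled as later blocks arrive.
    scale = q.shape[-1] ** -0.5
    out = torch.zeros_like(q)
    running_max = torch.full((q.shape[0], 1), float("-inf"))
    running_sum = torch.zeros(q.shape[0], 1)
    for start in range(0, k.shape[0], block_size):
        kb, vb = k[start:start + block_size], v[start:start + block_size]
        scores = (q @ kb.T) * scale                    # (seq_q, block)
        block_max = scores.max(dim=-1, keepdim=True).values
        new_max = torch.maximum(running_max, block_max)
        correction = torch.exp(running_max - new_max)  # rescale old partials
        p = torch.exp(scores - new_max)
        running_sum = running_sum * correction + p.sum(-1, keepdim=True)
        out = out * correction + p @ vb
        running_max = new_max
    return out / running_sum

# Matches dense attention up to floating-point error:
q, k, v = (torch.randn(512, 64) for _ in range(3))
ref = torch.softmax(q @ k.T / 64 ** 0.5, dim=-1) @ v
assert torch.allclose(blockwise_attention(q, k, v), ref, atol=1e-4)
```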
InternEvo is an open-source, lightweight training framework that aims to support model pre-training without extensive dependencies.