Add LLM query example with Ollama #94
Conversation
LGTM. Is there a larger dataset of queries we could use to show the performance benefits over Python?
Running LLMs locally without GPUs is very slow, so larger data would make it unusable.
Ollama is meant for local runs at smaller scale without GPUs. Next we can use
OpenAI-compatible APIs, via https://github.com/andrewyng/aisuite or
LiteLLM, at larger scale and see how that helps.
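As a rough sketch of that next step: OpenAI-compatible APIs (and LiteLLM) all take prompts in the same chat-completions message format, so the example's raw prompts would need a small conversion step. The helper name below is hypothetical, and the commented-out `litellm.completion` call and `ollama/llama3` model id are assumptions about how the larger-scale variant might look, not code from this PR.

```python
def to_chat_messages(prompt: str) -> list[dict]:
    """Wrap a raw prompt in the chat-completions message format
    shared by OpenAI-compatible APIs (hypothetical helper)."""
    return [{"role": "user", "content": prompt}]

# With LiteLLM installed, each prompt would then be sent roughly like:
#   litellm.completion(model="ollama/llama3",
#                      messages=to_chat_messages(prompt))
messages = to_chat_messages("Summarize this ticket in one line.")
```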
I'd edit to `model="llama3"`, or even make that an outside variable, to keep it cleaner. And add a comment next to `raw_prompts * 10` noting that this just runs the example prompts 10 times.
Done.
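A minimal sketch of the shape the review is asking for: the model name lifted into an outside variable and a comment explaining `raw_prompts * 10`. The function names, sample prompts, and the Ollama `/api/generate` payload helper are illustrative assumptions, not the PR's actual example code.

```python
# Model name pulled out as a variable, per the review suggestion.
MODEL = "llama3"

raw_prompts = [
    "Summarize: Bodo compiles analytics-style Python into parallel binaries.",
    "Translate to French: good morning",
]

def preprocess(prompt: str) -> str:
    """Strip whitespace and normalize case before querying the LLM."""
    return prompt.strip().lower()

# raw_prompts * 10 just repeats the sample prompts 10 times so the
# dataset is large enough to exercise parallel execution.
prompts = [preprocess(p) for p in raw_prompts * 10]

def build_request(prompt: str, model: str = MODEL) -> dict:
    """Build an Ollama /api/generate JSON payload (hypothetical helper)."""
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_request(prompts[0])
```

Keeping `MODEL` at module level means swapping in another local model is a one-line change.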
Changes included in this PR
Adds an example that demonstrates preprocessing data and querying an LLM using Bodo (with Ollama).
Testing strategy
Tested it locally.
User facing changes
A new example.
Checklist
Include [run CI] in your commit message to run CI.