This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Conversation

@aponcedeleonch
Member

Closes: #883

There were a couple of issues that prevented support for llamacpp:

  1. The `models` method was not implemented in the provider
  2. There was no way of specifying the model outside `CompletionHandler`

This PR takes care of both.
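The two fixes can be sketched roughly as follows. All names here (`LlamaCppProvider`, `CompletionHandler`, the method signatures) are illustrative assumptions for this sketch, not the project's actual API:

```python
# Hypothetical sketch of the two changes described in this PR.
# Class and method names are assumptions, not the real project API.

class CompletionHandler:
    """Executes completions; previously the model was fixed inside the handler."""

    def complete(self, prompt: str, model: str) -> str:
        # Fix 2: the model is now passed in per request instead of being
        # hard-wired into the handler at construction time.
        return f"[{model}] completion for: {prompt}"


class LlamaCppProvider:
    """Provider wrapper for models served via llama.cpp."""

    def __init__(self, available_models: list[str]):
        self._models = available_models
        self._handler = CompletionHandler()

    def models(self) -> list[str]:
        # Fix 1: implement the previously missing `models` method so the
        # muxing layer can discover which models this provider offers.
        return list(self._models)

    def complete(self, prompt: str, model: str) -> str:
        # The model is chosen outside CompletionHandler (e.g. by the mux)
        # and threaded through to the handler.
        return self._handler.complete(prompt, model)
```

With both in place, a mux can enumerate a provider's models and route a request to a specific one, which is what the linked issue asks for.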

jhrozek
jhrozek previously approved these changes Feb 3, 2025
@lukehinds lukehinds merged commit 4e032d9 into main Feb 4, 2025
9 checks passed
@lukehinds lukehinds deleted the llamacpp-mux branch February 4, 2025 07:47


Development

Successfully merging this pull request may close these issues.

[Task]: Add llamacpp mux support
