forked from conda-forge/llama-cpp-python-feedstock
Update llama cpp python 0.3.16 #1
Open
xkong-anaconda wants to merge 26 commits into main from update-llama-cpp-python-0.3.16
26 commits
All commits authored by xkong-anaconda:

de67658  Update llama-cpp-python to 0.3.16
937bb68  update
bed88cc  Fix Windows build: exclude make from Windows builds
17be2fd  Build vendored llama.cpp instead of using external dependency
b4d95df  Disable tool building, only build libraries
2a2765a  Fix overlinking error
9980451  new patch
6ae239b  added llvm-openmp to the host requirements
b5d0bd2  Added missing_dso_whitelist for win
cf55c44  Created bld.bat to relocate DLLs
aeef243  linter fix
d8cf8f8  Address PR comment
0581df6  use the external llama.cpp b6188
1c8ea2d  Update recipe/meta.yaml
64080e3  Update recipe/meta.yaml
b249de5  Update recipe/meta.yaml
b4a4b89  Update recipe/meta.yaml
3fe5f24  Update recipe/meta.yaml
a59813e  Update recipe/meta.yaml
3a68363  Update recipe/meta.yaml
4366976  Update recipe/meta.yaml
66d81c0  Update recipe/meta.yaml
be75f42  Update recipe/meta.yaml
2fa7548  Fix missing [unix] selector on pip install line
58a6f37  fix
c5d3e1e  Add cmake to host requirements
New file (patch, 91 lines):

From 8156a3728b89cbb944abf5af8376100da8832965 Mon Sep 17 00:00:00 2001
From: Julien Jerphanion <[email protected]>
Date: Fri, 22 Aug 2025 10:22:47 +0200
Subject: [PATCH] Adapt shared library relocation

Signed-off-by: Julien Jerphanion <[email protected]>
---
 llama_cpp/_ctypes_extensions.py | 11 +++++++++--
 llama_cpp/llama_cpp.py          | 13 +++++++++++++
 2 files changed, 22 insertions(+), 2 deletions(-)

diff --git a/llama_cpp/_ctypes_extensions.py b/llama_cpp/_ctypes_extensions.py
index e88ed38..0acd159 100644
--- a/llama_cpp/_ctypes_extensions.py
+++ b/llama_cpp/_ctypes_extensions.py
@@ -29,16 +29,21 @@ def load_shared_library(lib_base_name: str, base_path: pathlib.Path):
     if sys.platform.startswith("linux") or sys.platform.startswith("freebsd"):
         lib_paths += [
             base_path / f"lib{lib_base_name}.so",
+            f"lib{lib_base_name}.so",
         ]
     elif sys.platform == "darwin":
         lib_paths += [
             base_path / f"lib{lib_base_name}.so",
             base_path / f"lib{lib_base_name}.dylib",
+            f"{lib_base_name}.so",
+            f"lib{lib_base_name}.dylib",
         ]
     elif sys.platform == "win32":
         lib_paths += [
             base_path / f"{lib_base_name}.dll",
             base_path / f"lib{lib_base_name}.dll",
+            f"{lib_base_name}.dll",
+            f"lib{lib_base_name}.dll",
         ]
     else:
         raise RuntimeError("Unsupported platform")
@@ -62,14 +67,16 @@ def load_shared_library(lib_base_name: str, base_path: pathlib.Path):

     # Try to load the shared library, handling potential errors
     for lib_path in lib_paths:
-        if lib_path.exists():
+        if isinstance(lib_path, str) or lib_path.exists():
             try:
                 return ctypes.CDLL(str(lib_path), **cdll_args)  # type: ignore
+            except OSError:
+                pass
             except Exception as e:
                 raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}")

     raise FileNotFoundError(
-        f"Shared library with base name '{lib_base_name}' not found"
+        f"Shared library with base name '{lib_base_name}' not found in {lib_paths}."
     )

diff --git a/llama_cpp/llama_cpp.py b/llama_cpp/llama_cpp.py
index 711d42a..a23c778 100644
--- a/llama_cpp/llama_cpp.py
+++ b/llama_cpp/llama_cpp.py
@@ -3,6 +3,7 @@ from __future__ import annotations
 import os
 import ctypes
 import pathlib
+import sys

 from typing import (
     Callable,
@@ -32,7 +33,19 @@ if TYPE_CHECKING:

 # Specify the base name of the shared library to load
 _lib_base_name = "llama"
+
 _override_base_path = os.environ.get("LLAMA_CPP_LIB_PATH")
+if sys.platform.startswith("win") and _override_base_path is None:
+    # llama.cpp windows' builds' DLL are stored in: `$CONDA_PREFIX/Library/bin/`
+    # We cannot assume that `$CONDA_PREFIX` is set, so we will use this
+    # file position to determine the prefix directory.
+
+    # This file directory in the prefix: `$CONDA_PREFIX/lib/site-packages/llama_cpp`
+    __this_file_dir = pathlib.Path(os.path.abspath(os.path.dirname(__file__)))
+    # Prefix directory: `$CONDA_PREFIX`
+    __prefix_dir = __this_file_dir.parent.parent.parent
+    _override_base_path = __prefix_dir / "Library" / "bin"
+
 _base_path = pathlib.Path(os.path.abspath(os.path.dirname(__file__))) / "lib" if _override_base_path is None else pathlib.Path(_override_base_path)
 # Load the library
 _lib = load_shared_library(_lib_base_name, _base_path)
--
2.50.1
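The core idea of the first hunk, fall back from concrete file paths to bare library names so the OS loader can search its default locations (PATH, LD_LIBRARY_PATH, rpath), can be sketched independently of the patched function. This is a minimal illustration, not the code from the patch itself; the `try_load` helper name is hypothetical:

```python
import ctypes
import pathlib


def try_load(lib_paths):
    """Attempt each candidate in order.

    pathlib.Path entries are checked for existence first; bare strings
    cannot be checked with .exists(), so they are handed straight to
    ctypes.CDLL and the OS loader performs its own search.
    """
    for lib_path in lib_paths:
        if isinstance(lib_path, str) or lib_path.exists():
            try:
                return ctypes.CDLL(str(lib_path))
            except OSError:
                # This candidate failed to load; try the next one.
                continue
    raise FileNotFoundError(f"No loadable candidate in {lib_paths}")
```

Catching `OSError` and moving on, rather than aborting on the first failure, is what makes the bare-name fallback safe: a missing file in one location no longer masks a loadable copy later in the list.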
New file, bld.bat (7 lines):

:: Set CMake arguments to use external llama.cpp library
set CMAKE_ARGS=%CMAKE_ARGS% -DLLAMA_BUILD=OFF
set CMAKE_ARGS=%CMAKE_ARGS% -DLLAVA_BUILD=OFF

:: Install the package
%PYTHON% -m pip install . -vv --no-deps --no-build-isolation
if errorlevel 1 exit 1
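The prefix-derivation trick from the patch's second hunk (walk three `.parent` levels up from the installed module directory to reach the conda prefix, then append `Library/bin`, where conda places DLLs on Windows) can be checked in isolation. A minimal sketch; the `/opt/conda` prefix is a made-up example, and `PurePosixPath` is used only so the arithmetic runs identically on any platform:

```python
import pathlib

# Hypothetical install location of llama_cpp/llama_cpp.py in a conda env.
module_file = pathlib.PurePosixPath("/opt/conda/lib/site-packages/llama_cpp/llama_cpp.py")

this_file_dir = module_file.parent               # <prefix>/lib/site-packages/llama_cpp
prefix_dir = this_file_dir.parent.parent.parent  # <prefix>
dll_dir = prefix_dir / "Library" / "bin"         # conda's Windows DLL directory

print(dll_dir)  # /opt/conda/Library/bin
```

In the real patch, `pathlib.Path(os.path.abspath(os.path.dirname(__file__)))` plays the role of `this_file_dir`, so the lookup works even when `CONDA_PREFIX` is not set in the environment.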
This PR also deletes one file; its contents could not be loaded in this view.