
Commit aeef243

linter fix
1 parent cf55c44 commit aeef243

2 files changed: +9 -3 lines changed

recipe/bld.bat

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ set CMAKE_ARGS=%CMAKE_ARGS% -DLLAMA_BUILD=ON
 set CMAKE_ARGS=%CMAKE_ARGS% -DLLAVA_BUILD=OFF

 :: Install the package
-%PYTHON% -m pip install . -vv
+%PYTHON% -m pip install . -vv --no-deps --no-build-isolation
 if errorlevel 1 exit 1

 :: Move DLLs from site-packages/bin to Library/bin (standard conda location)

recipe/meta.yaml

Lines changed: 8 additions & 2 deletions
@@ -26,12 +26,12 @@ build:
   script: # [unix]
     - export CMAKE_ARGS="${CMAKE_ARGS} -DLLAMA_BUILD=ON" # [unix]
     - export CMAKE_ARGS="${CMAKE_ARGS} -DLLAVA_BUILD=OFF" # [unix]
-    - {{ PYTHON }} -m pip install . -vv # [unix]
+    - {{ PYTHON }} -m pip install . -vv --no-deps --no-build-isolation # [unix]
 requirements:
   build:
     - python # [build_platform != target_platform]
     - cross-python_{{ target_platform }} # [build_platform != target_platform]
-
+    - {{ stdlib('c') }}
     - {{ compiler('c') }}
     - {{ compiler('cxx') }}
     - cmake
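After this hunk, the build requirements section of the recipe reads roughly as below (a sketch with the YAML indentation reconstructed; the {{ stdlib('c') }} entry requests the C standard-library/sysroot package that conda-forge expects alongside the compiler packages since its stdlib migration):

requirements:
  build:
    - python # [build_platform != target_platform]
    - cross-python_{{ target_platform }} # [build_platform != target_platform]
    - {{ stdlib('c') }}
    - {{ compiler('c') }}
    - {{ compiler('cxx') }}
    - cmake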
@@ -74,6 +74,12 @@ test:
 about:
   home: https://github.com/abetlen/llama-cpp-python
   summary: Python bindings for the llama.cpp library
+  description: |
+    Python bindings for llama.cpp, providing a simple Python interface for
+    inference with Large Language Models (LLMs) using the llama.cpp backend.
+    Supports CPU and GPU acceleration with vendored llama.cpp library.
+  dev_url: https://github.com/abetlen/llama-cpp-python
+  doc_url: https://llama-cpp-python.readthedocs.io
   license: MIT
   license_file:
     - LICENSE.md

0 commit comments
