Commit 52753b7

Upgrade fastapi to 0.100.0 and pydantic v2
1 parent 11eae75 commit 52753b7


9 files changed: 15 additions & 21 deletions

‎.github/workflows/test.yaml

3 additions & 3 deletions

@@ -26,7 +26,7 @@ jobs:
         python-version: ${{ matrix.python-version }}
     - name: Install dependencies
       run: |
-        python -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi sse-starlette httpx uvicorn
+        python -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi sse-starlette httpx uvicorn pydantic-settings
         pip install . -v
     - name: Test with pytest
       run: |
@@ -49,7 +49,7 @@ jobs:
         python-version: ${{ matrix.python-version }}
     - name: Install dependencies
       run: |
-        python -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi sse-starlette httpx uvicorn
+        python -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi sse-starlette httpx uvicorn pydantic-settings
         pip install . -v
     - name: Test with pytest
       run: |
@@ -72,7 +72,7 @@ jobs:
         python-version: ${{ matrix.python-version }}
     - name: Install dependencies
       run: |
-        python -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi sse-starlette httpx uvicorn
+        python -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi sse-starlette httpx uvicorn pydantic-settings
         pip install . -v
     - name: Test with pytest
       run: |
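
The `pydantic-settings` addition above (and the same one-line change in each Dockerfile below) follows from Pydantic v2 splitting `BaseSettings` into its own distribution. A minimal sketch, not part of this commit, of the import move that makes the extra package necessary:

```python
# Sketch only: why pydantic-settings must now be installed alongside fastapi.
# Under Pydantic v1, BaseSettings ships with pydantic itself; under v2 it
# lives in the separate pydantic-settings package.
try:
    from pydantic import BaseSettings            # Pydantic v1 location
except ImportError:                              # raised by Pydantic v2
    from pydantic_settings import BaseSettings   # Pydantic v2 location

print(BaseSettings)
```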

‎docker/cuda_simple/Dockerfile

1 addition & 1 deletion

@@ -8,7 +8,7 @@ COPY . .

 # Install the package
 RUN apt update && apt install -y python3 python3-pip
-RUN python3 -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi uvicorn sse-starlette
+RUN python3 -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi uvicorn sse-starlette pydantic-settings

 RUN LLAMA_CUBLAS=1 pip install llama-cpp-python

‎docker/open_llama/Dockerfile

1 addition & 1 deletion

@@ -14,7 +14,7 @@ RUN apt-get update && apt-get upgrade -y && apt-get install -y --no-install-reco
     ninja-build \
     build-essential

-RUN python3 -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi uvicorn sse-starlette
+RUN python3 -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi uvicorn sse-starlette pydantic-settings

 # Perform the conditional installations based on the image
 RUN echo "Image: ${IMAGE}" && \

‎docker/openblas_simple/Dockerfile

1 addition & 1 deletion

@@ -7,7 +7,7 @@ COPY . .

 # Install the package
 RUN apt update && apt install -y libopenblas-dev ninja-build build-essential
-RUN python -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi uvicorn sse-starlette
+RUN python -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi uvicorn sse-starlette pydantic-settings

 RUN LLAMA_OPENBLAS=1 pip install llama_cpp_python --verbose

‎docker/simple/Dockerfile

1 addition & 1 deletion

@@ -18,7 +18,7 @@ RUN mkdir /app
 WORKDIR /app
 COPY . /app

-RUN python3 -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi uvicorn sse-starlette
+RUN python3 -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi uvicorn sse-starlette pydantic-settings

 RUN make build && make clean

‎llama_cpp/server/__main__.py

2 additions & 2 deletions

@@ -3,7 +3,7 @@
 To run this example:

 ```bash
-pip install fastapi uvicorn sse-starlette
+pip install fastapi uvicorn sse-starlette pydantic-settings
 export MODEL=../models/7B/...
 ```

@@ -30,7 +30,7 @@

 if __name__ == "__main__":
     parser = argparse.ArgumentParser()
-    for name, field in Settings.__fields__.items():
+    for name, field in Settings.__model_fields__.items():
         description = field.field_info.description
         if field.default is not None and description is not None:
             description += f" (default: {field.default})"
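
For context, the loop above builds `argparse` options from the settings model's fields. A minimal sketch of the same idea written against Pydantic v2's public field API (`model_fields`, which maps names to `FieldInfo` objects); the `Settings` class below is an illustrative stand-in, not the server's full settings model:

```python
# Sketch: deriving CLI flags from a Pydantic v2 settings model.
# The field names below are illustrative stand-ins.
import argparse

from pydantic import Field
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    model: str = Field(default="models/7B/ggml-model.bin", description="Path to the model file.")
    n_ctx: int = Field(default=2048, description="Context window size.")


parser = argparse.ArgumentParser()
for name, field in Settings.model_fields.items():  # dict[str, FieldInfo] in Pydantic v2
    description = field.description
    if field.default is not None and description is not None:
        description += f" (default: {field.default})"
    parser.add_argument(f"--{name}", dest=name, help=description)

args = parser.parse_args([])  # no CLI input here; just shows the parser builds
print(vars(args))
```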

‎llama_cpp/server/app.py

4 additions & 10 deletions

@@ -12,7 +12,8 @@
 from starlette.concurrency import run_in_threadpool, iterate_in_threadpool
 from fastapi import Depends, FastAPI, APIRouter, Request
 from fastapi.middleware.cors import CORSMiddleware
-from pydantic import BaseModel, BaseSettings, Field, create_model_from_typeddict
+from pydantic import BaseModel, Field
+from pydantic_settings import BaseSettings
 from sse_starlette.sse import EventSourceResponse


@@ -309,7 +310,6 @@ class Config:
         }


-CreateCompletionResponse = create_model_from_typeddict(llama_cpp.Completion)


 def make_logit_bias_processor(
@@ -347,7 +347,6 @@ def logit_bias_processor(

 @router.post(
     "/v1/completions",
-    response_model=CreateCompletionResponse,
 )
 async def create_completion(
     request: Request,
@@ -416,12 +415,10 @@ class Config:
         }


-CreateEmbeddingResponse = create_model_from_typeddict(llama_cpp.Embedding)


 @router.post(
     "/v1/embeddings",
-    response_model=CreateEmbeddingResponse,
 )
 async def create_embedding(
     request: CreateEmbeddingRequest, llama: llama_cpp.Llama = Depends(get_llama)
@@ -479,19 +476,17 @@ class Config:
         }


-CreateChatCompletionResponse = create_model_from_typeddict(llama_cpp.ChatCompletion)


 @router.post(
     "/v1/chat/completions",
-    response_model=CreateChatCompletionResponse,
 )
 async def create_chat_completion(
     request: Request,
     body: CreateChatCompletionRequest,
     llama: llama_cpp.Llama = Depends(get_llama),
     settings: Settings = Depends(get_settings),
-) -> Union[llama_cpp.ChatCompletion, EventSourceResponse]:
+) -> Union[llama_cpp.ChatCompletion]:  # type: ignore
     exclude = {
         "n",
         "logit_bias",
@@ -551,10 +546,9 @@ class ModelList(TypedDict):
     data: List[ModelData]


-GetModelResponse = create_model_from_typeddict(ModelList)


-@router.get("/v1/models", response_model=GetModelResponse)
+@router.get("/v1/models")
 async def get_models(
     settings: Settings = Depends(get_settings),
 ) -> ModelList:
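
Two things drive the app.py changes above: `BaseSettings` now comes from the separate `pydantic_settings` package, and `create_model_from_typeddict` no longer exists in Pydantic v2, so the generated response models and their `response_model=` arguments are dropped. A minimal sketch of a settings class under the new import; the fields shown are illustrative, not the server's full `Settings`:

```python
# Sketch: a pydantic-settings BaseSettings class under the new import.
# Field names here are illustrative examples, not the project's full list.
from pydantic import Field
from pydantic_settings import BaseSettings  # moved out of pydantic in v2


class Settings(BaseSettings):
    model: str = Field(description="The path to the model to use for generating completions.")
    host: str = Field(default="localhost", description="Listen address.")
    port: int = Field(default=8000, description="Listen port.")


# Values come from keyword arguments or, case-insensitively, from environment
# variables (e.g. MODEL=/path/to/model).
settings = Settings(model="models/7B/ggml-model.bin")
print(settings.model_dump())
```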

‎pyproject.toml

1 addition & 1 deletion

@@ -32,7 +32,7 @@ httpx = "^0.24.1"
 scikit-build = "0.17.6"

 [tool.poetry.extras]
-server = ["uvicorn", "fastapi", "sse-starlette"]
+server = ["uvicorn>=0.22.0", "fastapi>=0.100.0", "pydantic-settings>=2.0.1", "sse-starlette>=1.6.1"]

 [build-system]
 requires = [

‎setup.py

1 addition & 1 deletion

@@ -18,7 +18,7 @@
     packages=["llama_cpp", "llama_cpp.server"],
     install_requires=["typing-extensions>=4.5.0", "numpy>=1.20.0", "diskcache>=5.6.1"],
     extras_require={
-        "server": ["uvicorn>=0.21.1", "fastapi>=0.95.0", "sse-starlette>=1.3.3"],
+        "server": ["uvicorn>=0.22.1", "fastapi>=0.100.0", "pydantic-settings>=2.0.1", "sse-starlette>=1.6.1"],
     },
     python_requires=">=3.7",
     classifiers=[
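
The pinned extras above can be sanity-checked at runtime; a small sketch using only the standard library, with package names taken from the updated `server` extra:

```python
# Sketch: report installed versions of the packages pinned by the new
# "server" extra in setup.py / pyproject.toml.
from importlib.metadata import PackageNotFoundError, version

pins = {
    "uvicorn": "0.22.1",
    "fastapi": "0.100.0",
    "pydantic-settings": "2.0.1",
    "sse-starlette": "1.6.1",
}

for package, minimum in pins.items():
    try:
        print(f"{package}: installed {version(package)}, requires >= {minimum}")
    except PackageNotFoundError:
        print(f"{package}: not installed (requires >= {minimum})")
```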

0 commit comments
