httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read) #2356

Open
@markwitt1

Description


Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

We often get a RemoteProtocolError after a streaming chat completion call has been running for a long time. It is always thrown after a few minutes (Python 3.11 / FastAPI).

Model: o3-mini

Packages:
openai 1.78.1
httpx 0.28.1

To Reproduce

from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse
from openai import AsyncOpenAI

client = AsyncOpenAI(api_key="YOUR_KEY")

app = FastAPI()

@app.post("/chat-stream")
async def chat_stream(request: Request):
    # parse incoming JSON (e.g. {"messages": [...]}):
    body = await request.json()
    # kick off OpenAI streaming chat:
    stream = await client.chat.completions.create(
        model="o3-mini",
        messages=body["messages"],
        stream=True,
    )

    async def event_generator():
        try:
            async for chunk in stream:
                yield chunk.choices[0].delta.content or ""
        finally:
            # cleanup if needed
            pass

    return StreamingResponse(event_generator(), media_type="text/plain")

Code snippets

OS

macOS

Python version

Python 3.11.11

Library version

openai v1.78.1


    Labels

    bug (Something isn't working)
