How can I see my prompt together with the generated text in the message content after calling:
from llama_cpp import Llama

llm = Llama(model_path="model.gguf")  # load the local GGUF model
chat = llm.create_chat_completion(
    messages=messages,  # list of {"role": ..., "content": ...} dicts
    tools=tools,
)
print(chat)
{'id': 'chatcmpl-e2235484-1ccb-4eb7-a93b-e9e171fbb2ee',
'object': 'chat.completion',
'created': 1725972811,
'model': 'model.gguf',
'choices': [{'index': 0,
'message': {'role': 'assistant',
'content': 'Hello! ....'},
'logprobs': None,
'finish_reason': 'stop'}],
'usage': {'prompt_tokens': 40, 'completion_tokens': 173, 'total_tokens': 213}}
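The response above follows the OpenAI-style chat-completion format, which only returns the assistant's message; the prompt is not echoed back in 'content'. A minimal sketch of one way to view the prompt and the generated text together, assuming you still have the messages list that was passed in:

reply = chat["choices"][0]["message"]   # the generated assistant message
transcript = messages + [reply]         # original prompt messages followed by the reply
for m in transcript:
    print(f"{m['role']}: {m['content']}")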