Chat
class Chat
Representation of a multi-turn interaction with a model.
Captures and stores the history of communication in memory, and provides it as context with each new message.
Note: This object is not thread-safe, and calling sendMessage multiple times without waiting for a response will throw an InvalidStateException.
Summary
| Public constructors |
|---|
| Chat(model: GenerativeModel, history: MutableList&lt;Content&gt; = ArrayList()) |

| Public functions | |
|---|---|
| suspend GenerateContentResponse | sendMessage(prompt: Bitmap) Sends a message using the existing history of this chat as context and the provided image prompt. |
| suspend GenerateContentResponse | sendMessage(prompt: Content) Sends a message using the provided Content prompt; automatically providing the existing history as context. |
| suspend GenerateContentResponse | sendMessage(prompt: String) Sends a message using the provided text prompt; automatically providing the existing history as context. |
| Flow&lt;GenerateContentResponse&gt; | sendMessageStream(prompt: Bitmap) Sends a message using the existing history of this chat as context and the provided image prompt. |
| Flow&lt;GenerateContentResponse&gt; | sendMessageStream(prompt: Content) Sends a message using the existing history of this chat as context and the provided Content prompt. |
| Flow&lt;GenerateContentResponse&gt; | sendMessageStream(prompt: String) Sends a message using the existing history of this chat as context and the provided text prompt. |

| Public properties | |
|---|---|
| MutableList&lt;Content&gt; | history The previous content from the chat that has been successfully sent and received from the model. |
Public constructors
Chat
Chat(model: GenerativeModel, history: MutableList<Content> = ArrayList())
| Parameters | |
|---|---|
| model: GenerativeModel | The model to use for the interaction. |
| history: MutableList&lt;Content&gt; | The previous content to seed the chat with; provided to the model as context for each message sent. Defaults to an empty list. |
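A minimal sketch of creating a Chat with the documented constructor. The Firebase.vertexAI entry point and the model name below are illustrative assumptions, not part of this class's API.

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.vertexai.Chat
import com.google.firebase.vertexai.vertexAI

// Assumed model name for illustration; any supported model works.
val model = Firebase.vertexAI.generativeModel("gemini-1.5-flash")

// Starts a chat with an empty history.
val chat = Chat(model)

// Alternatively, seed the chat with previous turns:
// val chat = Chat(model, history = mutableListOf(/* prior Content */))
```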
Public functions
sendMessage
suspend fun sendMessage(prompt: Bitmap): GenerateContentResponse
Sends a message using the existing history of this chat as context and the provided image prompt.
If successful, the message and response will be added to the history. If unsuccessful, history will remain unchanged.
| Parameters | |
|---|---|
| prompt: Bitmap | The input that, together with the history, will be given to the model as the prompt. |

| Throws | |
|---|---|
| com.google.firebase.vertexai.type.InvalidStateException | if the prompt is not coming from the 'user' role. |
| com.google.firebase.vertexai.type.InvalidStateException | if the Chat instance has an active request. |
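A minimal sketch of sending an image prompt from a coroutine, assuming a Chat instance created as above and the text accessor on GenerateContentResponse.

```kotlin
import android.graphics.Bitmap
import com.google.firebase.vertexai.Chat

suspend fun describeImage(chat: Chat, image: Bitmap) {
    // The image is sent as the next user turn, with the chat history as context.
    val response = chat.sendMessage(image)
    println(response.text)
}
```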
sendMessage
suspend fun sendMessage(prompt: Content): GenerateContentResponse
Sends a message using the provided prompt; automatically providing the existing history as context.
If successful, the message and response will be added to the history. If unsuccessful, history will remain unchanged.
| Parameters | |
|---|---|
| prompt: Content | The input that, together with the history, will be given to the model as the prompt. |

| Throws | |
|---|---|
| com.google.firebase.vertexai.type.InvalidStateException | if the prompt is not coming from the 'user' role. |
| com.google.firebase.vertexai.type.InvalidStateException | if the Chat instance has an active request. |
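A minimal sketch of sending a structured prompt, assuming the content builder from com.google.firebase.vertexai.type. Note that the prompt must use the "user" role.

```kotlin
import com.google.firebase.vertexai.Chat
import com.google.firebase.vertexai.type.content

suspend fun askWithContent(chat: Chat) {
    // Build the prompt explicitly as a "user" turn.
    val prompt = content(role = "user") {
        text("Summarize our conversation so far.")
    }
    val response = chat.sendMessage(prompt)
    println(response.text)
}
```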
sendMessage
suspend fun sendMessage(prompt: String): GenerateContentResponse
Sends a message using the provided text prompt; automatically providing the existing history as context.
If successful, the message and response will be added to the history. If unsuccessful, history will remain unchanged.
| Parameters | |
|---|---|
| prompt: String | The input that, together with the history, will be given to the model as the prompt. |

| Throws | |
|---|---|
| com.google.firebase.vertexai.type.InvalidStateException | if the prompt is not coming from the 'user' role. |
| com.google.firebase.vertexai.type.InvalidStateException | if the Chat instance has an active request. |
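A minimal sketch of the text overload; the history accumulated by earlier calls is sent automatically as context.

```kotlin
import com.google.firebase.vertexai.Chat

suspend fun ask(chat: Chat) {
    val response = chat.sendMessage("What did I ask you in my previous message?")
    println(response.text)
}
```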
sendMessageStream
fun sendMessageStream(prompt: Bitmap): Flow<GenerateContentResponse>
Sends a message using the existing history of this chat as context and the provided image prompt.
The response from the model is returned as a stream.
If successful, the message and response will be added to the history. If unsuccessful, history will remain unchanged.
| Parameters | |
|---|---|
| prompt: Bitmap | The input that, together with the history, will be given to the model as the prompt. |

| Throws | |
|---|---|
| com.google.firebase.vertexai.type.InvalidStateException | if the prompt is not coming from the 'user' role. |
| com.google.firebase.vertexai.type.InvalidStateException | if the Chat instance has an active request. |
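A minimal sketch of streaming a response to an image prompt. Collecting the Flow requires a coroutine; partial text is assumed to be available via text on each emitted chunk.

```kotlin
import android.graphics.Bitmap
import com.google.firebase.vertexai.Chat

suspend fun streamImageDescription(chat: Chat, image: Bitmap) {
    // Partial responses arrive as the model generates them.
    chat.sendMessageStream(image).collect { chunk ->
        print(chunk.text)
    }
}
```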
sendMessageStream
fun sendMessageStream(prompt: Content): Flow<GenerateContentResponse>
Sends a message using the existing history of this chat as context and the provided Content prompt.
The response from the model is returned as a stream.
If successful, the message and response will be added to the history. If unsuccessful, history will remain unchanged.
| Parameters | |
|---|---|
| prompt: Content | The input that, together with the history, will be given to the model as the prompt. |

| Throws | |
|---|---|
| com.google.firebase.vertexai.type.InvalidStateException | if the prompt is not coming from the 'user' role. |
| com.google.firebase.vertexai.type.InvalidStateException | if the Chat instance has an active request. |
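A minimal sketch of streaming with a structured prompt, again assuming the content builder from com.google.firebase.vertexai.type.

```kotlin
import com.google.firebase.vertexai.Chat
import com.google.firebase.vertexai.type.content

suspend fun streamWithContent(chat: Chat) {
    val prompt = content(role = "user") {
        text("Write a short poem about Kotlin coroutines.")
    }
    chat.sendMessageStream(prompt).collect { chunk ->
        print(chunk.text)
    }
}
```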
sendMessageStream
fun sendMessageStream(prompt: String): Flow<GenerateContentResponse>
Sends a message using the existing history of this chat as context and the provided text prompt.
The response from the model is returned as a stream.
If successful, the message and response will be added to the history. If unsuccessful, history will remain unchanged.
| Parameters | |
|---|---|
| prompt: String | The input that, together with the history, will be given to the model as the prompt. |

| Throws | |
|---|---|
| com.google.firebase.vertexai.type.InvalidStateException | if the prompt is not coming from the 'user' role. |
| com.google.firebase.vertexai.type.InvalidStateException | if the Chat instance has an active request. |
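A minimal sketch of streaming a text prompt. As documented above, the exchange is added to history only if the request succeeds.

```kotlin
import com.google.firebase.vertexai.Chat

suspend fun streamAnswer(chat: Chat) {
    chat.sendMessageStream("Explain Kotlin flows in one paragraph.").collect { chunk ->
        print(chunk.text)
    }
}
```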
Public properties
history
val history: MutableList<Content>
The previous content from the chat that has been successfully sent and received from the model. This will be provided to the model for each message sent (as context for the discussion).
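A minimal sketch of inspecting history after an exchange, assuming the role and parts accessors on Content.

```kotlin
import com.google.firebase.vertexai.Chat

suspend fun logHistory(chat: Chat) {
    chat.sendMessage("Hello!")
    // After a successful exchange, both the user turn and the model reply
    // appear in history, in order.
    chat.history.forEach { turn ->
        println("${turn.role}: ${turn.parts.size} part(s)")
    }
}
```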