Closed
Description
Currently, `LLMClient.executeStreaming` returns only a flow of content chunks (`Flow<String>`). This is sufficient for simple cases, but not enough for production applications that require displaying more information to the user.
Things that I find missing:
- Finish reason (in the same way as `LLMClient.execute` has one): we don't know the reason why streaming suddenly stopped (some error, or the expected ending).
- All the errors that happen during the stream are skipped. Take `AnthropicLLMClient`, for example:

  ```kotlin
  event
      .takeIf { it.event == "content_block_delta" } // "error" event is skipped
      ?.data?.trim()?.let { json.decodeFromString<AnthropicStreamResponse>(it) }
      ?.delta?.text?.let { emit(it) }
  ```

- I think it was already mentioned here: we don't save the message ID received from the provider.
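As a sketch of how the Anthropic handler could surface the `"error"` event instead of dropping it (the event and result types below are simplified, hypothetical stand-ins, not Koog's actual classes):

```kotlin
// Hypothetical, simplified stand-ins for the SSE event and stream element types.
data class SseEvent(val event: String?, val data: String?)

sealed interface StreamEvent {
    data class Delta(val text: String) : StreamEvent
    data class StreamError(val message: String) : StreamEvent
}

// Map each SSE event to an explicit stream element instead of silently
// dropping everything that is not "content_block_delta".
fun toStreamEvent(event: SseEvent): StreamEvent? = when (event.event) {
    "content_block_delta" -> event.data?.trim()?.let { StreamEvent.Delta(it) }
    "error" -> StreamEvent.StreamError(event.data ?: "unknown error")
    else -> null // ping, message_start, message_stop, ...
}
```

With this shape, a decoding failure or provider-side error becomes a value the caller can observe, rather than a chunk that never arrives.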
Currently, in my project, I use this sealed interface to represent the stream result from any provider:
```kotlin
sealed interface StreamResult {
    data class Chunk(
        val id: String,
        val content: String,
    ) : StreamResult

    data class Finished(
        val id: String,
        val reason: FinishReason?,
    ) : StreamResult

    data class Error(val error: String?) : StreamResult
}
```
It would be nice to see something similar in Koog.
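For illustration, here is how a consumer might collect such a stream. A `Sequence` stands in for the real `Flow`, and `FinishReason` is a stand-in enum, to keep the sketch self-contained:

```kotlin
// Stand-in for the FinishReason type from the non-streaming API.
enum class FinishReason { STOP, LENGTH, ERROR }

sealed interface StreamResult {
    data class Chunk(val id: String, val content: String) : StreamResult
    data class Finished(val id: String, val reason: FinishReason?) : StreamResult
    data class Error(val error: String?) : StreamResult
}

fun main() {
    // A Sequence stands in for Flow<StreamResult> so the sketch runs without
    // kotlinx.coroutines.
    val stream = sequenceOf(
        StreamResult.Chunk("msg_123", "Hello, "),
        StreamResult.Chunk("msg_123", "world!"),
        StreamResult.Finished("msg_123", FinishReason.STOP),
    )

    val text = StringBuilder()
    for (result in stream) {
        when (result) {
            is StreamResult.Chunk -> text.append(result.content)
            is StreamResult.Finished ->
                println("Finished (${result.reason}), id=${result.id}")
            is StreamResult.Error -> println("Stream failed: ${result.error}")
        }
    }
    println(text) // the accumulated assistant message
}
```

The exhaustive `when` is the point: the compiler forces every consumer to decide what to do with errors and finish reasons, instead of both being dropped inside the client.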