Allow getting multiple replies in a single call of an LLM #260
Conversation
@@ -6,6 +6,8 @@ import ai.koog.prompt.llm.LLModel
import ai.koog.prompt.message.Message
import kotlinx.coroutines.flow.Flow

public typealias LLMReply = List<Message.Response>
Or maybe LLMChoice? WDYT?
Both are reasonable. LLMChoice would suggest that one should choose from the responses, while afaiu one of the use cases is to use all responses; that's why I thought it's better to use LLMReply and not focus only on the choice use case.
Aren't they called choices in OpenAI?
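To make the naming discussion concrete, here is a minimal sketch of the alias under review. The `Message.Response` stand-in is hypothetical (simplified from `ai.koog.prompt.message.Message`, not the real koog type); the alias line itself mirrors the diff above. OpenAI's API does return such alternatives under a `choices` field, which is the motivation for the competing name.

```kotlin
// Hypothetical stand-in for ai.koog.prompt.message.Message; not the real koog type.
sealed interface Message {
    data class Response(val content: String) : Message
}

// The alias under discussion: one "reply" bundles all alternative responses
// the model returned for a single request (what OpenAI calls "choices").
typealias LLMReply = List<Message.Response>

fun main() {
    val reply: LLMReply = listOf(
        Message.Response("Option A"),
        Message.Response("Option B"),
    )
    println(reply.size)
}
```

Under the `LLMReply` reading, callers may consume the whole list; under `LLMChoice`, the name nudges them toward picking one element.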
# Conflicts: # prompt/prompt-executor/prompt-executor-clients/src/commonMain/kotlin/ai/koog/prompt/executor/clients/LLMClient.kt
…le-replies' into belyshev/llm-request-with-multiple-replies
@antoniibelyshev please fix my comments above, and after that -- please feel free to merge this change
import ai.koog.prompt.llm.LLModel
import ai.koog.prompt.message.Message

public class PromptExecutorChoice(
Please add KDoc here and to all other public API
Let’s actually also rename it to PromptExecutorWithChoiceSelection, wdyt?
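A rough sketch of what the suggested rename plus KDoc could look like. Everything here is illustrative: the `ChoiceSelectionStrategy` shape and the wrapped-executor signature are simplified stand-ins, not the actual koog API.

```kotlin
// Sketch only: simplified stand-ins, not the actual koog signatures.

/** Decides which of several LLM responses to keep. */
fun interface ChoiceSelectionStrategy {
    fun choose(choices: List<String>): String
}

/**
 * Wraps an executor that can return several responses per request and
 * narrows the result to a single one via [strategy], following the
 * suggested name PromptExecutorWithChoiceSelection.
 *
 * @property executeWithChoices stand-in for the wrapped multi-reply executor
 * @property strategy how to pick one response out of the returned choices
 */
class PromptExecutorWithChoiceSelection(
    private val executeWithChoices: (prompt: String) -> List<String>,
    private val strategy: ChoiceSelectionStrategy,
) {
    fun execute(prompt: String): String = strategy.choose(executeWithChoices(prompt))
}
```

The longer name trades brevity for making the "narrow many choices to one" behavior visible at the call site.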
import ai.koog.prompt.dsl.Prompt
import ai.koog.prompt.executor.model.LLMChoice

public class DummyChoiceStrategy : ChoiceStrategy {
Please add KDoc. And also let's call it not Dummy but something like ChoiceStrategy.FirstChoice.
Also let's make it an object and move it inside ChoiceStrategy.
Another option is to call it just "Default" so that users would access it as ChoiceStrategy.Default.
Another note -- maybe rename ChoiceStrategy to ChoiceSelectionStrategy? Wdyt?
Or LLMChoiceSelectionStrategy even? Or is it too long?
The names for the override classes will be even longer :) I think ChoiceSelectionStrategy will be enough to understand the idea behind the interface
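Combining the suggestions in this thread (rename to `ChoiceSelectionStrategy`, replace `Dummy` with a nested `Default` object), the result might look like the following. This is a sketch with a simplified `choose` signature, not the real koog interface.

```kotlin
// Sketch of the combined naming proposals: interface renamed to
// ChoiceSelectionStrategy, trivial strategy nested as Default.
interface ChoiceSelectionStrategy {
    fun choose(choices: List<String>): String

    /** Default selection: keep the first returned choice. */
    object Default : ChoiceSelectionStrategy {
        override fun choose(choices: List<String>): String = choices.first()
    }
}
```

Nesting the object gives callers the `ChoiceSelectionStrategy.Default` access path suggested above without adding a top-level name.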
* @property print A function responsible for displaying messages to the user, e.g., for showing prompts or feedback.
* @property read A function to capture user input.
*/
public class AskUserChoiceStrategy(
Is it needed in public API?
Maybe let's move it to tests or examples?
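For reference, the interactive strategy under discussion could be sketched as below. The `print`/`read` defaults follow the `@property` KDoc in the diff; the `choose` method and index-parsing behavior are illustrative assumptions, not the PR's actual implementation.

```kotlin
// Sketch of the interactive strategy under review; print/read follow the
// @property KDoc in the diff, everything else is illustrative.
class AskUserChoiceStrategy(
    private val print: (String) -> Unit = ::println,
    private val read: () -> String? = ::readLine,
) {
    /** Shows the numbered choices and returns the one the user picks. */
    fun choose(choices: List<String>): String {
        choices.forEachIndexed { i, choice -> print("[$i] $choice") }
        // Fall back to the first choice on invalid or missing input.
        val index = read()?.toIntOrNull()?.takeIf { it in choices.indices } ?: 0
        return choices[index]
    }
}
```

Because the I/O is injected, the class is testable without a console, which also makes it easy to keep in examples rather than in the public API.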
edge(nodeSendToolResult forwardTo nodeExecuteTool onToolCall { true })
}

val askChoiceStrategy = AskUserChoiceStrategy(promptShowToUser = { prompt ->
Let's move the whole AskUserChoiceStrategy to this chess example
val nodeCallLLM by nodeLLMRequest("sendInput")
val nodeExecuteTool by nodeExecuteTool("nodeExecuteTool")
val nodeSendToolResult by nodeLLMSendResultsMultipleChoices("nodeSendToolResult")
val nodeChooseChoice by nodeChoose(askChoiceStrategy, "chooseLLMChoice")
Let's rename "nodeChoose" to something more verbose like "nodeChooseLLMResponse"
I mean in the public API also
Added an executeMultipleReplies method to the prompt executor and clients to support LLM requests with multiple replies.
Implemented multi-reply handling in the OpenAI client.
Introduced a reply choice strategy and a dedicated prompt executor.
Created dedicated prompt executor and nodes to receive and handle multiple replies in an agent.
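The summary items above can be illustrated with a small usage sketch. The method name `executeMultipleReplies` comes from this PR's description, but the executor class, prompt type, and signature here are simplified stand-ins, not the real koog API.

```kotlin
// Illustrative only: executeMultipleReplies is the method name from this PR,
// but the executor and its signature are simplified stand-ins.
class StubPromptExecutor {
    /** Returns [numberOfChoices] alternative completions for one request. */
    fun executeMultipleReplies(prompt: String, numberOfChoices: Int): List<String> =
        List(numberOfChoices) { i -> "reply ${i + 1} to: $prompt" }
}

fun main() {
    val replies = StubPromptExecutor().executeMultipleReplies("Hello", numberOfChoices = 3)
    replies.forEach(::println)
}
```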
Type of the change
Checklist for all pull requests
- develop as the base branch
Additional steps for pull requests adding a new feature