LLMParams and Responses API #645
Conversation
Qodana for JVM: 855 new problems were found
@@ Code coverage @@
+ 65% total lines covered
10615 lines analyzed, 6973 lines covered
# Calculated according to the filters of your coverage tool
Force-pushed from 3fcce96 to 7773563
Pull Request Overview
This PR introduces LLMParams and Responses API functionality to extend model parameter customization capabilities. The changes enable provider-specific parameter handling, add support for OpenAI's Responses API, and introduce new OpenRouter model definitions.
- Refactors LLMParams from data class to open class with custom equals/hashCode/toString methods
- Adds comprehensive OpenAI Responses API implementation with streaming support
- Introduces provider-specific parameter classes (OpenAIParams, OpenRouterParams) extending base LLMParams
- Adds new GPT-5 model variants and gpt-oss-120b model to OpenRouter definitions
- Refactors OpenAI client architecture to support multiple API endpoints via generics
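The refactor described above can be sketched roughly as follows. This is an illustrative, simplified model of the hierarchy, not the actual Koog API: field names, validation, and the exact equality contract are assumptions. The point is that once `LLMParams` is an open class rather than a data class, `equals`/`hashCode` must be written by hand and extended in each provider subclass.

```kotlin
// Simplified sketch (field names illustrative): an open base class for shared
// sampling parameters, with a provider-specific subclass adding its own options.
open class LLMParams(
    val temperature: Double? = null,
    val maxTokens: Int? = null,
) {
    // Subclasses of a data class are not allowed in Kotlin, so equality
    // is implemented manually, including a runtime-class check.
    override fun equals(other: Any?): Boolean =
        other is LLMParams &&
            other::class == this::class &&
            other.temperature == temperature &&
            other.maxTokens == maxTokens

    override fun hashCode(): Int =
        31 * (temperature?.hashCode() ?: 0) + (maxTokens ?: 0)
}

class OpenRouterParams(
    temperature: Double? = null,
    maxTokens: Int? = null,
    val topK: Int? = null, // hypothetical provider-specific option
) : LLMParams(temperature, maxTokens) {
    init {
        require(topK == null || topK >= 0) { "topK must be non-negative" }
    }

    override fun equals(other: Any?): Boolean =
        super.equals(other) && other is OpenRouterParams && other.topK == topK

    override fun hashCode(): Int = 31 * super.hashCode() + (topK ?: 0)
}
```

The runtime-class check in the base `equals` keeps a base-class instance from comparing equal to a subclass instance that happens to share the common fields.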
Reviewed Changes
Copilot reviewed 19 out of 20 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| LLMParams.kt | Converts data class to open class with manual implementation of copy/equals/hashCode |
| LLMCapability.kt | Adds OpenAIEndpoint capability for Completions and Responses APIs |
| OpenRouterChatCompletion.kt | New OpenRouter-specific request/response models with provider preferences |
| OpenRouterParams.kt | OpenRouter parameter class with validation and provider-specific options |
| OpenRouterModels.kt | Adds GPT-5, GPT-5 Mini, GPT-5 Nano, and GPT-OSS-120b model definitions |
| OpenRouterLLMClient.kt | Refactors to use generic AbstractOpenAILLMClient with provider-specific serialization |
| OpenAIDataModels.kt | Refactors to extract base interfaces and move chat completion models to a separate file |
| AbstractOpenAILLMClient.kt | Major refactor to a generic class supporting multiple response types via abstract methods |
| OpenAIResponsesAPI.kt | Comprehensive OpenAI Responses API implementation with 2300+ lines of models |
| OpenAIChatCompletion.kt | Extracted chat completion models from the main data models file |
| OpenAIParams.kt | New OpenAI-specific parameter classes for the Chat and Responses APIs |
Thank you! Everything looks great! Just a couple of small suggestions:
- Please compare the OpenRouter params with the spec (or at least make sure all the ones we pass in the request are included in the params class)
- I think the manual toString implementation comes from the fact that params isn’t a data class. But maybe there’s a way around this? For example, making them serializable and outputting JSON. Otherwise, we risk forgetting to update this method whenever the params list changes.
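The reviewer's second suggestion could look roughly like the sketch below. Field names are illustrative and this assumes the `kotlinx-serialization-json` dependency; it is a sketch of the idea, not the actual Koog code. The benefit is that `toString` is derived from the serialized form, so it cannot drift out of sync with the field list.

```kotlin
import kotlinx.serialization.Serializable
import kotlinx.serialization.encodeToString
import kotlinx.serialization.json.Json

// Sketch of the reviewer's idea: derive toString from JSON serialization so
// adding or removing a parameter never requires touching toString by hand.
@Serializable
open class LLMParams(
    val temperature: Double? = null,
    val maxTokens: Int? = null,
) {
    override fun toString(): String = Json.encodeToString(this)
}
```

One caveat with this approach: for subclass instances the statically resolved `LLMParams` serializer would only emit the base fields, so provider subclasses would need their own `@Serializable` declarations or polymorphic serialization to include their extra parameters.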
@tiginamaria thank you for your review.
Since this is a subclass, we cannot make it a data class.
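The constraint mentioned here is a Kotlin language rule: data classes are final, so a data class cannot serve as an open base for provider subclasses. A minimal sketch of the resulting hand-written `copy` (field names illustrative, not the actual Koog API):

```kotlin
// Because a data class cannot be open, copy() is written by hand and
// overridden in each subclass to preserve the runtime type.
open class LLMParams(val temperature: Double? = null) {
    open fun copy(temperature: Double? = this.temperature): LLMParams =
        LLMParams(temperature)
}

class OpenAIParams(
    temperature: Double? = null,
    val parallelToolCalls: Boolean? = null, // hypothetical provider option
) : LLMParams(temperature) {
    // Overrides may not restate default parameter values; callers still get
    // the base-class defaults through dynamic dispatch.
    override fun copy(temperature: Double?): OpenAIParams =
        OpenAIParams(temperature, parallelToolCalls)
}
```

This is also why the review thread flags the maintenance risk: unlike a generated data-class `copy`, every new parameter must be threaded through these overrides manually.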
… `DeepSeekParams` inheriting additional specialized parameters.
…nAIResponsesParams`. Add support for `Truncation` enum.
…troduce `type` field for item differentiation.
…field types, and improve null safety for attributes.
…ser` and integration tests.
…o `LLMModelParser` and `OpenRouterModels`.
…dentifierParsingTest`.
…penRouter, OpenAI, and DeepSeek clients. Simplify class hierarchy by removing unnecessary `open` modifiers.
Force-pushed from 0cdf530 to 33c66be
Thank you! I have just one clarification question, caused by missing docs; apart from that, everything looks fine!
- Fix message generation in OpenAILLMClient history compression: use Output messages for Assistant messages.
Force-pushed from 8983351 to e719707
…nto devcrocod/providers-llmparams
It is still possible to use the OpenAI Responses API by explicitly providing OpenAIResponsesParams.
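A minimal sketch of what such params-driven endpoint selection could look like. All class names and the routing function here are illustrative assumptions based on this PR's description, not the actual client code; only the two OpenAI endpoint paths are real.

```kotlin
// Sketch: passing Responses-specific params routes the request to the
// Responses API; otherwise the default Chat Completions endpoint is used.
open class LLMParams
class OpenAIChatParams : LLMParams()
class OpenAIResponsesParams : LLMParams()

fun endpointFor(params: LLMParams): String = when (params) {
    is OpenAIResponsesParams -> "/v1/responses"
    else -> "/v1/chat/completions"
}
```

Selecting the endpoint from the static type of the params object keeps the default behavior unchanged for existing callers while making the Responses API an explicit opt-in.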
KG-220 Ability to pass custom LLM params
Breaking Changes
Type of the changes
Checklist
- `develop` as the base branch
Additional steps for pull requests adding a new feature