
Conversation

@Ololoshechkin (Collaborator)

No description provided.

@devcrocod (Contributor)

Just to confirm: LLModel was moved from Prompt to LocalAgentConfig in this PR, but the model parameters remain in Prompt?

@Ololoshechkin (Collaborator, Author)

Yes, the prompt still contains the LLMParams for now. Params can be shared across most models (the exception, a model that doesn't support one of them, is a pretty rare case), and in practice they are often not touched at all.
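
For context, a minimal sketch of the split described above; every declaration here is an illustrative stand-in, not the project's actual code. The model moves onto the agent config while the tuning params stay on the prompt:

data class Message(val role: String, val content: String) // placeholder stand-in
data class LLModel(val id: String)                        // placeholder stand-in

data class LLMParams(
    val temperature: Double? = null, // null means "use the provider default"
    val maxTokens: Int? = null,
)

data class Prompt(
    val messages: List<Message> = emptyList(),
    val params: LLMParams = LLMParams(), // params remain on the prompt
)

data class LocalAgentConfig(
    val model: LLModel, // the model now lives on the agent config
    val prompt: Prompt = Prompt(),
)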

@EugeneTheDev (Contributor) left a comment

The change looks good; I agree that it's better to remove the model from the prompt. I left a few comments regarding the API part.


fun withMessages(newMessages: List<Message>): Prompt = copy(messages = newMessages)

fun withUpdatedMessages(update: MutableList<Message>.() -> Unit): Prompt =
Contributor

Does it have to be mutable? It seems to work with just a regular list too.

Collaborator (Author)

dropLast()?

Collaborator (Author)

prompt.withUpdatedMessages { dropLast(2) }
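
For clarity, a sketch of the two candidate signatures under discussion (names and bodies are assumptions, not the merged code). dropLast(2) returns a new list rather than mutating, so it reads naturally with a transforming List receiver, while a MutableList receiver pairs with in-place calls such as removeLast():

// Transforming variant: the lambda returns the new message list.
fun Prompt.withUpdatedMessages(update: List<Message>.() -> List<Message>): Prompt =
    copy(messages = messages.update())

// Mutating variant: the lambda edits a temporary mutable copy in place.
fun Prompt.withUpdatedMessagesMutable(update: MutableList<Message>.() -> Unit): Prompt =
    copy(messages = messages.toMutableList().apply(update))

// Usage:
// prompt.withUpdatedMessages { dropLast(2) }          // transforming style
// prompt.withUpdatedMessagesMutable { removeLast() }  // mutating style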


fun withParams(newParams: LLMParams): Prompt = copy(params = newParams)

class LLMParamsUpdateContext internal constructor(
Contributor

I see that you are creating a new class with the same fields as LLMParams, just to make them mutable. I don't have a strong opinion against it, I just wanted to confirm: do you think it's necessary? We will now have to maintain two similar sets of fields and always remember to add new fields in two places. Maybe it would be fine to just ask the user to provide a new LLMParams object instead of creating this convenience abstraction? If you think it's still worth it, feel free to disregard this comment.

Collaborator (Author)

I wanted updates to be as easy to write as prompt.withUpdatedMessages { dropLast(2) }:

prompt.withUpdatedParams { temperature = 100500 }
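
A minimal sketch of that convenience, reusing the stand-in types from above; only the "internal constructor" shape appears in the diff, the rest is assumed. The mutable mirror class copies the current params, lets the lambda mutate them, and rebuilds an immutable LLMParams:

class LLMParamsUpdateContext internal constructor(params: LLMParams) {
    var temperature: Double? = params.temperature
    var maxTokens: Int? = params.maxTokens

    // Rebuild an immutable snapshot after the lambda has run.
    internal fun build(): LLMParams = LLMParams(
        temperature = temperature,
        maxTokens = maxTokens,
    )
}

fun Prompt.withUpdatedParams(update: LLMParamsUpdateContext.() -> Unit): Prompt =
    copy(params = LLMParamsUpdateContext(params).apply(update).build())

// Enables the call site shown above:
// prompt.withUpdatedParams { temperature = 0.7 }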

}
if (model != null) changeModel(model)

updatePrompt {
system(systemMessage)
@EugeneTheDev (Contributor), May 12, 2025

I'm a bit concerned about system here. Only a single system message is allowed in the prompt (I already shot myself in the foot with this one). This looks like it can lead to the same problem, since it's not rewriting the prompt but updating it. Or am I missing the point?

Collaborator (Author)

I think OpenAI supports it but Anthropic doesn't. Let's address it separately.
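
One possible guard against the duplicate-system-message problem raised above (a sketch using the stand-in Message type, not the PR's code): replace any existing system message instead of appending a second one.

fun Prompt.withSystemMessage(text: String): Prompt {
    // Drop any existing system message before prepending the new one.
    val rest = messages.filterNot { it.role == "system" }
    return copy(messages = listOf(Message(role = "system", content = text)) + rest)
}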

}
if (model != null) changeModel(model)
Contributor

This function changes the model and doesn't restore it afterwards, which is not communicated by the function name. AFAIU it is only an internal utility function for subgraphs, used in just one place. To avoid confusion, maybe we can rename it, or, even better, remove it altogether and inline it at the usage site?
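
One way to address this (a sketch; AgentContext, currentModel, and changeModel are assumed names, not the project's API): scope the model change so it is restored when the block exits.

class AgentContext(var currentModel: LLModel) {
    fun changeModel(model: LLModel) { currentModel = model }
}

inline fun <T> AgentContext.withModel(model: LLModel, block: () -> T): T {
    val previous = currentModel
    changeModel(model)
    try {
        return block()
    } finally {
        changeModel(previous) // restore so the swap doesn't leak past this scope
    }
}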

@tiginamaria force-pushed the vbr/prompt-api-improvements branch from 58610aa to 8ecf792 on May 12, 2025, 12:32
@tiginamaria merged commit a7d1648 into main on May 12, 2025 (3 checks passed)
@tiginamaria deleted the vbr/prompt-api-improvements branch on May 12, 2025, 15:55