
How do I set Temperature to ChatCompletionAgent #12176

Answered by crickman
piffy76 asked this question in Q&A

Confirmed that OpenAIPromptExecutionSettings are respected by ChatCompletionAgent (iterated on MaxTokens, TopP, and Temperature).

You can confirm this yourself by setting Temperature to a value greater than 2, or MaxTokens to a very low value (e.g. 10). Either should result in an error, which demonstrates that the settings are included in the model request.
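A minimal sketch of the pattern being described, assuming the Microsoft.SemanticKernel and Microsoft.SemanticKernel.Agents NuGet packages; the model ID, API key, and agent name here are placeholders:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Agents;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion(modelId: "gpt-4o", apiKey: "<your-api-key>");
Kernel kernel = builder.Build();

// Execution settings are supplied through KernelArguments;
// the agent forwards them on each model request.
ChatCompletionAgent agent = new()
{
    Name = "Assistant",
    Instructions = "You are a helpful assistant.",
    Kernel = kernel,
    Arguments = new KernelArguments(
        new OpenAIPromptExecutionSettings
        {
            Temperature = 0.7,
            TopP = 0.9,
            MaxTokens = 256,
            // Sanity check from the answer above: setting Temperature = 3
            // (out of range) or MaxTokens = 10 should trigger an error or
            // truncation, proving the settings reach the model request.
        }),
};
```

Invoking the agent (e.g. via its InvokeAsync method) should then reflect these settings in the underlying chat completion call.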

Category
Q&A
Labels
.NET (Issue or Pull requests regarding .NET code), agents
2 participants