-
It seems ChatCompletionAgent takes PromptExecutionSettings as its execution settings, which does not include Temperature, MaxTokens, PresencePenalty, etc. that are in OpenAIPromptExecutionSettings. I tried using OpenAIPromptExecutionSettings instead, but it seems the extra properties are being ignored. Is this by design? |
Replies: 4 comments 3 replies
-
I would not expect these to be ignored. I'll investigate and reply with my findings. |
-
For |
-
Yes, I believe so. Will do some verification on this end to make sure. |
-
Confirmed that OpenAIPromptExecutionSettings are respected by ChatCompletionAgent (iterated on MaxTokens, TopP, and Temperature). You can confirm this by setting Temperature to a value greater than 2 or MaxTokens very low (i.e. 10). Either of these should result in an error (which demonstrates they are included in the model request). |
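For reference, here is a minimal sketch of the pattern being discussed, assuming the .NET Semantic Kernel Agents packages (the model ID, instructions, and setting values are placeholders). Because OpenAIPromptExecutionSettings derives from PromptExecutionSettings, it can be supplied through the agent's KernelArguments and the OpenAI-specific properties are carried through to the request:

```csharp
// Sketch only: assumes Microsoft.SemanticKernel, Microsoft.SemanticKernel.Agents.Core,
// and the OpenAI connector package are installed.
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Agents;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(modelId: "gpt-4o", apiKey: "<your-api-key>")
    .Build();

// OpenAIPromptExecutionSettings derives from PromptExecutionSettings,
// so it is accepted anywhere the base type is expected.
var settings = new OpenAIPromptExecutionSettings
{
    Temperature = 0.7,
    TopP = 0.9,
    MaxTokens = 256,
};

var agent = new ChatCompletionAgent
{
    Name = "Assistant",
    Instructions = "You are a helpful assistant.",
    Kernel = kernel,
    Arguments = new KernelArguments(settings),
};

// Sanity check described in the reply above: setting Temperature above 2
// (or MaxTokens very low, e.g. 10) should produce a request error,
// which demonstrates the settings are reaching the model request.
```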