Experimenting with our options to use AI tooling in the Software Factory, I struggled to even get a basic example working.
We already have some models in use on Azure for other (non-SF) applications, so for our experiment it seemed handy to reuse an existing resource, and we went with o3-mini.
It always gave an “Unsuccessful (invalid request)” result in the Process Flow monitor. We wondered whether the Resource Name needs to include the whole URL including the api-version, but that does not fit in the field.
Eventually we simply tried another model: gpt-4o-mini.
Eureka! That worked straight away.
So the question: are the models that can be used with the Software Factory limited to certain types?
Solved
Only certain models supported when using Azure OpenAI?
Best answer by Anne Buit
Please raise a ticket for this issue; it seems the temperature parameter or the max tokens setting is not supported by the o3-mini model, and the internal default values for these settings cause the request to fail.
The problem will have to be resolved in a library used by Indicium. I’m not certain the library has been fixed yet, but the issue has been raised by others as well.
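For context, here is a minimal sketch of the kind of request that reproduces this behaviour outside the Software Factory. It uses the standard `openai` Python package with placeholder endpoint, key, and deployment names (all assumptions, not Indicium's actual implementation): Azure's o-series reasoning models such as o3-mini reject the classic `temperature` and `max_tokens` parameters, while gpt-4o-mini accepts them.

```python
# Minimal sketch, assuming the `openai` Python package (v1+) and an Azure OpenAI
# resource with both deployments. Endpoint, key, api-version and deployment
# names below are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<your-api-key>",                                   # placeholder
    api_version="2024-12-01-preview",                           # placeholder
)

messages = [{"role": "user", "content": "Say hello."}]

# Works: gpt-4o-mini accepts the classic sampling parameters.
ok = client.chat.completions.create(
    model="gpt-4o-mini",  # deployment name
    messages=messages,
    temperature=0.2,
    max_tokens=256,
)
print(ok.choices[0].message.content)

# Fails with HTTP 400 (invalid request): o3-mini does not accept `temperature`
# or `max_tokens`; o-series models expect `max_completion_tokens` and only the
# default temperature. A client library that always sends these defaults will
# therefore produce the same "Unsuccessful (invalid request)" result.
try:
    client.chat.completions.create(
        model="o3-mini",  # deployment name
        messages=messages,
        temperature=0.2,
        max_tokens=256,
    )
except Exception as exc:
    print(f"o3-mini rejected the request: {exc}")
```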