Solved

Only certain models supported when using Azure OpenAI?


While experimenting with our options for using AI tooling in the Software Factory, I struggled to get even a basic example working.

We already have some models in use on Azure for other (non-SF) applications, so for our experiment it seemed handy to use an existing resource, and we went with o3-mini.

It always gave an “Unsuccessful (invalid request)” when looking in the Process Flow monitor. So perhaps the Resource Name needs to include the whole URL including the api-version, but that does not fit the field.
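For reference, this is roughly how I understand the request URL to be composed (the resource and deployment names below are placeholders, not our actual setup), which is why the api-version would never fit in a single name field:

```python
# Sketch of how an Azure OpenAI chat completions URL is composed
# (placeholder names, not our actual resource).
resource = "my-resource"      # the Azure OpenAI resource name
deployment = "o3-mini"        # the model deployment name
api_version = "2024-02-01"    # sent as a query parameter, not part of the resource name

url = (
    f"https://{resource}.openai.azure.com"
    f"/openai/deployments/{deployment}/chat/completions"
    f"?api-version={api_version}"
)
print(url)
```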

Eventually we just tried another model, namely gpt-4o-mini.

Eureka! It seems this works straight away.

So the question: are the models that can be used with the Software Factory limited to certain types?

Best answer by Anne Buit

Please raise a ticket for this issue; it seems the temperature parameter or the max tokens setting is not supported by the o3-mini model. The internal default values for these settings cause the issue.

The problem will have to be resolved in a library used by Indicium. I’m not certain the library has been fixed yet, but the issue has been raised by others as well.


4 replies

Anne Buit
Community Manager
  • 653 replies
  • May 14, 2025

Hi Mark,

Are you referring to Azure OpenAI? The models should not be limited in any way.

The model specified in the Generative AI provider should be the Azure OpenAI model ID or deployment name.

The Resource Name field is a bit misleading, as it should indeed point to the endpoint, which can be found under Resource Management > Keys and Endpoints or Overview > Develop.
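As a rough sketch of how those fields map onto a client configuration (placeholder values, assuming the official openai Python package):

```python
from openai import AzureOpenAI

# How the Generative AI provider fields map onto an Azure OpenAI client
# (placeholder values; the endpoint comes from Keys and Endpoints in the portal).
client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # the Resource Name field
    api_key="<your-azure-openai-key>",
    api_version="2024-02-01",
)

# The model field should hold the deployment name, e.g. "gpt-4o-mini".
```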

(Screenshot: the Resource “Name” field)

Did you configure it this way?


Anne Buit
Community Manager
  • 653 replies
  • May 14, 2025

Update: I’ve just been able to reproduce your exact scenario.

Using the deployed o3-mini model gave a 400, while using the deployed gpt-4o-mini worked fine.

I’ll look into it a bit further.


Anne Buit
Community Manager
  • 653 replies
  • Answer
  • May 14, 2025

Please raise a ticket for this issue; it seems the temperature parameter or the max tokens setting is not supported by the o3-mini model. The internal default values for these settings cause the issue.

The problem will have to be resolved in a library used by Indicium. I’m not certain the library has been fixed yet, but the issue has been raised by others as well.
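For anyone hitting the same 400 in their own code in the meantime, here is a minimal sketch of the workaround (placeholder endpoint and key, assuming the official openai Python package): o-series models such as o3-mini reject an explicit temperature and expect max_completion_tokens rather than the legacy max_tokens.

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder endpoint
    api_key="<your-azure-openai-key>",
    api_version="2024-12-01-preview",
)

messages = [{"role": "user", "content": "Hello"}]

# A request shaped like this returns a 400 on o3-mini: the model rejects
# an explicit `temperature` as well as the legacy `max_tokens` parameter.
# client.chat.completions.create(
#     model="o3-mini", messages=messages, temperature=0.2, max_tokens=256,
# )

# This works: leave `temperature` at its default and use `max_completion_tokens`.
response = client.chat.completions.create(
    model="o3-mini",  # the deployment name of the o3-mini model
    messages=messages,
    max_completion_tokens=256,
)
print(response.choices[0].message.content)
```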


Thanks for the research, Anne. I have raised a ticket in TCP: 11698S

