Class AzureOpenAiClient
- All Implemented Interfaces:
AiProviderClient
AiProviderClient implementation for Azure OpenAI Service deployments.
Azure OpenAI exposes a chat completions API that is structurally similar to the public OpenAI API but differs in three important ways:
- Endpoint structure — the deployment name is embedded in the path rather than supplied as a JSON field: {baseUrl}/openai/deployments/{deployment}/chat/completions?api-version={version}
- Authentication header — requests carry an api-key header instead of the standard Authorization: Bearer form used by the public OpenAI API
- Model identifier — AiOptions.modelName() is interpreted as the Azure deployment name, not the underlying model family name; the deployment name is chosen when the resource is configured in the Azure portal
These differences are fully encapsulated within this class. The request and
response JSON structures are identical to those used by
OpenAiCompatibleClient, allowing the same prompt builder and response
normalization logic to be reused.
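The first two differences can be sketched with Java's built-in HTTP client. This is an illustrative sketch, not the actual implementation; the deployment name and key below are placeholders:

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Illustrative sketch only: shows the deployment-scoped URL and the
// api-key header that distinguish Azure OpenAI from the public OpenAI API.
public class AzureEndpointSketch {

    // Builds {baseUrl}/openai/deployments/{deployment}/chat/completions?api-version={version}
    static String endpoint(String baseUrl, String deployment, String apiVersion) {
        return baseUrl + "/openai/deployments/" + deployment
                + "/chat/completions?api-version=" + apiVersion;
    }

    public static void main(String[] args) {
        String url = endpoint("https://contoso.openai.azure.com", "my-deployment", "2024-02-01");

        // Azure authenticates with an api-key header, not Authorization: Bearer.
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("api-key", "<resource-scoped key>")
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{}"))
                .build();

        System.out.println(request.uri());
    }
}
```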
Data Residency
Requests are sent to a resource endpoint within the organization's own Azure tenant. Data does not leave the tenant boundary, making this provider suitable for regulated environments where source code must not be transmitted to third-party cloud services.
Operational Responsibilities
- constructing the Azure-specific deployment endpoint URL
- injecting the api-key authentication header
- constructing and submitting chat completion requests
- extracting JSON content from the model response
- normalizing the result into AiClassSuggestion
Instances are typically created through
AiProviderFactory.create(AiOptions).
Constructor Summary
Constructors
AzureOpenAiClient(AiOptions options)
    Creates a new Azure OpenAI client with no rate-limit notification.
AzureOpenAiClient(AiOptions options, RateLimitListener rateLimitListener)
    Creates a new Azure OpenAI client that notifies rateLimitListener before each rate-limit sleep.
Method Summary
boolean isAvailable()
    Determines whether this client can be used in the current runtime environment.
AiClassSuggestion suggestForClass(String fqcn, String classSource, String taxonomyText, List<PromptBuilder.TargetMethod> targetMethods)
    Submits a classification request to the configured Azure OpenAI deployment.
Constructor Details
AzureOpenAiClient
Creates a new Azure OpenAI client with no rate-limit notification. Rate-limit pauses are handled transparently; use AzureOpenAiClient(AiOptions, RateLimitListener) when callers need to be notified of such pauses.
The supplied configuration must provide:
- AiOptions.baseUrl() - resource endpoint, e.g. https://contoso.openai.azure.com
- AiOptions.modelName() - deployment name as configured in the Azure portal
- AiOptions.apiVersion() - REST API version, e.g. 2024-02-01
- AiOptions.resolvedApiKey() - resource-scoped API key
- Parameters:
options - AI runtime configuration
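Since all four values are mandatory, a caller might verify them before constructing a client. The helper below is a hypothetical pre-flight check, not part of the real AiOptions API:

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch: reports which of the four required configuration
// values are missing or blank before a client is constructed.
public class RequiredOptionsCheck {
    static List<String> missing(Map<String, String> options) {
        return List.of("baseUrl", "modelName", "apiVersion", "resolvedApiKey").stream()
                .filter(key -> options.getOrDefault(key, "").isBlank())
                .toList();
    }

    public static void main(String[] args) {
        Map<String, String> options = Map.of(
                "baseUrl", "https://contoso.openai.azure.com",
                "modelName", "my-deployment",
                "apiVersion", "2024-02-01");
        // resolvedApiKey was never supplied, so it is reported as missing.
        System.out.println(missing(options)); // prints [resolvedApiKey]
    }
}
```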
AzureOpenAiClient
Creates a new Azure OpenAI client that notifies rateLimitListener before each rate-limit sleep.
- Parameters:
options - AI runtime configuration
rateLimitListener - callback invoked before each HTTP 429 pause; must not be null
Method Details
isAvailable
public boolean isAvailable()
Determines whether this client can be used in the current runtime environment. Availability requires a non-blank API key resolved through AiOptions.resolvedApiKey().
- Specified by:
isAvailable in interface AiProviderClient
- Returns:
true if a usable API key is available
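A minimal sketch of that availability rule (the helper is illustrative, not the actual implementation):

```java
// Sketch of the documented availability rule: a usable client requires a
// non-blank resolved API key.
public class AvailabilitySketch {
    static boolean isAvailable(String resolvedApiKey) {
        return resolvedApiKey != null && !resolvedApiKey.isBlank();
    }

    public static void main(String[] args) {
        System.out.println(isAvailable("sk-example")); // true
        System.out.println(isAvailable("   "));        // false: blank key
        System.out.println(isAvailable(null));         // false: no key resolved
    }
}
```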
suggestForClass
public AiClassSuggestion suggestForClass(String fqcn, String classSource, String taxonomyText, List<PromptBuilder.TargetMethod> targetMethods) throws AiSuggestionException
Submits a classification request to the configured Azure OpenAI deployment.
The request is sent to the deployment-specific endpoint:
{baseUrl}/openai/deployments/{modelName}/chat/completions?api-version={apiVersion}
The request payload includes:
- the deployment name as the model field
- a system prompt defining classification rules
- a user prompt containing the test class source and taxonomy
- a deterministic temperature setting of 0.0
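A sketch of that payload shape (the field layout is assumed from the description above; the real client may serialize differently and would JSON-escape the prompt text rather than interpolate it directly):

```java
// Illustrative sketch of the chat completions payload: the deployment name
// goes in the "model" field and temperature is pinned to 0.0 for determinism.
public class PayloadSketch {
    static String payload(String deployment, String systemPrompt, String userPrompt) {
        // Plain interpolation for illustration only; real code must JSON-escape.
        return """
                {
                  "model": "%s",
                  "temperature": 0.0,
                  "messages": [
                    {"role": "system", "content": "%s"},
                    {"role": "user", "content": "%s"}
                  ]
                }""".formatted(deployment, systemPrompt, userPrompt);
    }

    public static void main(String[] args) {
        System.out.println(payload("my-deployment",
                "You are a test classifier.", "Classify the following test class."));
    }
}
```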
Authentication uses the api-key HTTP header carrying the value returned by AiOptions.resolvedApiKey().
- Specified by:
suggestForClass in interface AiProviderClient
- Parameters:
fqcn - fully qualified class name being analyzed
classSource - complete source code of the class
taxonomyText - taxonomy definition guiding classification
targetMethods - deterministically extracted JUnit test methods that must be classified
- Returns:
normalized classification result
- Throws:
AiSuggestionException - if the provider request fails, the model response is invalid, or JSON deserialization fails