Enum Class AiProvider
- All Implemented Interfaces:
Serializable, Comparable<AiProvider>, Constable
Identifies the AI provider used by the AiSuggestionEngine.
Each constant represents a distinct AI platform capable of performing
security classification of test sources. The provider selected through
AiOptions determines which concrete client implementation is used for
communicating with the external AI service.
Provider integrations typically differ in authentication model, request format, endpoint structure, and supported model identifiers. The AI integration layer normalizes these differences so that the rest of the application can interact with a consistent abstraction.
Provider Selection
The selected provider influences:
- the HTTP endpoint used for inference requests
- authentication behavior
- the model identifier format
- response normalization logic
When AUTO is selected, the system attempts to determine the most
suitable provider automatically based on the configured endpoint or local
runtime environment.
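The AUTO resolution described above could be sketched as follows. This is an illustrative sketch only, not the real implementation: the class and method names (AutoProviderSketch, resolveAuto) and the fallback order (local Ollama first, then OpenAI if an API key is present) are assumptions, as is the default Ollama URL.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

// Hypothetical sketch of AUTO provider resolution: prefer a locally
// running Ollama instance, fall back to OpenAI when an API key is set.
public class AutoProviderSketch {
    // Ollama's conventional default local endpoint (assumption).
    static final String OLLAMA_DEFAULT_URL = "http://localhost:11434";

    enum Provider { OLLAMA, OPENAI, NONE }

    // Probe an HTTP endpoint with a short timeout; any failure means
    // the provider is treated as unavailable.
    static boolean isReachable(String baseUrl) {
        try {
            HttpClient client = HttpClient.newBuilder()
                    .connectTimeout(Duration.ofSeconds(1))
                    .build();
            HttpRequest request = HttpRequest.newBuilder(URI.create(baseUrl))
                    .GET()
                    .build();
            client.send(request, HttpResponse.BodyHandlers.discarding());
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    static Provider resolveAuto() {
        if (isReachable(OLLAMA_DEFAULT_URL)) return Provider.OLLAMA;
        if (System.getenv("OPENAI_API_KEY") != null) return Provider.OPENAI;
        return Provider.NONE;
    }
}
```

The key design point is that detection is best-effort: a locally available provider is preferred because it needs no credentials, and the resolver degrades gracefully when nothing is configured.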
Nested Class Summary
Nested classes/interfaces inherited from class java.lang.Enum:
Enum.EnumDesc<E extends Enum<E>>
Enum Constant Summary
- ANTHROPIC: Uses the Anthropic API for AI inference, typically through models in the Claude family.
- AUTO: Automatically selects the most appropriate AI provider based on configuration and runtime availability.
- AZURE_OPENAI: Uses an Azure OpenAI Service deployment for AI inference.
- GITHUB_MODELS: Uses the GitHub Models free inference service.
- GROQ: Uses the Groq cloud inference service for AI-based security classification.
- MISTRAL: Uses the Mistral AI API for inference.
- OLLAMA: Uses a locally running Ollama instance as the AI inference backend.
- OPENAI: Uses the OpenAI API for AI inference.
- OPENROUTER: Uses the OpenRouter aggregation service to access multiple AI models through a unified API.
- XAI: Uses the xAI API to access Grok models.
Method Summary
- static AiProvider valueOf(String name): Returns the enum constant of this class with the specified name.
- static AiProvider[] values(): Returns an array containing the constants of this enum class, in the order they are declared.
Enum Constant Details

AUTO
Automatically selects the most appropriate AI provider based on configuration and runtime availability. This mode allows the application to operate with minimal configuration, preferring locally available providers when possible.
OLLAMA
Uses a locally running Ollama instance as the AI inference backend. This provider typically communicates with an HTTP endpoint hosted on the local machine and allows the use of locally installed large language models without external API calls.
OPENAI
Uses the OpenAI API for AI inference. Requests are sent to the OpenAI platform using API key authentication and provider-specific model identifiers such as gpt-4 or gpt-4o.
OPENROUTER
Uses the OpenRouter aggregation service to access multiple AI models through a unified API. OpenRouter acts as a routing layer that forwards requests to different underlying model providers while maintaining a consistent API surface.
ANTHROPIC
Uses the Anthropic API for AI inference, typically through models in the Claude family.
AZURE_OPENAI
Uses an Azure OpenAI Service deployment for AI inference. Azure OpenAI is a managed cloud service operated by Microsoft inside a customer-controlled Azure tenant. Unlike the public OpenAI API, requests never leave the organization's Azure environment, making this provider suitable for regulated industries and corporate environments with data-sovereignty requirements.
Authentication uses a static resource-scoped API key supplied via the api-key HTTP header. Entra ID token-based authentication is not currently supported.
The request endpoint is constructed from three configuration values:
- baseUrl: the Azure OpenAI resource endpoint, for example https://contoso.openai.azure.com
- modelName: the deployment name configured in the Azure portal (not the underlying model family name)
- apiVersion: the Azure OpenAI REST API version, for example 2024-02-01
The resulting endpoint takes the form:
{baseUrl}/openai/deployments/{modelName}/chat/completions?api-version={apiVersion}
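The endpoint construction above can be sketched as a simple string assembly. The helper class and method names are illustrative, not part of the documented API; only the URL shape follows the template in the text.

```java
// Sketch of the Azure OpenAI endpoint template described above:
// {baseUrl}/openai/deployments/{modelName}/chat/completions?api-version={apiVersion}
public class AzureEndpointSketch {
    static String endpoint(String baseUrl, String deploymentName, String apiVersion) {
        return String.format(
                "%s/openai/deployments/%s/chat/completions?api-version=%s",
                baseUrl, deploymentName, apiVersion);
    }
}
```

For example, with baseUrl https://contoso.openai.azure.com, a deployment named my-gpt4o (a hypothetical deployment name), and API version 2024-02-01, the helper yields https://contoso.openai.azure.com/openai/deployments/my-gpt4o/chat/completions?api-version=2024-02-01.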
GROQ
Uses the Groq cloud inference service for AI-based security classification. Groq exposes an OpenAI-compatible REST API served by custom LPU hardware, resulting in very low latency and high throughput. The endpoint is https://api.groq.com/openai. A free tier is available at console.groq.com.
Authentication uses a Bearer token supplied via the standard Authorization header, identical to the OpenAI and OpenRouter providers.
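The Bearer-token request shape shared by the OpenAI-compatible providers (OpenAI, OpenRouter, Groq, and others described here) could look like the following sketch. The class name, the /chat/completions path suffix, and the request body are illustrative assumptions, not the documented client code.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Sketch of the common OpenAI-compatible request: a POST with a
// Bearer token in the standard Authorization header.
public class BearerRequestSketch {
    static HttpRequest chatRequest(String baseUrl, String apiKey, String jsonBody) {
        return HttpRequest.newBuilder(URI.create(baseUrl + "/chat/completions"))
                .header("Authorization", "Bearer " + apiKey)   // shared auth scheme
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(jsonBody))
                .build();
    }
}
```

Because the providers share this request shape, switching between them largely reduces to changing the base URL and the API key source.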
XAI
Uses the xAI API to access Grok models. The xAI platform exposes an OpenAI-compatible REST endpoint at https://api.x.ai/v1. Authentication uses a Bearer token. API keys are available at console.x.ai.
GITHUB_MODELS
Uses the GitHub Models free inference service. GitHub Models exposes an OpenAI-compatible endpoint at https://models.inference.ai.azure.com and authenticates with a standard GitHub personal access token (GITHUB_TOKEN). The free tier is available to any GitHub account and covers a broad selection of models from multiple providers (OpenAI, Meta Llama, Mistral, and others).
This provider is well-suited for CI pipelines in open-source projects where GITHUB_TOKEN is already available as an environment variable without additional secret management.
MISTRAL
Uses the Mistral AI API for inference. Mistral exposes an OpenAI-compatible REST endpoint at https://api.mistral.ai/v1. Authentication uses a Bearer token. A free tier is available at console.mistral.ai.
Method Details

values
static AiProvider[] values()
Returns an array containing the constants of this enum class, in the order they are declared.
Returns:
an array containing the constants of this enum class, in the order they are declared
valueOf
static AiProvider valueOf(String name)
Returns the enum constant of this class with the specified name. The string must match exactly an identifier used to declare an enum constant in this class. (Extraneous whitespace characters are not permitted.)
Parameters:
name - the name of the enum constant to be returned
Returns:
the enum constant with the specified name
Throws:
IllegalArgumentException - if this enum class has no constant with the specified name
NullPointerException - if the argument is null
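The exact-match behavior of valueOf can be demonstrated with a trimmed stand-in enum; the nested enum below reuses three of the documented constant names purely for illustration and is not the real AiProvider class.

```java
// Usage sketch of the generated values()/valueOf(String) enum methods,
// shown on a small stand-in enum mirroring a few AiProvider constants.
public class ValueOfSketch {
    public enum AiProvider { AUTO, OLLAMA, OPENAI }

    public static void main(String[] args) {
        AiProvider p = AiProvider.valueOf("OLLAMA"); // exact, case-sensitive lookup
        System.out.println(p);                       // prints OLLAMA

        System.out.println(AiProvider.values().length); // prints 3

        try {
            AiProvider.valueOf("ollama"); // lowercase does not match: throws
        } catch (IllegalArgumentException e) {
            System.out.println("no such constant");
        }
    }
}
```

Note that valueOf rejects lowercase or padded input rather than normalizing it, so configuration values must be uppercased before lookup.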