Enum Class AiProvider
- All Implemented Interfaces:
Serializable, Comparable<AiProvider>, Constable
Identifies the AI provider used by the AiSuggestionEngine.
Each constant represents a distinct AI platform capable of performing
security classification of test sources. The provider selected through
AiOptions determines which concrete client implementation is used for
communicating with the external AI service.
Provider integrations typically differ in authentication model, request format, endpoint structure, and supported model identifiers. The AI integration layer normalizes these differences so that the rest of the application can interact with a consistent abstraction.
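As a rough illustration of this normalization, a single client interface can hide each provider's transport details behind one method. The names below (AiClient, AiClients.forProvider, and the stub return values) are assumptions for the sketch, not the project's actual API.

```java
// Sketch only: AiClient, AiClients.forProvider, and the stub clients
// are hypothetical names, not the project's real API.
enum AiProvider { AUTO, OLLAMA, OPENAI, OPENROUTER, ANTHROPIC }

interface AiClient {
    // One normalized entry point, regardless of provider specifics.
    String classify(String testSource);
}

final class AiClients {
    static AiClient forProvider(AiProvider provider) {
        // Each branch would wrap the provider's own authentication,
        // endpoint, and request format behind the shared interface;
        // the lambdas here are stand-ins for real client classes.
        return switch (provider) {
            case OLLAMA     -> src -> "classified-by-ollama";
            case OPENAI     -> src -> "classified-by-openai";
            case OPENROUTER -> src -> "classified-by-openrouter";
            case ANTHROPIC  -> src -> "classified-by-anthropic";
            case AUTO       -> forProvider(AiProvider.OLLAMA); // placeholder fallback
        };
    }
}
```

Callers then depend only on AiClient, so switching providers changes configuration rather than call sites.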
Provider Selection
The selected provider influences:
- the HTTP endpoint used for inference requests
- authentication behavior
- the model identifier format
- response normalization logic
When AUTO is selected, the system attempts to determine the most
suitable provider automatically based on the configured endpoint or local
runtime environment.
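A minimal sketch of what such AUTO resolution could look like; the endpoint substrings and the OLLAMA fallback are assumptions for illustration, not documented behavior.

```java
// Hypothetical AUTO resolution: the endpoint heuristics and the local
// OLLAMA fallback are assumptions, not the project's documented logic.
final class AutoResolver {
    enum AiProvider { AUTO, OLLAMA, OPENAI, OPENROUTER, ANTHROPIC }

    static AiProvider resolve(String configuredEndpoint) {
        if (configuredEndpoint != null) {
            if (configuredEndpoint.contains("openrouter.ai"))  return AiProvider.OPENROUTER;
            if (configuredEndpoint.contains("api.openai.com")) return AiProvider.OPENAI;
            if (configuredEndpoint.contains("anthropic.com"))  return AiProvider.ANTHROPIC;
        }
        // Prefer a locally available provider when nothing else matches.
        return AiProvider.OLLAMA;
    }
}
```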
Nested Class Summary
Nested classes/interfaces inherited from class java.lang.Enum
Enum.EnumDesc<E extends Enum<E>>
Enum Constant Summary
- AUTO: Automatically selects the most appropriate AI provider based on configuration and runtime availability.
- OLLAMA: Uses a locally running Ollama instance as the AI inference backend.
- OPENAI: Uses the OpenAI API for AI inference.
- OPENROUTER: Uses the OpenRouter aggregation service to access multiple AI models through a unified API.
- ANTHROPIC: Uses the Anthropic API for AI inference, typically through models in the Claude family.
Method Summary
- static AiProvider valueOf(String name): Returns the enum constant of this class with the specified name.
- static AiProvider[] values(): Returns an array containing the constants of this enum class, in the order they are declared.
Enum Constant Details
AUTO
Automatically selects the most appropriate AI provider based on configuration and runtime availability. This mode allows the application to operate with minimal configuration, preferring locally available providers when possible.
OLLAMA
Uses a locally running Ollama instance as the AI inference backend. This provider typically communicates with an HTTP endpoint hosted on the local machine and allows the use of locally installed large language models without external API calls.
OPENAI
Uses the OpenAI API for AI inference. Requests are sent to the OpenAI platform using API key authentication and provider-specific model identifiers such as gpt-4 or gpt-4o.
OPENROUTER
Uses the OpenRouter aggregation service to access multiple AI models through a unified API. OpenRouter acts as a routing layer that forwards requests to different underlying model providers while maintaining a consistent API surface.
ANTHROPIC
Uses the Anthropic API for AI inference, typically through models in the Claude family.
Method Details
values
Returns an array containing the constants of this enum class, in the order they are declared.
- Returns:
- an array containing the constants of this enum class, in the order they are declared
valueOf
Returns the enum constant of this class with the specified name. The string must match exactly an identifier used to declare an enum constant in this class. (Extraneous whitespace characters are not permitted.)
- Parameters:
- name - the name of the enum constant to be returned
- Returns:
- the enum constant with the specified name
- Throws:
- IllegalArgumentException - if this enum class has no constant with the specified name
- NullPointerException - if the argument is null
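Both methods behave like those of any Java enum. A brief usage sketch (the enum is redeclared locally so the example is self-contained):

```java
// Local redeclaration of the enum for a self-contained example.
enum AiProvider { AUTO, OLLAMA, OPENAI, OPENROUTER, ANTHROPIC }

class AiProviderDemo {
    public static void main(String[] args) {
        // values() yields the constants in declaration order.
        for (AiProvider p : AiProvider.values()) {
            System.out.println(p);
        }

        // valueOf requires an exact, case-sensitive match.
        AiProvider exact = AiProvider.valueOf("OPENAI");
        System.out.println("matched: " + exact);

        // Any other string, including a lowercase name, throws
        // IllegalArgumentException.
        try {
            AiProvider.valueOf("openai");
        } catch (IllegalArgumentException e) {
            System.out.println("no constant named openai");
        }
    }
}
```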