Enum Class AiProvider

java.lang.Object
java.lang.Enum<AiProvider>
org.egothor.methodatlas.ai.AiProvider
All Implemented Interfaces:
Serializable, Comparable<AiProvider>, Constable

public enum AiProvider extends Enum<AiProvider>
Enumeration of supported AI provider implementations used by the AiSuggestionEngine.

Each constant represents a distinct AI platform capable of performing security classification of test sources. The provider selected through AiOptions determines which concrete client implementation is used for communicating with the external AI service.

Provider integrations typically differ in authentication model, request format, endpoint structure, and supported model identifiers. The AI integration layer normalizes these differences so that the rest of the application can interact with a consistent abstraction.

Provider Selection

The selected provider influences:

  • the HTTP endpoint used for inference requests
  • authentication behavior
  • the model identifier format
  • response normalization logic

When AUTO is selected, the system attempts to determine the most suitable provider automatically based on the configured endpoint or local runtime environment.
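The exact resolution order used by AUTO is not specified here; the sketch below shows one plausible strategy (prefer a reachable local Ollama instance, then fall back to a cloud provider when an API key is configured). The `ProviderResolver` class, its `resolve` method, and the preference order are all hypothetical, not part of the actual API.

```java
// Hypothetical sketch of AUTO provider resolution. Neither ProviderResolver
// nor resolve(...) exists in the real API; the preference order shown here
// (local Ollama first, then OpenAI) is an assumption.
enum AiProvider { AUTO, OLLAMA, OPENAI }

final class ProviderResolver {
    // ollamaReachable: whether a probe of the local Ollama endpoint succeeded;
    // openAiKey: the configured OpenAI API key, or null if absent.
    static AiProvider resolve(boolean ollamaReachable, String openAiKey) {
        if (ollamaReachable) {
            return AiProvider.OLLAMA;          // prefer the local provider
        }
        if (openAiKey != null && !openAiKey.isBlank()) {
            return AiProvider.OPENAI;          // fall back to a cloud provider
        }
        throw new IllegalStateException("no suitable AI provider available");
    }
}
```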

  • Enum Constant Details

    • AUTO

      public static final AiProvider AUTO
      Automatically selects the most appropriate AI provider based on configuration and runtime availability.

      This mode allows the application to operate with minimal configuration, preferring locally available providers when possible.

    • OLLAMA

      public static final AiProvider OLLAMA
      Uses a locally running Ollama instance as the AI inference backend.

      This provider typically communicates with an HTTP endpoint hosted on the local machine and allows the use of locally installed large language models without external API calls.

    • OPENAI

      public static final AiProvider OPENAI
      Uses the OpenAI API for AI inference.

      Requests are sent to the OpenAI platform using API key authentication and provider-specific model identifiers such as gpt-4 or gpt-4o.

    • OPENROUTER

      public static final AiProvider OPENROUTER
      Uses the OpenRouter aggregation service to access multiple AI models through a unified API.

      OpenRouter acts as a routing layer that forwards requests to different underlying model providers while maintaining a consistent API surface.

    • ANTHROPIC

      public static final AiProvider ANTHROPIC
      Uses the Anthropic API for AI inference, typically through models in the Claude family.

    • AZURE_OPENAI

      public static final AiProvider AZURE_OPENAI
      Uses an Azure OpenAI Service deployment for AI inference.

      Azure OpenAI is a managed cloud service operated by Microsoft inside a customer-controlled Azure tenant. Unlike the public OpenAI API, requests never leave the organization's Azure environment, making this provider suitable for regulated industries and corporate environments with data-sovereignty requirements.

      Authentication uses a static resource-scoped API key supplied via the api-key HTTP header. Entra ID token-based authentication is not currently supported.

      The request endpoint is constructed from three configuration values:

      • baseUrl — the Azure OpenAI resource endpoint, for example https://contoso.openai.azure.com
      • modelName — the deployment name configured in the Azure portal (not the underlying model family name)
      • apiVersion — the Azure OpenAI REST API version, for example 2024-02-01

      The resulting endpoint takes the form:
      {baseUrl}/openai/deployments/{modelName}/chat/completions?api-version={apiVersion}
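      As a concrete illustration, the template can be assembled as follows. The base URL and API version reuse the examples given above; the deployment name gpt-4o-prod is a made-up placeholder, and the AzureEndpoint helper is illustrative, not part of the actual API.

```java
// Assembles the Azure OpenAI request endpoint from the three configuration
// values described above. AzureEndpoint is an illustrative helper.
final class AzureEndpoint {
    static String of(String baseUrl, String deploymentName, String apiVersion) {
        // {baseUrl}/openai/deployments/{modelName}/chat/completions?api-version={apiVersion}
        return baseUrl + "/openai/deployments/" + deploymentName
                + "/chat/completions?api-version=" + apiVersion;
    }
}
```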

    • GROQ

      public static final AiProvider GROQ
      Uses the Groq cloud inference service for AI-based security classification.

      Groq exposes an OpenAI-compatible REST API served by custom LPU hardware, resulting in very low latency and high throughput. The endpoint is https://api.groq.com/openai. A free tier is available at console.groq.com.

      Authentication uses a Bearer token supplied via the standard Authorization header, identical to the OpenAI and OpenRouter providers.
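      The two authentication styles described in this document can be contrasted in a short sketch. AuthHeaders is a hypothetical helper; the endpoints used in the example are only the base URLs quoted above, without any request path.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Contrasts Bearer-token authentication (OpenAI, OpenRouter, Groq, xAI, ...)
// with the Azure OpenAI "api-key" header. AuthHeaders is an illustrative
// helper, not part of the real API.
final class AuthHeaders {
    static HttpRequest.Builder bearer(String endpoint, String apiKey) {
        return HttpRequest.newBuilder(URI.create(endpoint))
                .header("Authorization", "Bearer " + apiKey);
    }

    static HttpRequest.Builder azureApiKey(String endpoint, String apiKey) {
        return HttpRequest.newBuilder(URI.create(endpoint))
                .header("api-key", apiKey);
    }
}
```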

    • XAI

      public static final AiProvider XAI
      Uses the xAI API to access Grok models.

      The xAI platform exposes an OpenAI-compatible REST endpoint at https://api.x.ai/v1. Authentication uses a Bearer token. API keys are available at console.x.ai.

    • GITHUB_MODELS

      public static final AiProvider GITHUB_MODELS
      Uses the GitHub Models free inference service.

      GitHub Models exposes an OpenAI-compatible endpoint at https://models.inference.ai.azure.com and authenticates with a standard GitHub personal access token (GITHUB_TOKEN). The free tier is available to any GitHub account and covers a broad selection of models from multiple providers (OpenAI, Meta Llama, Mistral, and others).

      This provider is well-suited for CI pipelines in open-source projects where GITHUB_TOKEN is already available as an environment variable without additional secret management.
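      A sketch of that CI pattern, reading the token from an environment map. GitHubModelsAuth and tokenFrom are hypothetical helpers; GITHUB_TOKEN is the standard variable GitHub Actions injects.

```java
import java.util.Map;

// Resolves the GitHub Models credential from an environment map.
// GitHubModelsAuth is an illustrative helper, not part of the real API.
final class GitHubModelsAuth {
    static String tokenFrom(Map<String, String> env) {
        String token = env.get("GITHUB_TOKEN");
        if (token == null || token.isBlank()) {
            throw new IllegalStateException("GITHUB_TOKEN is not set");
        }
        return token;
    }
}
```

      In a workflow, tokenFrom(System.getenv()) picks up the token that Actions exposes without any additional secret configuration.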

    • MISTRAL

      public static final AiProvider MISTRAL
      Uses the Mistral AI API for inference.

      Mistral exposes an OpenAI-compatible REST endpoint at https://api.mistral.ai/v1. Authentication uses a Bearer token. A free tier is available at console.mistral.ai.

  • Method Details

    • values

      public static AiProvider[] values()
      Returns an array containing the constants of this enum class, in the order they are declared.
      Returns:
      an array containing the constants of this enum class, in the order they are declared
    • valueOf

      public static AiProvider valueOf(String name)
      Returns the enum constant of this class with the specified name. The string must match exactly an identifier used to declare an enum constant in this class. (Extraneous whitespace characters are not permitted.)
      Parameters:
      name - the name of the enum constant to be returned.
      Returns:
      the enum constant with the specified name
      Throws:
      IllegalArgumentException - if this enum class has no constant with the specified name
      NullPointerException - if the argument is null
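      A typical caller parses a configured provider name with valueOf, normalizing case first and falling back to AUTO on unrecognized input. The parseProvider helper below is illustrative, and the enum is redeclared locally only to keep the sketch self-contained.

```java
import java.util.Locale;

// Mirrors the constants documented above so the sketch compiles standalone.
enum AiProvider {
    AUTO, OLLAMA, OPENAI, OPENROUTER, ANTHROPIC,
    AZURE_OPENAI, GROQ, XAI, GITHUB_MODELS, MISTRAL
}

final class ProviderConfig {
    // valueOf requires an exact match, so trim whitespace and upper-case the
    // input, then fall back to AUTO for values that name no constant.
    static AiProvider parseProvider(String configured) {
        try {
            return AiProvider.valueOf(configured.trim().toUpperCase(Locale.ROOT));
        } catch (IllegalArgumentException e) {
            return AiProvider.AUTO;
        }
    }
}
```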