Class OllamaClient
- All Implemented Interfaces:
AiProviderClient
AiProviderClient implementation for a locally running
Ollama inference service.
This client submits taxonomy-guided classification prompts to the Ollama HTTP
API and converts the returned model response into the internal
AiClassSuggestion representation used by the MethodAtlas AI
subsystem.
Operational Responsibilities
- verifying local Ollama availability
- constructing chat-style inference requests
- injecting the system prompt and taxonomy-guided user prompt
- executing HTTP requests against the Ollama API
- extracting and normalizing JSON classification results
The client uses the Ollama /api/chat endpoint for inference and the
/api/tags endpoint as a lightweight availability probe.
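As a rough illustration of how these two endpoints might be derived from a configured base URL (the base URL below is Ollama's conventional default and a placeholder, not a value taken from this class):

```java
import java.net.URI;

// Sketch: resolving the two Ollama endpoints from a configured base URL.
// In the real client the base URL comes from AiOptions at runtime.
public class OllamaEndpoints {
    static final String CHAT_PATH = "/api/chat";   // inference endpoint
    static final String TAGS_PATH = "/api/tags";   // availability probe

    static URI chatEndpoint(String baseUrl) {
        // Resolving an absolute path keeps the host/port, replaces the path.
        return URI.create(baseUrl).resolve(CHAT_PATH);
    }

    static URI tagsEndpoint(String baseUrl) {
        return URI.create(baseUrl).resolve(TAGS_PATH);
    }

    public static void main(String[] args) {
        System.out.println(chatEndpoint("http://localhost:11434"));
        System.out.println(tagsEndpoint("http://localhost:11434"));
    }
}
```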
This implementation is intended primarily for local, offline, or privacy-preserving inference scenarios where source code should not be sent to an external provider.

Constructor Summary
OllamaClient(AiOptions options)
Creates a new Ollama client using the supplied runtime configuration.
Method Summary
boolean isAvailable()
Determines whether the configured Ollama service is reachable.
AiClassSuggestion suggestForClass(String fqcn, String classSource, String taxonomyText, List<PromptBuilder.TargetMethod> targetMethods)
Submits a classification request to the Ollama chat API for the specified test class.
Constructor Details
OllamaClient
Creates a new Ollama client using the supplied runtime configuration. The configuration determines the base URL of the Ollama service, the model identifier, and the request timeout values used by this client.
- Parameters:
options - AI runtime configuration
Method Details
isAvailable
public boolean isAvailable()
Determines whether the configured Ollama service is reachable. The method performs a lightweight availability probe against the /api/tags endpoint. If the endpoint responds successfully, the provider is considered available. Any exception raised during the probe is treated as an indication that the provider is unavailable.
- Specified by:
isAvailable in interface AiProviderClient
- Returns:
true if the Ollama service is reachable; false otherwise
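The probe logic described above can be sketched with the JDK's built-in HTTP client; the timeout values and success criterion here are assumptions for illustration, not the class's actual settings:

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

// Sketch of a lightweight availability probe against /api/tags.
// Any exception (connection refused, timeout, ...) maps to "unavailable".
public class OllamaProbe {
    static boolean isAvailable(String baseUrl) {
        try {
            HttpClient client = HttpClient.newBuilder()
                    .connectTimeout(Duration.ofSeconds(2))
                    .build();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(baseUrl).resolve("/api/tags"))
                    .timeout(Duration.ofSeconds(2))
                    .GET()
                    .build();
            HttpResponse<Void> response =
                    client.send(request, HttpResponse.BodyHandlers.discarding());
            // A 2xx answer means the service is reachable.
            return response.statusCode() >= 200 && response.statusCode() < 300;
        } catch (IOException | InterruptedException e) {
            return false; // any failure counts as "provider unavailable"
        }
    }

    public static void main(String[] args) {
        // With no server listening on this port, the probe returns false.
        System.out.println(isAvailable("http://127.0.0.1:1"));
    }
}
```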
suggestForClass
public AiClassSuggestion suggestForClass(String fqcn, String classSource, String taxonomyText, List<PromptBuilder.TargetMethod> targetMethods) throws AiSuggestionException
Submits a classification request to the Ollama chat API for the specified test class. The request consists of:
- a system prompt enforcing strict JSON output
- a user prompt containing the test class source and taxonomy text
- provider options such as deterministic temperature settings
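A request with these three parts could be assembled along the following lines. The model name and prompt strings are placeholders; the field names ("model", "messages", "stream", "options.temperature") follow the public Ollama chat API, and this helper is a sketch rather than the class's actual request builder:

```java
// Sketch of an Ollama /api/chat request body with a system message,
// a user message, and a deterministic temperature setting.
public class ChatPayload {
    // Minimal JSON string escaping for the illustration.
    static String escape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"").replace("\n", "\\n");
    }

    static String buildChatBody(String model, String systemPrompt, String userPrompt) {
        return """
                {
                  "model": "%s",
                  "stream": false,
                  "options": { "temperature": 0 },
                  "messages": [
                    { "role": "system", "content": "%s" },
                    { "role": "user", "content": "%s" }
                  ]
                }""".formatted(escape(model), escape(systemPrompt), escape(userPrompt));
    }

    public static void main(String[] args) {
        String body = buildChatBody("llama3",
                "Reply with strict JSON only.",
                "Classify the following test class against the taxonomy.");
        System.out.println(body);
    }
}
```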
The returned response is expected to contain a JSON object in the message content field. That JSON text is extracted, deserialized into an AiClassSuggestion, and then normalized before being returned.
- Specified by:
suggestForClass in interface AiProviderClient
- Parameters:
fqcn - fully qualified class name being analyzed
classSource - complete source code of the class being analyzed
taxonomyText - taxonomy definition guiding classification
targetMethods - deterministically extracted JUnit test methods that must be classified
- Returns:
normalized AI classification result
- Throws:
AiSuggestionException - if the request fails, if the provider returns invalid content, or if response deserialization fails
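The extraction step matters because local models sometimes wrap their JSON in markdown fences or surrounding prose. One defensive approach, shown here as an illustrative helper rather than the class's actual implementation, is to cut the content down to the outermost {...} span before deserializing:

```java
// Sketch: isolate the JSON object inside the model's message content,
// tolerating markdown fences or extra prose around it.
public class JsonExtractor {
    static String extractJsonObject(String content) {
        int start = content.indexOf('{');
        int end = content.lastIndexOf('}');
        if (start < 0 || end <= start) {
            // No JSON object present; the caller would surface this as
            // an AiSuggestionException in the real client.
            throw new IllegalArgumentException("no JSON object in model output");
        }
        return content.substring(start, end + 1);
    }

    public static void main(String[] args) {
        String raw = "Here is the result:\n```json\n{\"category\":\"unit\"}\n```";
        System.out.println(extractJsonObject(raw));
    }
}
```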