The API endpoint URL for your chosen AI provider.

Ollama

For Ollama, specify the base URL where your Ollama service is running.
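For example, a default local Ollama installation listens on port 11434 (adjust the host and port to match your deployment):

```
http://localhost:11434
```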

Note: Ensure the Ollama service is running and accessible from your Jenkins instance.

OpenAI

Leave this field empty for standard OpenAI API usage.

Specify a custom URL only when using custom OpenAI proxies or enterprise endpoints.
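A custom endpoint might look like the following (a hypothetical proxy URL for illustration; the endpoint must expose the OpenAI-compatible `/v1` API):

```
https://openai-proxy.example.com/v1
```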

Google Gemini

Leave this field empty for standard Google AI usage.

Specify a custom URL only when using custom Gemini proxies or enterprise endpoints.
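A custom endpoint might look like the following (a hypothetical proxy URL for illustration; the endpoint must implement the same interface as the official Gemini API):

```
https://gemini-proxy.example.com
```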

How it Works

When left empty, LangChain4j automatically uses the official API endpoints. When specified, your custom URL will be used instead, enabling compatibility with alternative API providers that implement the same interface.
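The fallback behavior can be sketched as follows. This is an illustrative snippet, not the plugin's actual code: the method name, class name, and the default URL shown are assumptions, and the real resolution happens inside LangChain4j's model builders.

```java
public class EndpointResolver {

    /**
     * Resolve the effective base URL: when the configured value is empty,
     * fall back to the provider's official endpoint; otherwise use the
     * custom URL exactly as given.
     */
    static String resolveBaseUrl(String configured, String officialDefault) {
        if (configured == null || configured.trim().isEmpty()) {
            return officialDefault;
        }
        return configured;
    }

    public static void main(String[] args) {
        // Empty field -> the official endpoint is used.
        System.out.println(
            resolveBaseUrl("", "https://api.openai.com/v1"));

        // Custom proxy configured -> the custom URL takes precedence.
        System.out.println(
            resolveBaseUrl("https://openai-proxy.example.com/v1",
                           "https://api.openai.com/v1"));
    }
}
```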