The API endpoint URL for your chosen AI provider.
For Ollama, specify the base URL where your Ollama service is running:
Examples:
http://localhost:11434
http://YOUR_OLLAMA_HOST:11434
Note: Ensure the Ollama service is running and accessible from your Jenkins instance.
Leave this field empty for standard OpenAI API usage.
Specify a custom URL only when using custom OpenAI proxies or enterprise endpoints.
Leave this field empty for standard Google AI usage.
Specify a custom URL only when using custom Gemini proxies or enterprise endpoints.
When left empty, LangChain4j automatically uses each provider's official API endpoint. When specified, the custom URL is used instead, enabling compatibility with alternative providers that implement the same API interface.