AI & models
Akka provides native integration with eight LLM providers. You configure a model in application.conf and your Agents use it automatically. No additional libraries or adapters are required.
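As a sketch of what such configuration can look like (the exact keys and the model name shown here are illustrative assumptions; see the model provider reference for the authoritative settings):

```hocon
# Select the provider the Agents should use.
akka.javasdk.agent.model-provider = openai

# Provider-specific settings; the API key is read from the environment.
akka.javasdk.agent.openai {
  model-name = "gpt-4o-mini"
  api-key = ${?OPENAI_API_KEY}
}
```

Switching providers is then a configuration change rather than a code change.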
Built-in providers
| Provider |
|---|
| Anthropic |
| OpenAI |
| Google AI Gemini |
| Google Cloud Vertex AI |
| AWS Bedrock |
| Hugging Face |
| Ollama (local) |
| LocalAI (local) |
Custom providers
You can plug in any model by implementing the ModelProvider.Custom interface. This involves providing the underlying LangChain4j ChatModel and, optionally, StreamingChatModel implementations.
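As a hypothetical sketch of this idea: the class name, the method names on ModelProvider.Custom, and the gateway URL below are illustrative assumptions, not taken from this page, and the LangChain4j OpenAI builder is used only as an example of a ChatModel implementation.

```java
import akka.javasdk.agent.ModelProvider;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

// Hypothetical custom provider. Method names are assumed for
// illustration; consult the ModelProvider.Custom API reference.
public class MyGatewayProvider implements ModelProvider.Custom {

  @Override
  public ChatModel createChatModel() {
    // Any LangChain4j ChatModel works here, e.g. one pointed at an
    // OpenAI-compatible gateway (URL and model name are placeholders).
    return OpenAiChatModel.builder()
        .baseUrl("https://my-gateway.example.com/v1")
        .modelName("my-model")
        .apiKey(System.getenv("GATEWAY_API_KEY"))
        .build();
  }

  @Override
  public StreamingChatModel createStreamingChatModel() {
    // Optional: a streaming variant for token-by-token responses.
    return OpenAiStreamingChatModel.builder()
        .baseUrl("https://my-gateway.example.com/v1")
        .modelName("my-model")
        .apiKey(System.getenv("GATEWAY_API_KEY"))
        .build();
  }
}
```

The point of the interface is that anything LangChain4j can talk to, an Agent can use, without changing Agent code.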
See also
- Agents — how Agents interact with AI models
- Model provider details — full configuration reference for each provider