Integrations¶
Olla supports various backend (endpoint) and frontend integrations, powered by Olla Profiles.
Backend Endpoints¶
Olla natively supports:
- Ollama - native support for Ollama, including model unification.
- LM Studio - native support for LM Studio, including model unification.
- vLLM - native support for vLLM, including model unification.
Other backends that expose an OpenAI-compatible API can be integrated too:
- OpenAI Compatibility - provides a unified query API across all OpenAI-compatible backends.
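In Olla's configuration, each backend is declared as an endpoint whose type selects the matching profile. The snippet below is a minimal sketch assuming the static-discovery layout with `name`, `url`, `type` and `priority` fields; the exact keys and type identifiers may differ between releases, so compare it against the sample configuration shipped with Olla.

```yaml
# Illustrative endpoint declarations - key names and type values are
# assumptions; check the sample config bundled with your Olla release.
discovery:
  type: static
  static:
    endpoints:
      - name: local-ollama
        url: http://localhost:11434
        type: ollama        # native profile: Ollama, with model unification
        priority: 100       # higher-priority endpoints are preferred
      - name: workstation-lmstudio
        url: http://192.168.1.20:1234
        type: lm-studio     # native profile: LM Studio
        priority: 75
      - name: gpu-box-vllm
        url: http://192.168.1.30:8000
        type: vllm          # native profile: vLLM
        priority: 50
      - name: other-backend
        url: http://192.168.1.40:8080
        type: openai        # any backend exposing an OpenAI-compatible API
        priority: 10
```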
Frontend Support¶
Profiles¶
Profiles provide an easy way to customise the behaviour of the supported integrations without writing Go code and recompiling:
- You can customise existing behaviours:
    - Remove prefixes you don't use
    - Add prefixes you would like to use instead
- You can extend existing functionality:
    - Proxy additional paths that aren't supported by default
    - Change the model capability detection patterns
You can also create a custom profile to add new capabilities, or to support a backend that doesn't have native support yet.
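As a rough illustration, a profile is a YAML file describing how Olla should route to and interrogate a backend. The field names below (`routing.prefixes`, `api.paths`, `models.capability_patterns`) are assumptions about the schema rather than a definitive reference, so use one of the bundled profiles as the starting template when writing your own.

```yaml
# Hypothetical custom profile - field names are illustrative, not authoritative.
# Start from a bundled profile (e.g. the Ollama one) and adjust it instead.
name: my-backend
display_name: My Backend

routing:
  prefixes:
    - my-backend          # requests prefixed with /my-backend/ route here
    - mb                  # an extra, shorter prefix

api:
  openai_compatible: true
  paths:                  # paths Olla is allowed to proxy through
    - /v1/chat/completions
    - /v1/models
    - /health             # an extra path not proxied by default

models:
  capability_patterns:    # how model capabilities are detected from names
    vision:
      - "*llava*"
    embeddings:
      - "*embed*"
```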