Discover OpenAI models here
You can get an API key from the OpenAI console.
Configuration
config.yaml
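As a rough sketch (the `models` list shape and the `provider`, `model`, and `apiKey` field names are assumptions, and `gpt-4o` is only an illustrative model), an entry could look like this:

```yaml
# Hypothetical example: adjust field names and model to match your setup
models:
  - name: GPT-4o
    provider: openai
    model: gpt-4o
    apiKey: <YOUR_OPENAI_API_KEY>
```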
Check out a more advanced configuration here
OpenAI API compatible providers
OpenAI API compatible providers include:
- KoboldCpp
- text-gen-webui
- FastChat
- LocalAI
- llama-cpp-python
- TensorRT-LLM
- vLLM
- BerriAI/litellm
- Tetrate Agent Router Service
If you are using one of these OpenAI-compatible servers or APIs, you can change the `apiBase` like this:
config.yaml
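As a sketch under the same assumptions as above, an OpenAI-compatible server can be targeted by pointing `apiBase` at its URL (the localhost address and `MODEL_NAME` are placeholders):

```yaml
# Hypothetical example: replace the URL and model with your server's values
models:
  - name: My OpenAI-compatible model
    provider: openai
    model: MODEL_NAME
    apiBase: http://localhost:8000/v1
```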
How to Force Legacy Completions Endpoint Usage
To force usage of the legacy `completions` endpoint instead of `chat/completions`, you can set:
config.yaml
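As a sketch only: the `useLegacyCompletionsEndpoint` key below is an assumption (it mirrors the config.json option of the same name), and `gpt-3.5-turbo-instruct` is just an example of a completions-only model; check the configuration reference for the exact key name and placement.

```yaml
# Assumed key name; verify against the configuration reference
models:
  - name: Legacy completions model
    provider: openai
    model: gpt-3.5-turbo-instruct
    useLegacyCompletionsEndpoint: true
```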