Configuration
When running OpenHands, you’ll need to set your LLM provider, model, and API key in the OpenHands UI through the Settings under the LLM tab.
Using OpenAI-Compatible Endpoints
Just as for OpenAI Chat completions, we use LiteLLM for OpenAI-compatible endpoints. You can find their full documentation on this topic in the LiteLLM docs (https://docs.litellm.ai/docs/providers/openai_compatible).
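For reference, the sketch below shows roughly what such a call looks like through LiteLLM. The model name, base URL, and API key are placeholder assumptions; swap in the values for your own endpoint.

```python
# Minimal sketch of a call to an OpenAI-compatible endpoint via LiteLLM.
# model, api_base, and api_key below are placeholders, not real values.
from litellm import completion

response = completion(
    model="openai/my-model",              # "openai/" prefix routes to the OpenAI-compatible provider
    api_base="http://localhost:8000/v1",  # base URL of your endpoint (placeholder)
    api_key="sk-placeholder",             # API key for your endpoint (placeholder)
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```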
Using an OpenAI Proxy
If you’re using an OpenAI proxy, in the OpenHands UI through the Settings under the LLM tab:
- Enable Advanced options
- Set the following:
  - Custom Model to openai/<model-name> (e.g. openai/gpt-4o or openai/<proxy-prefix>/<model-name>)
  - Base URL to the URL of your OpenAI proxy
  - API Key to your OpenAI API key
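Before entering these values, you can sanity-check them with a short script. The sketch below uses the official openai Python client with placeholder values; note that the openai/ prefix in Custom Model is LiteLLM routing syntax, so the bare model name is what actually reaches the proxy.

```python
# Sanity-check the same Base URL, API key, and model you'll enter in the UI.
# base_url and api_key below are placeholders for your proxy's values.
from openai import OpenAI

client = OpenAI(
    base_url="https://my-proxy.example.com/v1",  # Base URL of your OpenAI proxy (placeholder)
    api_key="sk-placeholder",                    # your OpenAI API key (placeholder)
)
response = client.chat.completions.create(
    model="gpt-4o",  # bare model name: no "openai/" prefix outside LiteLLM
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```

If this prints a reply, the same base URL, key, and model name should work in the OpenHands Settings.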