LM Studio
LM Studio supports the OpenAI client, allowing you to use the openai-generic provider with an overridden base_url.
See https://lmstudio.ai/docs/local-server#make-an-inferencing-request-using-openais-chat-completions-format for more information.
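As a sketch, a client definition might look like the following. The client name and model ID are placeholders; the base_url assumes LM Studio's default local server address (http://localhost:1234/v1).

```baml
// Hypothetical client name; point base_url at your running LM Studio server.
client<llm> MyLMStudioClient {
  provider "openai-generic"
  options {
    base_url "http://localhost:1234/v1"
    // Placeholder model identifier; use the model loaded in LM Studio.
    model "my-local-model"
  }
}
```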