FIX: Use vLLM if TGI is not configured for OSS LLM inference (#380)

Author: Rafael dos Santos Silva
Date: 2023-12-26 17:18:08 -03:00 (committed by GitHub)
Parent: 5db7bf6e68
Commit: 3c27cbfb9a
Signature: GPG Key ID 4AEE18F83AFDEB23 (no known key found for this signature in database)
1 changed file with 1 addition and 1 deletion


@@ -12,7 +12,7 @@ module DiscourseAi
       Llama2-chat-hf
       mistralai/Mixtral-8x7B-Instruct-v0.1
       mistralai/Mistral-7B-Instruct-v0.2
-    ].include?(model_name)
+    ].include?(model_name) && SiteSetting.ai_hugging_face_api_url.present?
   end

   def default_options
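The effect of the change can be sketched in isolation: a model is only routed to the TGI/Hugging Face endpoint when it is both a recognized OSS model and the `ai_hugging_face_api_url` site setting is configured; otherwise inference falls back to vLLM. This is a minimal illustration, not the plugin's actual API — the helper names (`use_tgi?`, `inference_backend`) and the plain-argument settings are assumptions standing in for Discourse's `SiteSetting` object.

```ruby
# Illustrative sketch of the dispatch logic implied by this commit.
# KNOWN_OSS_MODELS mirrors the %w[...] list in the diff; the helper
# names and the string argument standing in for SiteSetting are
# hypothetical, not the plugin's real interface.
KNOWN_OSS_MODELS = %w[
  Llama2-chat-hf
  mistralai/Mixtral-8x7B-Instruct-v0.1
  mistralai/Mistral-7B-Instruct-v0.2
].freeze

# True only when the model is a known OSS model AND a Hugging Face
# API URL has been configured (the `.present?` check in the diff).
def use_tgi?(model_name, hugging_face_api_url)
  KNOWN_OSS_MODELS.include?(model_name) && !hugging_face_api_url.to_s.strip.empty?
end

# Route to :tgi when the guard passes, otherwise fall back to :vllm.
def inference_backend(model_name, hugging_face_api_url)
  use_tgi?(model_name, hugging_face_api_url) ? :tgi : :vllm
end
```

Before this fix, a known OSS model name alone selected TGI even when no Hugging Face endpoint was configured; the added `&&` clause makes vLLM the fallback in that case.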