discourse-ai/lib/configuration/llm_dependency_validator.rb
Roman Rizzi 8849caf136
DEV: Transition "Select model" settings to only use LlmModels (#675)
We no longer support the "provider:model" format in the "ai_helper_model" and
"ai_embeddings_semantic_search_hyde_model" settings. We'll migrate existing
values and work with our new data-driven LLM configs from now on.
2024-06-19 18:01:35 -03:00


# frozen_string_literal: true

module DiscourseAi
  module Configuration
    class LlmDependencyValidator
      def initialize(opts = {})
        @opts = opts
      end

      def valid_value?(val)
        return true if val == "f"

        @llm_dependency_setting_name =
          DiscourseAi::Configuration::LlmValidator.new.choose_llm_setting_for(@opts[:name])

        SiteSetting.public_send(@llm_dependency_setting_name).present?
      end

      def error_message
        I18n.t(
          "discourse_ai.llm.configuration.set_llm_first",
          setting: @llm_dependency_setting_name,
        )
      end
    end
  end
end
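
The validator's contract is simple: turning a feature off (`"f"`) is always valid, while turning it on requires the dependent LLM setting to already hold a value. The following is a minimal, self-contained sketch of that flow, with a plain hash standing in for `SiteSetting` and a hard-wired dependency lookup replacing `LlmValidator#choose_llm_setting_for`; the class and setting names are illustrative, not the real Discourse API.

```ruby
# Stand-in for SiteSetting: a mutable store of setting values.
SETTINGS = { "ai_helper_model" => "" }

class DependencyValidatorSketch
  def initialize(dependency_name)
    # In the real validator, the dependency is resolved per-setting
    # via LlmValidator#choose_llm_setting_for; here it is passed in.
    @dependency_name = dependency_name
  end

  # Disabling ("f") always passes; enabling requires the dependent
  # LLM setting to be present (non-empty).
  def valid_value?(val)
    return true if val == "f"

    !SETTINGS.fetch(@dependency_name, "").empty?
  end
end

validator = DependencyValidatorSketch.new("ai_helper_model")
puts validator.valid_value?("f") # => true  (disabling is always allowed)
puts validator.valid_value?("t") # => false (dependency not configured yet)

SETTINGS["ai_helper_model"] = "some-llm-model"
puts validator.valid_value?("t") # => true  (dependency now set)
```

This mirrors why `error_message` points the admin at the unmet setting: the failure mode is not a bad value for the feature toggle itself, but a missing upstream LLM configuration.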