discourse-ai/app/serializers/llm_model_serializer.rb
Roman Rizzi f622e2644f
FEATURE: Store provider-specific parameters. (#686)
Previously, we stored request parameters like the OpenAI organization and Bedrock's access key and region as site settings. This change stores them in the `llm_models` table instead, letting us drop more settings while also becoming more flexible.
2024-06-25 08:26:30 +10:00
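For context, a minimal sketch of the provider-specific parameters this commit moves into `llm_models` (the hash keys and placeholder values are illustrative assumptions based on the commit description, not values defined in this file):

# Illustrative provider_params payloads (key names are assumptions):
openai_params  = { organization: "org-123" }                       # formerly an OpenAI site setting
bedrock_params = { access_key_id: "AKIA...", region: "us-east-1" } # formerly Bedrock site settings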

# frozen_string_literal: true

# Serializes an LlmModel record, exposing its connection details (provider, URL,
# API key) along with the provider-specific parameters introduced by this commit.
class LlmModelSerializer < ApplicationSerializer
  root "llm"

  attributes :id,
             :display_name,
             :name,
             :provider,
             :max_prompt_tokens,
             :tokenizer,
             :api_key,
             :url,
             :enabled_chat_bot,
             :shadowed_by_srv,
             :provider_params

  has_one :user, serializer: BasicUserSerializer, embed: :object

  # Flags models whose URL is the reserved vLLM SRV placeholder.
  def shadowed_by_srv
    object.url == LlmModel::RESERVED_VLLM_SRV_URL
  end
end