Sam 7ca21cc329
FEATURE: first-class support for OpenRouter (#1011)
* FEATURE: first-class support for OpenRouter

This new implementation supports picking quantization and provider preferences (provider order)

Also:

- Improve logging for summary generation
- Improve error message when contacting LLMs fails

* Better support for full-screen artifacts on iPad

Support the back button to close the full-screen view
2024-12-10 05:59:19 +11:00


# frozen_string_literal: true

module DiscourseAi
  module Completions
    module Endpoints
      class OpenRouter < OpenAi
        # Route requests for LLMs configured with the open_router provider to this endpoint.
        def self.can_contact?(model_provider)
          %w[open_router].include?(model_provider)
        end

        def prepare_request(payload)
          headers = { "Content-Type" => "application/json" }

          api_key = llm_model.api_key
          headers["Authorization"] = "Bearer #{api_key}"
          # Optional OpenRouter headers used to attribute requests to the calling app.
          headers["X-Title"] = "Discourse AI"
          headers["HTTP-Referer"] = "https://www.discourse.org/ai"

          Net::HTTP::Post.new(model_uri, headers).tap { |r| r.body = payload }
        end

        def prepare_payload(prompt, model_params, dialect)
          payload = super

          # A comma-separated list of accepted quantizations (e.g. "fp8, int8")
          # becomes the provider.quantizations array in the request body.
          if quantizations = llm_model.provider_params["provider_quantizations"].presence
            options = quantizations.split(",").map(&:strip)

            payload[:provider] = { quantizations: options }
          end

          # A comma-separated provider order becomes provider.order, telling
          # OpenRouter which upstream providers to try first.
          if order = llm_model.provider_params["provider_order"].presence
            options = order.split(",").map(&:strip)
            payload[:provider] ||= {}
            payload[:provider][:order] = options
          end

          payload
        end
      end
    end
  end
end
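
For illustration, here is a minimal standalone sketch (not part of the file above) of how the comma-separated provider_params values translate into the provider object sent to OpenRouter. The settings "fp8, int8" and "Fireworks, Together" are hypothetical examples.

# Hypothetical admin settings, stored as comma-separated strings.
params = {
  "provider_quantizations" => "fp8, int8",
  "provider_order" => "Fireworks, Together",
}

payload = {}

# Same parsing as prepare_payload above: split on commas, trim whitespace.
if (quantizations = params["provider_quantizations"])
  payload[:provider] = { quantizations: quantizations.split(",").map(&:strip) }
end

if (order = params["provider_order"])
  payload[:provider] ||= {}
  payload[:provider][:order] = order.split(",").map(&:strip)
end

payload
# => { provider: { quantizations: ["fp8", "int8"], order: ["Fireworks", "Together"] } }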