Roman Rizzi 8d5f901a67
DEV: Rewire AI bot internals to use LlmModel (#638)
* DRAFT: Create AI Bot users dynamically and support custom LlmModels

* Get user associated to llm_model

* Track enabled bots with attribute

* Don't store bot username. Minor touches to migrate default values in settings

* Handle scenario where vLLM uses an SRV record

* Made 3.5-turbo-16k the default version so we can remove the hack
2024-06-18 14:32:14 -03:00
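The bullet points above describe the shape of the rework: each LlmModel can act as a chat bot, enablement is tracked as an attribute on the model row rather than in a site setting, and the bot user for a model is created dynamically and associated with it. Below is a minimal sketch of that wiring, assuming ActiveRecord; the column, scope, and helper names (enabled_chat_bot, enabled_bots, ensure_bot_user!, display_name) are illustrative assumptions, not the plugin's actual schema.

# Sketch only: illustrative column/method names, not the plugin's real schema.
class LlmModel < ActiveRecord::Base
  # Each enabled model owns a dynamically created bot user.
  belongs_to :user, optional: true

  # Enabled bots are tracked with a per-model attribute instead of a setting.
  scope :enabled_bots, -> { where(enabled_chat_bot: true) }

  # Hypothetical helper: lazily create the companion bot user for this model.
  def ensure_bot_user!
    return user if user

    created =
      User.create!(
        username: UserNameSuggester.suggest("#{display_name}_bot"),
        email: "#{SecureRandom.hex}@does-not-exist.invalid",
        staged: true,
      )
    update!(user_id: created.id)
    created
  end
end

The notes also mention vLLM endpoints published behind an SRV record. Resolving one is a standard-library lookup; the service name below is a placeholder.

require "resolv"

# Look up the host/port for a vLLM endpoint advertised via DNS SRV
# (placeholder service name).
srv =
  Resolv::DNS.open do |dns|
    dns.getresource("_vllm._tcp.example.internal", Resolv::DNS::Resource::IN::SRV)
  end

endpoint_host = srv.target.to_s
endpoint_port = srv.port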


# frozen_string_literal: true

module DiscourseAi
  module AiBot
    class BotController < ::ApplicationController
      requires_plugin ::DiscourseAi::PLUGIN_NAME
      requires_login

      # Returns the most recent AI API audit log entry covering the posts up to
      # and including the given post, so the request/response sent to the LLM
      # can be inspected.
      def show_debug_info
        post = Post.find(params[:post_id])
        guardian.ensure_can_debug_ai_bot_conversation!(post)

        posts =
          Post
            .where("post_number <= ?", post.post_number)
            .where(topic_id: post.topic_id)
            .order("post_number DESC")

        debug_info = AiApiAuditLog.where(post: posts).order(created_at: :desc).first

        render json: debug_info, status: 200
      end

      # Removes the Redis cancel key for this post's streamed bot response.
      def stop_streaming_response
        post = Post.find(params[:post_id])
        guardian.ensure_can_see!(post)

        Discourse.redis.del("gpt_cancel:#{post.id}")

        render json: {}, status: 200
      end

      # Resolves the bot user for the given model/username and returns its
      # canonical (lowercased) username.
      def show_bot_username
        bot_user = DiscourseAi::AiBot::EntryPoint.find_user_from_model(params[:username])
        raise Discourse::InvalidParameters.new(:username) if !bot_user

        render json: { bot_username: bot_user.username_lower }, status: 200
      end
    end
  end
end
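
These three actions are only reachable through routes defined elsewhere in the plugin. As a usage reference, here is a hedged sketch of how they might be mounted; the engine constant and the paths are assumptions for illustration, not the plugin's actual route file.

# Sketch of possible route mappings for BotController; the paths and the
# DiscourseAi::Engine constant are illustrative assumptions.
DiscourseAi::Engine.routes.draw do
  scope module: :ai_bot, path: "/ai-bot", defaults: { format: :json } do
    get "bot-username" => "bot#show_bot_username"
    get "post/:post_id/show-debug-info" => "bot#show_debug_info"
    put "post/:post_id/stop-streaming" => "bot#stop_streaming_response"
  end
end

With routes like these, a GET to the debug path returns the latest audit log entry for the post, and a PUT to the stop-streaming path deletes the gpt_cancel Redis key for that post.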