discourse-ai/spec/lib/completions
Sam 8df966e9c5
FEATURE: smooth streaming of AI responses on the client (#413)
This PR introduces 3 things:

1. A fake bot that can be used locally to test LLMs. To enable it in a dev environment, use:

SiteSetting.ai_bot_enabled_chat_bots = "fake"

2. Smoother, more elegant streaming of progress during LLM completion

This leans on JavaScript to buffer LLM results and trickle them onto the page. It also makes the progress dot render much more consistently.
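
For context, here is a minimal sketch of the general buffer-and-trickle technique: chunks arriving from the server accumulate in a buffer, and a timer releases a few characters at a time so the rendered text grows steadily. Class and parameter names (`SmoothStreamer`, `intervalMs`, `charsPerTick`) and the timing values are illustrative assumptions, not the plugin's actual code.

```typescript
// Buffers incoming chunks and trickles them out a few characters per tick,
// so the UI updates smoothly instead of jumping when large chunks arrive.
class SmoothStreamer {
  private buffer = "";
  private rendered = "";
  private timer: ReturnType<typeof setInterval> | null = null;

  constructor(
    private onUpdate: (text: string) => void,
    private intervalMs = 40,   // how often to flush to the DOM (assumed value)
    private charsPerTick = 10, // how many buffered chars to release per flush (assumed value)
  ) {}

  // Called for every chunk received from the streaming endpoint.
  push(chunk: string) {
    this.buffer += chunk;
    if (!this.timer) {
      this.timer = setInterval(() => this.flush(), this.intervalMs);
    }
  }

  // Release a small slice of the buffer on each tick; stop when drained.
  private flush() {
    if (this.buffer.length === 0) {
      if (this.timer) {
        clearInterval(this.timer);
        this.timer = null;
      }
      return;
    }
    const slice = this.buffer.slice(0, this.charsPerTick);
    this.buffer = this.buffer.slice(this.charsPerTick);
    this.rendered += slice;
    this.onUpdate(this.rendered);
  }
}

// Usage: wire the streamer to whatever updates the post body in the UI,
// e.g. element.textContent = text + " ●" to keep the progress dot attached.
const streamer = new SmoothStreamer((text) => console.log(text));
streamer.push("Hello, ");
streamer.push("world!");
```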

3. A fix for the Claude dialect

Claude requires newlines in **exactly** the right places; the dialect has been amended so the prompt format keeps it happy.
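
As a rough illustration of why newline placement matters, the sketch below builds a prompt in the legacy Claude text-completion convention, where each turn is introduced by `\n\nHuman:` or `\n\nAssistant:` and the prompt ends with a bare `\n\nAssistant:`. The helper and its names are hypothetical and are not the dialect code from this repo.

```typescript
// Assembles a newline-sensitive Claude text-completion prompt.
type Turn = { role: "human" | "assistant"; content: string };

function buildClaudePrompt(system: string, turns: Turn[]): string {
  // With claude-2.1, a system prompt can appear as plain text before the first turn.
  let prompt = system;
  for (const turn of turns) {
    const label = turn.role === "human" ? "Human" : "Assistant";
    prompt += `\n\n${label}: ${turn.content}`;
  }
  // The trailing "\n\nAssistant:" (with no content) asks Claude for its reply.
  return prompt + "\n\nAssistant:";
}

console.log(
  buildClaudePrompt("You are a helpful bot.", [
    { role: "human", content: "Hello there" },
  ]),
);
```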

---------

Co-authored-by: Martin Brennan <martin@discourse.org>
2024-01-11 15:56:40 +11:00
dialects FEATURE: smooth streaming of AI responses on the client (#413) 2024-01-11 15:56:40 +11:00
endpoints FIX: Use claude-2.1 to enable system prompts (#411) 2024-01-09 14:10:20 -03:00
llm_spec.rb FEATURE: smooth streaming of AI responses on the client (#413) 2024-01-11 15:56:40 +11:00