discourse-ai/spec/lib/modules/ai_bot
Sam 316ea9624e
FIX: properly truncate !command prompts (#227)
* FIX: properly truncate !command prompts

### What is going on here?

Prior to this change, when a command was issued by the LLM it
could hallucinate a continuation, e.g.:

```
This is what tags are

!tags

some nonsense here
```

This change introduces safeguards so `some nonsense here` does not
creep into the prompt history, poisoning the LLM results.

This grounds the LLM much better, so it forgets less about
command results.
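The truncation idea can be sketched roughly as follows. This is a minimal, hypothetical helper (`truncate_at_command` is not the actual plugin code), assuming the completion is cut short at the line containing the first `!command` invocation so nothing after it reaches the prompt history:

```ruby
# Cut a completion short at the first line that invokes a known !command,
# discarding any hallucinated continuation after it.
# (Hypothetical sketch, not the actual discourse-ai implementation.)
def truncate_at_command(completion, commands)
  lines = completion.lines
  # Find the first line that starts with one of the known commands.
  index = lines.index do |line|
    commands.any? { |cmd| line.strip.start_with?("!#{cmd}") }
  end
  return completion if index.nil?
  # Keep everything up to and including the command line itself.
  lines[0..index].join
end

puts truncate_at_command(
  "This is what tags are\n\n!tags\n\nsome nonsense here\n",
  ["tags"]
)
```

In this sketch everything through the `!tags` line survives, and `some nonsense here` is dropped before the text is appended to the prompt history.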

The change only impacts Claude at the moment, but will also improve
things for Llama 2 in the future.

Also, this makes it significantly easier to test the bot framework
without an LLM, because we avoid a whole bunch of complex stubbing.

* Blank is not a valid bot response; do not inject it into the prompt
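The second fix above can be sketched as a simple guard (a hypothetical `append_bot_reply` helper, assuming replies are accumulated in a role/content history; not the actual plugin code):

```ruby
# Skip blank or nil completions instead of appending them to the
# conversation history. (Hypothetical sketch, not the actual
# discourse-ai implementation.)
def append_bot_reply(history, reply)
  # A blank reply is not a valid bot response; leave history untouched.
  return history if reply.nil? || reply.strip.empty?

  history << { role: "assistant", content: reply }
end
```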
2023-09-15 07:02:37 +10:00
| Path | Latest commit | Date |
| --- | --- | --- |
| commands | FEATURE: AI Helper endpoint to generate a thumbnail from text. (#224) | 2023-09-14 12:53:44 -03:00 |
| jobs/regular | FIX: automatic bot titles missing sometime (#151) | 2023-08-24 07:20:24 +10:00 |
| personas | FIX: Made bot more robust (#226) | 2023-09-14 16:46:56 +10:00 |
| anthropic_bot_spec.rb | FIX: cut completion short after function call is found (#182) | 2023-09-05 10:37:58 +10:00 |
| bot_spec.rb | FIX: properly truncate !command prompts (#227) | 2023-09-15 07:02:37 +10:00 |
| entry_point_spec.rb | FEATURE: port to use claude-2 for chat bot (#114) | 2023-07-27 11:24:44 +10:00 |
| open_ai_bot_spec.rb | DEV: Fix rspec-expectations warnings (#228) | 2023-09-14 17:50:13 +02:00 |