discourse-ai/lib/shared/inference
Commit 316ea9624e by Sam
FIX: properly truncate !command prompts (#227)
* FIX: properly truncate !command prompts

### What is going on here?

Prior to this change, when a command was issued by the LLM, it
could hallucinate a continuation, e.g.:

```
This is what tags are

!tags

some nonsense here
```

This change introduces safeguards so that `some nonsense here` does not
creep into the prompt history, poisoning the LLM results.
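The safeguard amounts to truncating the reply at the command invocation before it enters the prompt history. A minimal sketch of the idea (the method name and regex here are illustrative assumptions, not the actual implementation):

```ruby
# Hypothetical sketch: drop everything after the first !command line so a
# hallucinated continuation never reaches prompt history.
def truncate_command_reply(reply)
  lines = reply.lines
  # Find the first line that looks like a bot command such as "!tags".
  command_index = lines.index { |line| line.strip.match?(/\A![a-z_]+/) }
  return reply if command_index.nil?

  # Keep text up to and including the command itself; discard the rest.
  lines[0..command_index].join
end

truncated = truncate_command_reply("This is what tags are\n\n!tags\n\nsome nonsense here\n")
```

With the example reply from above, `truncated` ends at the `!tags` line, so the trailing nonsense is never fed back to the model.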

This in effect grounds the LLM much better, so it forgets less
about command results.

The change only impacts Claude at the moment, but will also improve
behavior for Llama 2 in the future.

Also, this makes it significantly easier to test the bot framework
without an LLM, because we avoid a lot of complex stubbing.

* Blank is not a valid bot response; do not inject it into the prompt.
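The blank-response fix is a simple guard before appending to the history. A sketch, assuming a whitespace-only string counts as blank (the helper name is hypothetical):

```ruby
# Hypothetical guard: a reply that is empty or whitespace-only should never
# be injected into the prompt history as a bot turn.
def valid_bot_reply?(reply)
  !reply.to_s.strip.empty?
end

history = []
["  ", "", "This is what tags are"].each do |reply|
  history << { role: "assistant", content: reply } if valid_bot_reply?(reply)
end
```

Only the non-blank reply survives, so empty turns never pollute subsequent completions.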
2023-09-15 07:02:37 +10:00
| File | Latest commit | Date |
| --- | --- | --- |
| anthropic_completions.rb | FEATURE: Use stop_sequences for faster HyDE searches with Claude (#203) | 2023-09-06 10:06:31 -03:00 |
| discourse_classifier.rb | FEATURE: Handle invalid media in NSFW module (#57) | 2023-05-11 15:35:39 -03:00 |
| discourse_reranker.rb | DEV: DiscourseAI -> DiscourseAi rename to have consistent folders and files (#9) | 2023-03-14 16:03:50 -03:00 |
| function.rb | FEATURE: add initial support for personas (#172) | 2023-08-30 16:15:03 +10:00 |
| function_list.rb | FIX: properly truncate !command prompts (#227) | 2023-09-15 07:02:37 +10:00 |
| hugging_face_text_generation.rb | FEATURE: HyDE-powered semantic search. (#136) | 2023-09-05 11:08:23 -03:00 |
| openai_completions.rb | FIX: setting explorer was exceeding token budget | 2023-09-01 11:48:51 +10:00 |
| openai_embeddings.rb | FEATURE: Add Azure cognitive service support (#93) | 2023-06-21 10:39:51 +10:00 |
| stability_generator.rb | FEATURE: add support for final stable diffusion xl model (#122) | 2023-08-02 16:53:28 -03:00 |