Sam 03fc94684b
FIX: AI helper not working correctly with mixtral (#399)
* FIX: AI helper not working correctly with mixtral

This PR introduces a new method on the generic LLM class called #generate

It will replace the existing completion! implementation.

#generate introduces a uniform way to pass temperature, max_tokens, and stop_sequences

LLM implementers then need to implement #normalize_model_params to map these
generic names onto the parameters their specific endpoint expects (sketched below)
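
A minimal sketch of the idea, assuming illustrative class names (not the plugin's actual code) and using Anthropic's legacy "max_tokens_to_sample" parameter as an example of an endpoint-specific rename:

```ruby
# Illustrative sketch only -- class names and method bodies are assumptions,
# not the plugin's actual implementation.
class GenericLlm
  # Callers always use the generic names: temperature, max_tokens, stop_sequences.
  def generate(prompt, temperature: nil, max_tokens: nil, stop_sequences: nil)
    model_params = normalize_model_params(
      temperature: temperature,
      max_tokens: max_tokens,
      stop_sequences: stop_sequences,
    )
    completion!(prompt, model_params)
  end

  # Each LLM implementer overrides this to rename the generic keys to
  # whatever its specific endpoint expects.
  def normalize_model_params(params)
    raise NotImplementedError
  end
end

class AnthropicLlm < GenericLlm
  def normalize_model_params(params)
    params = params.compact
    # Anthropic's legacy text completion endpoint names the token limit
    # "max_tokens_to_sample" instead of "max_tokens".
    params[:max_tokens_to_sample] = params.delete(:max_tokens) if params.key?(:max_tokens)
    params
  end

  def completion!(prompt, model_params)
    # ...perform the HTTP request with the endpoint-specific model_params...
  end
end
```

Keeping the rename inside #normalize_model_params means callers never need to know which backend they are talking to.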

This also adds temperature and stop_sequences to completion_prompts, which
allows for much more robust completion prompts (see the usage sketch below)
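
A rough usage sketch, assuming illustrative field names and stop markers rather than the actual completion_prompts schema, of a prompt record carrying its own sampling settings into #generate:

```ruby
# Hypothetical sketch -- the Struct stands in for a completion_prompts row,
# and the field names are assumptions for illustration.
CompletionPrompt = Struct.new(:name, :messages, :temperature, :stop_sequences, keyword_init: true)

translate = CompletionPrompt.new(
  name: "translate",
  messages: [{ role: "user", content: "Translate to French: Hello" }],
  temperature: 0.2,              # low temperature keeps translations literal
  stop_sequences: ["</output>"], # cut the reply off at a known marker
)

# The helper forwards the prompt's own settings into the generic #generate
# (reusing the AnthropicLlm sketch above):
llm = AnthropicLlm.new
llm.generate(
  translate.messages,
  temperature: translate.temperature,
  stop_sequences: translate.stop_sequences,
)
```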

* Port everything over to #generate

* Fix translation

- On Anthropic, the output no longer contains a stray "This is your translation:" preamble
- On Mixtral, translation now actually works

* Fix Markdown table generation as well
2024-01-04 09:53:47 -03:00

Discourse AI Plugin

Plugin Summary

For more information, please see: https://meta.discourse.org/t/discourse-ai/259214?u=falco
