03fc94684b
* FIX: AI helper not working correctly with mixtral

This PR introduces a new function on the generic LLM called #generate, which will replace the implementation of completion!. #generate introduces a new way to pass temperature, max_tokens and stop_sequences; LLM implementers then need to implement #normalize_model_params so that the generic parameter names are mapped onto the LLM-specific endpoint. This also adds temperature and stop_sequences to completion prompts, which allows for much more robust completion prompts.

* port everything over to #generate
* Fix translation
  - On Anthropic this no longer throws a random "This is your translation:"
  - On mixtral this actually works
* fix markdown table generation as well
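The #generate / #normalize_model_params split described above can be sketched as follows. This is a minimal illustration, not the actual plugin code: the class names, the `max_tokens_to_sample` rename, and the `perform_completion` helper are assumptions made for the example; only the #generate and #normalize_model_params method names come from the commit message.

```ruby
# Hypothetical sketch: a generic #generate that accepts provider-agnostic
# parameter names, and a per-provider #normalize_model_params hook that
# renames them for the specific endpoint.
class LlmBase
  def generate(prompt, temperature: nil, max_tokens: nil, stop_sequences: nil)
    params = {
      temperature: temperature,
      max_tokens: max_tokens,
      stop_sequences: stop_sequences,
    }.compact # drop options the caller did not set

    perform_completion(prompt, normalize_model_params(params))
  end

  # Each LLM implementation maps the generic names onto its own API.
  def normalize_model_params(params)
    raise NotImplementedError
  end

  private

  # Placeholder: a real implementation would call the provider's HTTP API.
  def perform_completion(prompt, params)
    { prompt: prompt, params: params }
  end
end

class AnthropicLlm < LlmBase
  # Illustrative mapping: Anthropic's older completion endpoint called the
  # token limit max_tokens_to_sample, so only that key needs renaming here.
  def normalize_model_params(params)
    params = params.dup
    if params.key?(:max_tokens)
      params[:max_tokens_to_sample] = params.delete(:max_tokens)
    end
    params
  end
end
```

Callers then stay provider-agnostic, e.g. `AnthropicLlm.new.generate("Hi", temperature: 0.2, max_tokens: 100)`, and each implementation owns its own parameter spelling.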