discourse-ai/spec/lib/completions/endpoints
Sam 6623928b95
FIX: call after tool calls failing on OpenAI / Gemini (#599)
A recent change meant that the llm instance got cached internally; repeat calls
to inference would cache data in the Endpoint object, leading the model to
failures.

Both Gemini and OpenAI expect a clean endpoint object because they
set data on it.

This amends internals to make sure llm.generate will always operate
on clean objects.
2024-05-01 17:50:58 +10:00
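The failure mode described above can be sketched in a few lines of Ruby. The class and method names below are hypothetical, not the actual Discourse AI internals: the point is only that an endpoint which accumulates per-request state breaks on the second call when the same instance is reused, and that constructing a fresh endpoint inside generate avoids this.

```ruby
# Hypothetical sketch of the bug: an Endpoint that sets per-request
# state (e.g. tool-call bookkeeping) cannot safely be reused.
class Endpoint
  def initialize
    @parameters = {} # per-request state
  end

  def perform!(prompt)
    raise "stale state from a previous call" if @parameters.any?

    @parameters[:prompt] = prompt
    "response for #{prompt}"
  end
end

# Buggy shape: the llm caches one endpoint and reuses it across calls.
class CachingLlm
  def initialize
    @endpoint = Endpoint.new
  end

  def generate(prompt)
    @endpoint.perform!(prompt)
  end
end

# Fixed shape: generate always operates on a clean endpoint object.
class CleanLlm
  def generate(prompt)
    Endpoint.new.perform!(prompt)
  end
end

buggy = CachingLlm.new
buggy.generate("first call")
second = begin
  buggy.generate("call after tool call")
rescue RuntimeError => e
  e.message
end

fixed = CleanLlm.new
fixed.generate("first call")
ok = fixed.generate("call after tool call")
```

With the cached endpoint, the second inference call hits the leftover state from the first; with a fresh endpoint per call, repeated calls (such as a follow-up after a tool call) succeed.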
anthropic_spec.rb FEATURE: add Claude 3 Haiku bot support (#552) 2024-04-03 16:06:27 +11:00
aws_bedrock_spec.rb FEATURE: add Claude 3 sonnet/haiku support for Amazon Bedrock (#534) 2024-03-19 06:48:46 +11:00
cohere_spec.rb FEATURE: Cohere Command R support (#558) 2024-04-11 07:24:17 +10:00
endpoint_compliance.rb FIX: more robust function call support (#581) 2024-04-19 06:54:54 +10:00
gemini_spec.rb FEATURE: Add GitHub Helper AI Bot persona and tools (#513) 2024-03-08 06:37:23 +11:00
hugging_face_spec.rb DEV: Fix new Rubocop offenses 2024-03-06 15:23:29 +01:00
open_ai_spec.rb FIX: call after tool calls failing on OpenAI / Gemini (#599) 2024-05-01 17:50:58 +10:00
vllm_spec.rb DEV: Fix new Rubocop offenses 2024-03-06 15:23:29 +01:00