discourse-ai/lib/completions/endpoints
Latest commit: f622e2644f by Roman Rizzi (2024-06-25 08:26:30 +10:00)
FEATURE: Store provider-specific parameters. (#686)
Previously, we stored request parameters like the OpenAI organization and Bedrock's access key and region as site settings. This change stores them in the `llm_models` table instead, letting us drop more settings while also becoming more flexible.
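The commit above moves per-provider request parameters (e.g. an OpenAI organization id or a Bedrock region) off global site settings and onto each model record. A minimal, self-contained sketch of that idea is below — it is not the actual discourse-ai code: the `provider_params` attribute and `lookup_custom_param` method names are illustrative assumptions, and a real implementation would back this with an ActiveRecord model and a serialized database column.

```ruby
# Illustrative sketch only: provider-specific parameters scoped to a model
# record instead of global site settings. Names are assumptions, not the
# real discourse-ai schema.
class LlmModel
  attr_reader :name, :provider, :provider_params

  def initialize(name:, provider:, provider_params: {})
    @name = name
    @provider = provider
    # In a real app this hash would be a serialized JSON column on llm_models.
    @provider_params = provider_params
  end

  # Fetch a provider-specific parameter for this model, e.g. a Bedrock
  # region, without consulting any global setting.
  def lookup_custom_param(key)
    provider_params[key.to_s]
  end
end

model = LlmModel.new(
  name: "claude-3-sonnet",
  provider: "aws_bedrock",
  provider_params: { "region" => "us-east-1" },
)

puts model.lookup_custom_param(:region) # => "us-east-1"
```

The payoff of this shape is that two models on the same provider can carry different credentials or regions, which a single site setting cannot express.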
anthropic.rb FEATURE: LLM presets for model creation (#681) 2024-06-21 17:32:15 +10:00
aws_bedrock.rb FEATURE: Store provider-specific parameters. (#686) 2024-06-25 08:26:30 +10:00
base.rb FIX: switch off native tools on Anthropic Claude Opus (#659) 2024-06-07 10:52:01 -03:00
canned_response.rb FEATURE: GPT4o support and better auditing (#618) 2024-05-14 13:28:46 +10:00
cohere.rb FEATURE: Add native Cohere tool support (#655) 2024-06-04 08:59:15 +10:00
fake.rb FEATURE: GPT4o support and better auditing (#618) 2024-05-14 13:28:46 +10:00
gemini.rb FIX: when creating an llm we were not creating user (#685) 2024-06-24 09:59:42 +10:00
hugging_face.rb FEATURE: Set endpoint credentials directly from LlmModel. (#625) 2024-05-16 09:50:22 -03:00
ollama.rb FEATURE: Set endpoint credentials directly from LlmModel. (#625) 2024-05-16 09:50:22 -03:00
open_ai.rb FEATURE: Store provider-specific parameters. (#686) 2024-06-25 08:26:30 +10:00
vllm.rb DEV: Rewire AI bot internals to use LlmModel (#638) 2024-06-18 14:32:14 -03:00