discourse-ai/spec

Latest commit f622e2644f by Roman Rizzi (2024-06-25 08:26:30 +10:00):
FEATURE: Store provider-specific parameters. (#686)

Previously, we stored request parameters such as the OpenAI organization and Bedrock's access key and region as site settings. This change stores them in the `llm_models` table instead, letting us drop more site settings while making per-model configuration more flexible.
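The commit description implies that provider-specific request parameters now live on the `LlmModel` record rather than in global site settings. The sketch below illustrates that shape only; the migration class, the `provider_params` column name, and the `provider` lookup values are assumptions for illustration, not necessarily the plugin's actual schema.

```ruby
# Illustrative sketch: assumes a serialized `provider_params` column on
# `llm_models`. Names are hypothetical, not the plugin's actual code.
class AddProviderParamsToLlmModels < ActiveRecord::Migration[7.0]
  def change
    # One JSON column per model record replaces a set of global site settings.
    add_column :llm_models, :provider_params, :jsonb
  end
end

# Provider-specific request parameters then travel with the model record:
bedrock = LlmModel.find_by(provider: "aws_bedrock")
bedrock&.update!(
  provider_params: {
    access_key_id: "AKIA...", # placeholder; previously a site setting
    region: "us-east-1",      # previously a site setting
  },
)

openai = LlmModel.find_by(provider: "open_ai")
openai&.update!(provider_params: { organization: "org-123" })
```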
| Name | Last commit | Date |
|------|-------------|------|
| fabricators | FEATURE: Configurable LLMs. (#606) | 2024-05-13 12:46:42 -03:00 |
| fixtures | FIX: Load categories from search response (#612) | 2024-05-14 17:13:25 +03:00 |
| jobs | DEV: Transition "Select model" settings to only use LlmModels (#675) | 2024-06-19 18:01:35 -03:00 |
| lib | FIX: when creating an llm we were not creating user (#685) | 2024-06-24 09:59:42 +10:00 |
| models | DEV: Rewire AI bot internals to use LlmModel (#638) | 2024-06-18 14:32:14 -03:00 |
| requests | FEATURE: Store provider-specific parameters. (#686) | 2024-06-25 08:26:30 +10:00 |
| serializers | DEV: Fix new Rubocop offenses | 2024-03-06 15:23:29 +01:00 |
| shared | FEATURE: Set endpoint credentials directly from LlmModel. (#625) | 2024-05-16 09:50:22 -03:00 |
| support | FIX: typo causing text_embedding_3_large to fail (#460) | 2024-02-05 11:16:36 +11:00 |
| system | FIX: when creating an llm we were not creating user (#685) | 2024-06-24 09:59:42 +10:00 |
| tasks | FIX: Filter soft-deleted topics when backfilling sentiment (#527) | 2024-03-12 21:01:24 -03:00 |
| plugin_helper.rb | DEV: Transition "Select model" settings to only use LlmModels (#675) | 2024-06-19 18:01:35 -03:00 |
| plugin_spec.rb | DEV: Transition "Select model" settings to only use LlmModels (#675) | 2024-06-19 18:01:35 -03:00 |