discourse/discourse-ai
mirror of https://github.com/discourse/discourse-ai.git synced 2025-07-01 20:12:15 +00:00
discourse-ai/spec/fabricators/llm_quota_fabricator.rb

10 lines
157 B
Ruby

FEATURE: llm quotas (#1047)

Adds a comprehensive quota management system for LLM models that allows:

- Setting per-group (applied per user in the group) token and usage limits with configurable durations
- Tracking and enforcing token/usage limits across user groups
- Quota reset periods (hourly, daily, weekly, or custom)
- Admin UI for managing quotas with real-time updates

This system provides granular control over LLM API usage by allowing admins to define limits on both total tokens and number of requests per group. It supports multiple concurrent quotas per model and automatically handles quota resets.

Co-authored-by: Keegan George <kgeorge13@gmail.com>
2025-01-14 15:54:09 +11:00
# frozen_string_literal: true

# Test-data fabricator for LlmQuota records: a quota is scoped to a group
# and an LLM model, and is bounded by a token budget, a request count, and
# a reset window expressed in seconds.
Fabricator(:llm_quota) do
  group
  llm_model
  max_tokens { 1000 }
  max_usages { 10 }
  duration_seconds { 1.day.to_i } # quota window: one day, in seconds
end
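The enforcement behavior the commit message describes (token and request-count limits that reset after `duration_seconds`) could be sketched in plain Ruby roughly as follows. This is a hypothetical `LlmQuotaSketch` class written for illustration, not the plugin's actual implementation:

```ruby
# Minimal sketch of per-quota enforcement: a window starts when the quota
# is created and resets once duration_seconds have elapsed; within a
# window, both total tokens and total requests are capped.
class LlmQuotaSketch
  def initialize(max_tokens:, max_usages:, duration_seconds:)
    @max_tokens = max_tokens
    @max_usages = max_usages
    @duration = duration_seconds
    reset(Time.now)
  end

  # Would a request consuming `tokens` fit within the current window?
  def allow?(tokens, now: Time.now)
    reset(now) if now - @window_start >= @duration
    @tokens_used + tokens <= @max_tokens && @usages + 1 <= @max_usages
  end

  # Record a request, raising if it would exceed either limit.
  def record!(tokens, now: Time.now)
    raise "quota exceeded" unless allow?(tokens, now: now)
    @tokens_used += tokens
    @usages += 1
  end

  private

  def reset(now)
    @window_start = now
    @tokens_used = 0
    @usages = 0
  end
end
```

With the fabricator's defaults (1000 tokens, 10 usages, one-day window), a 500-token request followed by a 400-token request would leave room for at most 100 more tokens until the window resets.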