BAAI/bge-m3 is an interesting model: it is multilingual and has a context size of 8192 tokens. Even with a 16x larger context, computing its embeddings is only about 4x slower in the worst case. Also includes a minor refactor of the rake task, adding options to set the model and the concurrency level when running the backfill task.
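
For illustration only, here is a minimal sketch of what a backfill task with model and concurrency arguments could look like. The task name, argument handling, and the `generate_embedding_for` helper are assumptions made for this example, not the plugin's actual implementation.

```ruby
# Hypothetical sketch: a rake task that accepts a model name and a
# concurrency level. The task name, arguments, and generate_embedding_for
# are illustrative placeholders, not discourse-ai's real code.
desc "Backfill embeddings, e.g. rake ai:embeddings:backfill[bge-m3,4]"
task "ai:embeddings:backfill", %i[model concurrency] => [:environment] do |_t, args|
  model = args[:model] || "bge-m3"
  concurrency = (args[:concurrency] || 1).to_i

  # Run up to `concurrency` embedding jobs at a time.
  pool = Concurrent::FixedThreadPool.new(concurrency)

  Topic.find_each do |topic|
    # generate_embedding_for stands in for whatever computes and stores
    # the embedding for one record with the chosen model.
    pool.post { generate_embedding_for(topic, model) }
  end

  pool.shutdown
  pool.wait_for_termination
end
```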
## bert-base-uncased.json

Licensed under Apache License

## claude-v1-tokenization.json

Licensed under MIT License

## all-mpnet-base-v2.json

Licensed under Apache License

## llama-2-70b-chat-hf

Licensed under LLAMA 2 COMMUNITY LICENSE AGREEMENT

## multilingual-e5-large

Licensed under MIT License

## bge-large-en

Licensed under MIT License

## mixtral

Licensed under Apache 2.0 License

## bge-m3

Licensed under MIT License
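
As a rough illustration of how these definitions are consumed, a file such as `bge-m3.json` can be loaded with the `tokenizers` Ruby gem to count tokens; the relative path and sample string below are assumptions for the example, not taken from the repository.

```ruby
# Rough illustration: load one of the bundled tokenizer definitions and
# count tokens with the "tokenizers" Ruby gem. The path is an assumption
# for this example.
require "tokenizers"

tokenizer = Tokenizers.from_file("tokenizers/bge-m3.json")
encoding  = tokenizer.encode("Hello, Discourse!")

puts encoding.tokens.inspect # subword tokens from the model's vocabulary
puts encoding.ids.size       # token count, e.g. for staying within the 8192-token context
```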