Rafael dos Santos Silva 102f47c1c4
FEATURE: Allow Anthropic inference via AWS Bedrock (#235)
If a module's LLM model is set to claude-2 and the ai_bedrock variables are all present, we will use AWS Bedrock instead of Anthropic's own APIs.
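The dispatch described above could be sketched roughly as follows. This is an illustrative sketch only: the `SETTINGS` hash, the setting names, and the `completion_backend` helper are assumptions for demonstration, not the plugin's actual code.

```ruby
# Hypothetical settings hash standing in for the plugin's site settings.
# The ai_bedrock_* keys are assumed names for illustration.
SETTINGS = {
  ai_bedrock_access_key_id: "AKIA...",
  ai_bedrock_secret_access_key: "secret",
  ai_bedrock_region: "us-east-1"
}

# Pick a backend: if the model is claude-2 and all ai_bedrock
# variables are present, route the request to AWS Bedrock;
# otherwise fall back to Anthropic's own API.
def completion_backend(model)
  bedrock_configured =
    SETTINGS[:ai_bedrock_access_key_id] &&
    SETTINGS[:ai_bedrock_secret_access_key] &&
    SETTINGS[:ai_bedrock_region]

  if model == "claude-2" && bedrock_configured
    :aws_bedrock
  else
    :anthropic_api
  end
end
```

With the Bedrock settings filled in, `completion_backend("claude-2")` returns `:aws_bedrock`; any other model keeps using the Anthropic API path.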

This is quite hacky, but it will let us test the waters of the AWS Bedrock early access with every module.

This situation of "same module, completely different API" goes well beyond what we had with the OpenAI/Azure separation, so it's more food for thought for when we start working on the LLM abstraction layer later this year.
2023-10-02 12:58:36 -03:00

Discourse AI Plugin

Plugin Summary

For more information, please see: https://meta.discourse.org/t/discourse-ai/259214?u=falco

Languages
Ruby 79.5%
JavaScript 17%
SCSS 2%
CSS 0.6%
HTML 0.5%
Other 0.4%