discourse-ai/lib/completions/tool_call_progress_tracker.rb
Sam 823e8ef490
FEATURE: partial tool call support for OpenAI and Anthropic (#908)
Implement streaming tool calls for Anthropic and OpenAI.

When calling:

llm.generate(..., partial_tool_calls: true) do ...
Partials may contain ToolCall instances with partial: true. These tool calls are partially populated, with their arguments built from the JSON parsed so far.

So for example when performing a search you may get:

ToolCall(..., {search: "hello" })
ToolCall(..., {search: "hello world" })
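A consuming sketch (not part of this commit): the prompt, the user: argument, and the parameters accessor are assumptions for illustration; only partial_tool_calls: true and the partially populated ToolCall shape come from the description above.

    llm.generate(prompt, user: Discourse.system_user, partial_tool_calls: true) do |partial|
      case partial
      when DiscourseAi::Completions::ToolCall
        # arguments grow as more JSON arrives, e.g.
        # { search: "hello" } then { search: "hello world" }
        print "\rsearching for: #{partial.parameters[:search]}"
      else
        print partial # regular streamed text
      end
    end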

The library used to parse the JSON is:

https://github.com/dgraham/json-stream

We use a fork because we need access to the parser's internal buffer.
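Roughly how that buffer is used (a sketch only; the key/value/state/buf interface shown here is inferred from the tracker code below, not from json-stream's documented API):

    parser = DiscourseAi::Completions::JsonStreamingParser.new
    parser.key { |k| puts "key:   #{k}" }
    parser.value { |v| puts "value: #{v}" }

    # Feed an incomplete JSON chunk, as an LLM would stream it.
    parser << '{"search": "hello wo'

    # No value callback has fired yet, but the partial string is
    # visible via the internal buffer exposed by the fork.
    puts parser.buf if parser.state == :start_string # e.g. "hello wo"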

This prepares the internals to perform partial tool calls, but does not implement them yet.
2024-11-14 06:58:24 +11:00


# frozen_string_literal: true

module DiscourseAi
  module Completions
    # Feeds streamed JSON fragments into a streaming parser and notifies
    # the tool call of argument values as they are (partially) parsed.
    class ToolCallProgressTracker
      attr_reader :current_key, :current_value, :tool_call

      def initialize(tool_call)
        @tool_call = tool_call
        @current_key = nil
        @current_value = nil
        @parser = DiscourseAi::Completions::JsonStreamingParser.new

        @parser.key do |k|
          @current_key = k
          @current_value = nil
        end

        @parser.value { |v| tool_call.notify_progress(@current_key, v) if @current_key }
      end

      def <<(json)
        # llm could send broken json
        # in that case just deal with it later
        # don't stream
        return if @broken

        begin
          @parser << json
        rescue DiscourseAi::Completions::ParserError
          @broken = true
          return
        end

        if @parser.state == :start_string && @current_key
          # a string value is mid-stream, notify with the partial buffer contents
          tool_call.notify_progress(@current_key, @parser.buf)
        end

        @current_key = nil if @parser.state == :end_value
      end
    end
  end
end
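As a quick illustration of how the tracker is driven, here is a sketch using a hypothetical stand-in for the tool call object; only the notify_progress(key, value) hook used above is assumed.

    # Hypothetical stand-in; the real ToolCall class lives elsewhere in the library.
    class FakeToolCall
      def notify_progress(key, value)
        puts "#{key} => #{value}"
      end
    end

    tracker = DiscourseAi::Completions::ToolCallProgressTracker.new(FakeToolCall.new)

    # Feed the tool arguments in fragments, as an LLM would stream them.
    tracker << '{"search": "hel'
    tracker << 'lo wor'
    tracker << 'ld"}'
    # Prints progressively longer values for the "search" key,
    # roughly: search => hel, search => hello wor, search => hello world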