
[FEATURE REQ] Add streaming support for protocol methods in ChatClient #546

@shinexyt

Description

Describe the feature or improvement you are requesting

Summary
The OpenAI .NET library currently provides protocol methods for direct access to the REST API with "binary in, binary out" functionality. However, these protocol methods only support non-streaming operations. This feature request proposes adding streaming support for protocol methods to enable compatibility with other LLM providers and custom request handling in streaming scenarios.

Proposed Solution
Add a streaming variant of the protocol method:

public virtual AsyncCollectionResult<BinaryData> CompleteChatStreamingAsync(BinaryContent content, CancellationToken cancellationToken = default)
{
    // Request an SSE response rather than a single JSON body.
    RequestOptions options = cancellationToken.ToRequestOptions(streaming: true);

    // Wrap each SSE event's JSON payload as raw BinaryData, leaving
    // deserialization entirely to the caller.
    return new AsyncSseUpdateCollection<BinaryData>(
        async () => await CompleteChatAsync(content, options).ConfigureAwait(false),
        (jsonElement, _) => BinaryData.FromString(jsonElement.GetRawText()),
        cancellationToken);
}

This method has been tested successfully against several major LLM providers with OpenAI-compatible endpoints, including Gemini, Claude, DeepSeek, and Qwen.
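Consumption of the proposed method could look like the following sketch. Note that CompleteChatStreamingAsync(BinaryContent, CancellationToken) does not exist in the library yet; the client setup and request JSON are placeholders:

```csharp
using System;
using System.ClientModel;
using OpenAI.Chat;

ChatClient client = new("gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

// The caller is responsible for the raw body, including "stream": true.
BinaryContent content = BinaryContent.Create(BinaryData.FromString("""
    {
      "model": "gpt-4o",
      "stream": true,
      "messages": [ { "role": "user", "content": "Say hello." } ]
    }
    """));

// Each update is one raw JSON chunk from the SSE stream, which the
// caller parses itself (e.g. to handle non-OpenAI response shapes).
await foreach (BinaryData update in client.CompleteChatStreamingAsync(content))
{
    Console.WriteLine(update.ToString());
}
```

Returning AsyncCollectionResult&lt;BinaryData&gt; mirrors the typed streaming API's AsyncCollectionResult&lt;StreamingChatCompletionUpdate&gt;, so the protocol and convenience layers stay symmetric.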

If this approach is suitable, I'm ready to submit a PR with the implementation.

Additional context

No response

Metadata

Labels

feature-request (Category): A new feature or enhancement to an existing feature is being requested.
issue-addressed (Workflow): The OpenAI maintainers believe the issue to be addressed and ready to close.
