Allow users to configure embed_batch_size or ThreadPoolExecutor size when calling Client.embed #534

@acompa

Description

It looks like batching was added in #437 - thank you for implementing this, it's very helpful.

I notice that batching, as defined here, depends on a fixed batch size. This can be suboptimal for clients submitting a large number of smaller documents, since we cannot configure the ThreadPoolExecutor size to parallelize many small data payloads. As a result, a client may end up blocking while waiting on many small network responses.

Would it be possible to allow clients to configure either the ThreadPoolExecutor size or the embed_batch_size setting when calling embed?
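To illustrate the kind of knobs I mean, here is a minimal client-side sketch (not the library's actual implementation): `embed_fn` stands in for `Client.embed`, and `batch_size` / `max_workers` are the hypothetical parameters this issue asks to expose.

```python
from concurrent.futures import ThreadPoolExecutor

def embed_in_batches(texts, embed_fn, batch_size=16, max_workers=8):
    """Embed texts in configurable batches, dispatched concurrently.

    embed_fn: callable taking a list of texts and returning one
    embedding per text (a stand-in for the real embed call).
    """
    # Split the input into fixed-size batches.
    batches = [texts[i:i + batch_size] for i in range(0, len(texts), batch_size)]
    # Fan batches out across a configurable thread pool; map() preserves order.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = pool.map(embed_fn, batches)
    # Flatten the per-batch results back into one list of embeddings.
    return [vec for batch in results for vec in batch]
```

With many small documents, a caller could then pick a small `batch_size` and a larger `max_workers` to keep requests in flight instead of blocking serially.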
