VertexAI Embeddings: max_batch_size parameter not properly passed through configuration system #3199

@manuka99

Description

🐛 Describe the bug

Problem Description

When using VertexAI embeddings, the max_batch_size parameter is not passed through to the underlying VertexAIEmbeddings constructor, so the configured batch size is ignored.

Current Behavior

  • Setting max_batch_size=1 in the embedder configuration does not affect the actual batch size used (a reproduction sketch follows this list)
  • The default _MAX_BATCH_SIZE = 250 is always used regardless of configuration
  • This causes issues when trying to reduce batch size for debugging or specific use cases
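
A minimal reproduction sketch. The configuration shape, the "vertexai" provider key, and the model name below are assumptions about the embedder configuration system, not taken verbatim from the project:

```python
# Illustrative embedder configuration -- keys and nesting are assumptions,
# not the project's actual schema.
embedder_config = {
    "provider": "vertexai",
    "config": {
        "model": "text-embedding-004",
        "max_batch_size": 1,  # intended to force single-item batches
    },
}

# Expected: embedding requests are sent one document at a time.
# Observed: the library default of 250 (_MAX_BATCH_SIZE) is used instead,
# because max_batch_size never reaches the VertexAIEmbeddings constructor.
```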

Expected Behavior

  • The max_batch_size parameter should be properly passed through the configuration chain
  • The VertexAIEmbeddings constructor should respect the configured batch size
  • Configuration should override the default _MAX_BATCH_SIZE = 250 (a pass-through sketch follows this list)
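
One way the pass-through could look, sketched against langchain_google_vertexai. The wrapper class, config keys, and fallback value are hypothetical; only VertexAIEmbeddings and the batch_size argument of its embed_documents() come from the library:

```python
# Sketch only -- not the project's actual implementation.
from langchain_google_vertexai import VertexAIEmbeddings


class VertexAIEmbedderWrapper:
    """Hypothetical wrapper showing how max_batch_size could be forwarded."""

    def __init__(self, config: dict):
        self.config = config
        # Keep the configured value; fall back to the library default of 250
        # when the option is absent.
        self.batch_size = config.get("max_batch_size", 250)
        self.client = VertexAIEmbeddings(
            model_name=config.get("model", "text-embedding-004"),
        )

    def embed(self, texts: list[str]) -> list[list[float]]:
        # Forward the configured batch size on every call so a setting such as
        # max_batch_size=1 actually limits request size.
        return self.client.embed_documents(texts, batch_size=self.batch_size)
```

With a pass-through of this shape, setting max_batch_size=1 in the configuration would result in one document per embedding request instead of the default 250.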
