This repository was archived by the owner on Jan 27, 2026. It is now read-only.

Create Pulumi Component for Azure Resource Layer to Support vLLM #23

@kkiani

Description:

Following the discussion on integrating LLM/foundation-model AI applications, this issue covers building a Pulumi component that defines the Azure resources needed to serve models with vLLM through Damavand. The goal is to simplify infrastructure setup for developers building AI applications on Azure, using backends such as Azure Synapse or a VM to host vLLM.

Key Tasks:

  • Implement the Pulumi component to provision Azure resources required for vLLM, such as VMs, storage accounts, and networking.
  • Ensure the component follows security best practices.

Resources:

  • Research Azure AI services and other potential solutions for hosting vLLM models on Azure.
  • Explore open-source tools or tutorials that support AI model hosting on Azure infrastructure.

Acceptance Criteria:

  • A functional Pulumi component that provisions all necessary Azure infrastructure to support vLLM models.
  • The component provides configuration arguments for parameterizing its behavior.
  • The component implements the open/closed principle by following the lazy loading pattern.
  • Docstrings on the methods and classes.
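The two design criteria above (configuration arguments plus lazy, open/closed resource creation) could be sketched as follows. This is a framework-agnostic illustration, not the actual Damavand implementation: the class and argument names (`AzureVllmComponent`, `VllmComponentArgs`) are hypothetical, and the cached properties return plain dictionaries where the real component would declare Pulumi resources such as a storage account or VM.

```python
from dataclasses import dataclass
from functools import cached_property


@dataclass
class VllmComponentArgs:
    """Hypothetical configuration arguments for the vLLM component."""
    region: str = "westeurope"
    vm_size: str = "Standard_NC6s_v3"  # a GPU-capable size, as an example
    storage_sku: str = "Standard_LRS"


class AzureVllmComponent:
    """Sketch of a component whose resources are created lazily.

    Each resource sits behind a cached property, so it is only created
    on first access, and new resource kinds can be added as new
    properties without modifying existing ones (open/closed principle).
    """

    def __init__(self, name: str, args: VllmComponentArgs) -> None:
        self.name = name
        self.args = args

    @cached_property
    def storage_account(self) -> dict:
        # In the real component this would declare a storage resource
        # via Pulumi; here a plain description stands in for it.
        return {"name": f"{self.name}-storage", "sku": self.args.storage_sku}

    @cached_property
    def vm(self) -> dict:
        # Likewise, this stands in for the VM hosting the vLLM server.
        return {"name": f"{self.name}-vm", "size": self.args.vm_size}


component = AzureVllmComponent("vllm-demo", VllmComponentArgs())
print(component.vm["size"])  # the VM description materializes only here
```

Because `cached_property` memoizes each resource, repeated accesses reuse the same object, which mirrors how a Pulumi component should declare each resource exactly once.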

Metadata

Assignees

No one assigned

    Labels

    feature/integration — Changes and suggestions for currently integrated or under-integration applications
    integration/openllm
