This repository was archived by the owner on Jan 27, 2026. It is now read-only.
Create Pulumi Component for Azure Resource Layer to Support vLLM #23
Open
Labels: feature/integration, integration/openllm
Description:
Following the discussion on integrating LLM/Foundation Model AI applications, this issue focuses on building a Pulumi component to define the necessary Azure resources for deploying a vLLM model using Damavand. The goal is to simplify infrastructure setup for developers building AI applications on Azure, using resources like Azure Synapse or a VM backend to host vLLM models.
Key Tasks:
- Implement the Pulumi component to provision Azure resources required for vLLM, such as VMs, storage accounts, and networking.
- Ensure the component follows security best practices.
Resources:
- Research Azure AI and other potential solutions for hosting vLLM models on Azure.
- Explore open-source tools or tutorials that support AI model hosting on Azure infrastructure.
Acceptance Criteria:
- A functional Pulumi component that provisions all necessary Azure infrastructure to support vLLM models.
- The component provides config args for parametrizing the component.
- The component implements the open/closed principle by following the lazy loading pattern.
- Docstrings are provided for all methods and classes.
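The config-args and lazy-loading criteria above could be sketched as follows. This is a minimal illustration, not Damavand's actual API: the names (`VllmComponentArgs`, `AzureVllm`) and the default VM SKU are hypothetical, and the real Azure provisioning calls (e.g. via `pulumi_azure_native`) are stubbed out with plain strings so the shape of the pattern is visible.

```python
from dataclasses import dataclass
from functools import cached_property


@dataclass
class VllmComponentArgs:
    """Config args for parametrizing the component (illustrative defaults)."""
    vm_size: str = "Standard_NC6s_v3"   # hypothetical GPU VM SKU
    storage_tier: str = "Standard_LRS"
    region: str = "westeurope"


class AzureVllm:
    """Sketch of the component: each resource is created only on first
    access (lazy loading), so a subclass can override one resource
    without touching the others (open/closed principle)."""

    def __init__(self, name: str, args: VllmComponentArgs) -> None:
        self.name = name
        self.args = args

    @cached_property
    def storage_account(self) -> str:
        # A real component would provision an Azure storage account here.
        return f"{self.name}-storage ({self.args.storage_tier})"

    @cached_property
    def vm(self) -> str:
        # Lazily depends on the storage account: accessing `vm` first
        # still triggers storage_account creation exactly once.
        return f"{self.name}-vm ({self.args.vm_size}) -> {self.storage_account}"
```

A subclass that needs, say, premium storage would override only the `storage_account` property; the `vm` definition is reused unchanged.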