Commit 143678e

Csrayz and hmellor authored
Update docs/contributing/profiling.md
Co-authored-by: Harry Mellor <[email protected]>
1 parent 16a9d8e commit 143678e


docs/contributing/profiling.md

Lines changed: 4 additions & 7 deletions
@@ -7,13 +7,10 @@
We support tracing vLLM workers using the `torch.profiler` module. You can enable tracing by setting the `VLLM_TORCH_PROFILER_DIR` environment variable to the directory where you want to save the traces: `VLLM_TORCH_PROFILER_DIR=/mnt/traces/`. Additionally, you can control the profiling content by specifying the following environment variables:

-`VLLM_TORCH_PROFILER_RECORD_SHAPES=1` to enable recording Tensor Shapes, off by default
-
-`VLLM_TORCH_PROFILER_WITH_PROFILE_MEMORY=1` to record memory, off by default
-
-`VLLM_TORCH_PROFILER_WITH_STACK=1` to enable recording stack information, on by default
-
-`VLLM_TORCH_PROFILER_WITH_FLOPS=1` to enable recording FLOPs, off by default
+- `VLLM_TORCH_PROFILER_RECORD_SHAPES=1` to enable recording Tensor Shapes, off by default
+- `VLLM_TORCH_PROFILER_WITH_PROFILE_MEMORY=1` to record memory, off by default
+- `VLLM_TORCH_PROFILER_WITH_STACK=1` to enable recording stack information, on by default
+- `VLLM_TORCH_PROFILER_WITH_FLOPS=1` to enable recording FLOPs, off by default

The OpenAI server also needs to be started with the `VLLM_TORCH_PROFILER_DIR` environment variable set.
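
For context, these variables can be combined when launching the OpenAI-compatible server. Below is a minimal sketch, assuming `vllm serve` as the entrypoint and using a placeholder model name; only `VLLM_TORCH_PROFILER_DIR` comes from this diff's required setup, the rest are the optional toggles listed above:

```bash
# Minimal sketch: <your-model> is a placeholder; /mnt/traces/ is the example
# trace directory from the docs text.
# VLLM_TORCH_PROFILER_DIR turns tracing on; the remaining variables toggle
# the optional recording features (shapes, memory, stack, FLOPs).
VLLM_TORCH_PROFILER_DIR=/mnt/traces/ \
VLLM_TORCH_PROFILER_RECORD_SHAPES=1 \
VLLM_TORCH_PROFILER_WITH_PROFILE_MEMORY=1 \
VLLM_TORCH_PROFILER_WITH_STACK=1 \
VLLM_TORCH_PROFILER_WITH_FLOPS=1 \
vllm serve <your-model>
```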
