diff --git a/docs/install/3rd-party/dgl-install.rst b/docs/install/3rd-party/dgl-install.rst
index aeffe607..ef3fbf03 100644
--- a/docs/install/3rd-party/dgl-install.rst
+++ b/docs/install/3rd-party/dgl-install.rst
@@ -1,26 +1,22 @@
.. meta::
:description: Install Deep Graph Library (DGL) on ROCm
- :keywords: installation, docker, DGL, AMD, ROCm
+ :keywords: installation, docker, DGL, deep learning, AMD, ROCm
********************************************************************************
-DGL on ROCm
+DGL on ROCm installation
********************************************************************************
Deep Graph Library `(DGL) `_ is an easy-to-use, high-performance and scalable
-Python package for deep learning on graphs. DGL is framework agnostic, meaning
-if a deep graph model is a component in an end-to-end application, the rest of
-the logic is implemented using PyTorch.
+Python package for deep learning on graphs.
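+
+As a quick illustration (a minimal sketch, assuming a working DGL install with the
+default PyTorch backend), a graph and its node features can be built directly from tensors:
+
+.. code-block:: python
+
+   import dgl
+   import torch
+
+   # A graph with three nodes and two directed edges: 0 -> 1 and 1 -> 2.
+   g = dgl.graph((torch.tensor([0, 1]), torch.tensor([1, 2])))
+   g.ndata["h"] = torch.randn(g.num_nodes(), 4)  # attach 4-dimensional node features
+   print(g)
+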
-.. |br| raw:: html
+This topic covers setup instructions and the necessary files to build, test, and run
+DGL with ROCm support in a Docker environment. To learn more about DGL on ROCm,
+including its use cases and recommendations, as well as hardware and software compatibility,
+see :doc:`rocm:compatibility/ml-compatibility/dgl-compatibility`.
-
+.. note::
-For hardware, software, and third-party framework compatibility between ROCm and DGL,
-see the following resources:
-
-* :ref:`system-requirements`
-
-* :doc:`rocm:compatibility/ml-compatibility/dgl-compatibility`
+ DGL is supported on ROCm 6.4.0.
Install DGL
================================================================================
@@ -73,7 +69,7 @@ Docker image support
AMD validates and publishes ready-made `DGL Docker images `_
with ROCm backends on Docker Hub. The following Docker image tags and associated inventories are
-validated for ROCm 6.4.
+validated for ROCm 6.4.0.
.. tab-set::
diff --git a/docs/install/3rd-party/flashinfer-install.rst b/docs/install/3rd-party/flashinfer-install.rst
index d0f45734..8db04117 100644
--- a/docs/install/3rd-party/flashinfer-install.rst
+++ b/docs/install/3rd-party/flashinfer-install.rst
@@ -1,6 +1,6 @@
.. meta::
- :description: Installing FlashInfer for ROCm
- :keywords: installation, docker, FlashInfer, AMD, ROCm
+ :description: Install FlashInfer on ROCm
+ :keywords: installation, docker, FlashInfer, deep learning, AMD, ROCm
********************************************************************************
FlashInfer on ROCm installation
@@ -10,7 +10,8 @@ FlashInfer on ROCm installation
for Large Language Models (LLMs) that provides high-performance implementations of graphics
processing unit (GPU) kernels.
-This topic covers installation. To learn more about FlashInfer on ROCm,
+This topic covers setup instructions and the necessary files to build, test, and run
+FlashInfer with ROCm support in a Docker environment. To learn more about FlashInfer on ROCm,
including its use cases and recommendations, as well as hardware and software compatibility,
see :doc:`rocm:compatibility/ml-compatibility/flashinfer-compatibility`.
diff --git a/docs/install/3rd-party/jax-install.rst b/docs/install/3rd-party/jax-install.rst
index e8e4c5f9..d8c1be1e 100644
--- a/docs/install/3rd-party/jax-install.rst
+++ b/docs/install/3rd-party/jax-install.rst
@@ -1,23 +1,36 @@
.. meta::
- :description: JAX on ROCm
- :keywords: installation instructions, building, JAX, AMD, ROCm
+ :description: Install JAX on ROCm
+ :keywords: installation, docker, JAX, deep learning, AMD, ROCm
-***********
-JAX on ROCm
-***********
+*************************************************************************************
+JAX on ROCm installation
+*************************************************************************************
-This directory provides setup instructions and necessary files to build, test, and run JAX with ROCm support in a Docker environment, suitable for both runtime and CI workflows. Explore the following methods to use or build JAX on ROCm.
+`JAX `__ is a library
+for array-oriented numerical computation (similar to NumPy), with automatic differentiation
+and just-in-time (JIT) compilation to enable high-performance machine learning research.
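+
+As a minimal sketch of these two features (assuming a working JAX installation), a
+function can be JIT-compiled and differentiated in a few lines:
+
+.. code-block:: python
+
+   import jax
+   import jax.numpy as jnp
+
+   @jax.jit                      # just-in-time compile with XLA
+   def f(x):
+       return jnp.sum(x ** 2)   # array-oriented computation, NumPy-style
+
+   x = jnp.arange(4.0)
+   print(f(x))                  # 14.0
+   print(jax.grad(f)(x))        # [0. 2. 4. 6.] -- automatic differentiation
+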
-For hardware, software, and third-party framework compatibility between ROCm and JAX, see the following resources:
+This topic covers setup instructions and the necessary files to build, test, and run
+JAX with ROCm support in a Docker environment. To learn more about JAX on ROCm,
+including its use cases and recommendations, as well as hardware and software compatibility,
+see :doc:`rocm:compatibility/ml-compatibility/jax-compatibility`.
-* :ref:`system-requirements`
+Install JAX on ROCm
+======================================================================================
-* :doc:`rocm:compatibility/ml-compatibility/jax-compatibility`
+To install JAX on ROCm, you have the following options:
-Using a prebuilt Docker image
-===========================================
+* :ref:`using-docker-with-jax-pre-installed` **(recommended)**
+* :ref:`build-jax-rocm-docker-image`
+* :ref:`install-jax-rocm-custom-container`
+* :ref:`build-jax-from-source`
-The ROCm JAX team provides prebuilt Docker images, which is the simplest way to use JAX on ROCm. These images are available on Docker Hub and come with JAX configured for ROCm.
+.. _using-docker-with-jax-pre-installed:
+
+Use a prebuilt Docker image with JAX pre-installed
+--------------------------------------------------------------------------------------
+The ROCm JAX team provides prebuilt Docker images, which are the simplest way to use JAX on ROCm.
+These images are available on Docker Hub and come with JAX configured for ROCm.
1. To pull the latest ROCm JAX Docker image, run:
@@ -29,7 +42,7 @@ The ROCm JAX team provides prebuilt Docker images, which is the simplest way to
For specific versions of JAX, review the periodically pushed Docker images at `ROCm JAX on
Docker Hub `_.
-
+
2. Once the image is downloaded, launch a container using the following command:
.. code-block:: bash
@@ -51,13 +64,13 @@ The ROCm JAX team provides prebuilt Docker images, which is the simplest way to
* The ``--shm-size`` parameter allocates shared memory for the container. Adjust it based on your system's resources if needed.
* Replace ``$(pwd)`` with the absolute path to the directory you want to mount inside the container.
-
+
3. Verify the installation of ROCm JAX. See :ref:`jax-verify-installation`.
.. _jax-docker-support:
Docker image support
---------------------
+--------------------------------------------------------------------------------------
AMD validates and publishes ready-made JAX images with ROCm backends on Docker
Hub. The following Docker image tags and associated inventories are validated
@@ -97,8 +110,10 @@ For ``jax-community`` images, see `rocm/jax-community `__.
-Using a ROCm base Docker image and installing JAX
-=================================================
+.. _build-jax-rocm-docker-image:
+
+Use a ROCm base Docker image to install JAX
+--------------------------------------------------------------------------------------
If you prefer to use the ROCm Ubuntu image or already have a ROCm Ubuntu container, follow these steps to install JAX in the container.
@@ -154,8 +169,10 @@ If you prefer to use the ROCm Ubuntu image or already have a ROCm Ubuntu contain
6. Verify the installation of ROCm JAX. See :ref:`jax-verify-installation`.
+.. _install-jax-rocm-custom-container:
+
Install JAX on bare-metal or a custom container
-===============================================
+--------------------------------------------------------------------------------------
Follow these steps if you prefer to install ROCm manually on your host system or in a custom container.
@@ -230,8 +247,10 @@ Follow these steps if you prefer to install ROCm manually on your host system or
[0 1 2 3 4]
-Build ROCm JAX from source
-==========================
+.. _build-jax-from-source:
+
+Build JAX from source
+--------------------------------------------------------------------------------------
Follow these steps to build JAX with ROCm support from source.
@@ -280,16 +299,15 @@ Follow these steps to build JAX with ROCm support from source.
source .venv/bin/activate
python3 setup.py develop --user && python3 -m pip install dist/*.whl
-Simplified build script
------------------------
+.. tip::
-For a streamlined build process, consider using the ``jax/build/rocm/dev_build_rocm.py`` script. See
-``__ for more information.
+ For a streamlined build process, consider using the ``jax/build/rocm/dev_build_rocm.py`` script. See
+ ``__ for more information.
.. _jax-verify-installation:
-Testing your JAX installation with ROCm
-=======================================
+Test the JAX installation
+======================================================================================
After launching the container, test whether JAX detects ROCm devices as expected:
diff --git a/docs/install/3rd-party/llama-cpp-install.rst b/docs/install/3rd-party/llama-cpp-install.rst
index 26f18468..9e160071 100644
--- a/docs/install/3rd-party/llama-cpp-install.rst
+++ b/docs/install/3rd-party/llama-cpp-install.rst
@@ -1,6 +1,6 @@
.. meta::
- :description: Installing llama.cpp for ROCm
- :keywords: installation instructions, llama.cpp, AMD, ROCm, GGML
+ :description: Install llama.cpp on ROCm
+ :keywords: installation, llama.cpp, docker, deep learning, AMD, ROCm, GGML
********************************************************************************
llama.cpp on ROCm installation
@@ -8,10 +8,10 @@ llama.cpp on ROCm installation
`llama.cpp `__ is an open-source framework
for Large Language Model (LLM) inference that runs on both central processing units
-(CPUs) and graphics processing units (GPUs). It is written in plain C/C++, providing
-a simple, dependency-free setup.
+(CPUs) and graphics processing units (GPUs).
-This topic covers installation. To learn more about llama.cpp on ROCm,
+This topic covers setup instructions and the necessary files to build, test, and run
+llama.cpp with ROCm support in a Docker environment. To learn more about llama.cpp on ROCm,
including its use cases and recommendations, as well as hardware and software compatibility,
see :doc:`rocm:compatibility/ml-compatibility/llama-cpp-compatibility`.
diff --git a/docs/install/3rd-party/megablocks-install.rst b/docs/install/3rd-party/megablocks-install.rst
index b55dd8aa..de4fa9bc 100644
--- a/docs/install/3rd-party/megablocks-install.rst
+++ b/docs/install/3rd-party/megablocks-install.rst
@@ -1,22 +1,18 @@
.. meta::
:description: Install Megablocks on ROCm
- :keywords: installation, docker, Megablocks, AMD, ROCm
+ :keywords: installation, docker, Megablocks, deep learning, AMD, ROCm
********************************************************************************
-Megablocks on ROCm
+Megablocks on ROCm installation
********************************************************************************
-Megablocks is a light-weight library for mixture-of-experts (MoE) training.
-The core of the system is efficient "dropless-MoE" and standard MoE layers.
-Megablocks is integrated with `https://github.com/stanford-futuredata/Megatron-LM `_,
-where data and pipeline parallel training of MoEs is supported.
+`Megablocks `__ is a lightweight library
+for mixture-of-experts `(MoE) `__ training.
-
-For hardware, software, and third-party framework compatibility between ROCm and Megablocks,
-see the following resources:
-
-* :ref:`system-requirements`
-* :doc:`rocm:compatibility/ml-compatibility/megablocks-compatibility`
+This topic covers setup instructions and the necessary files to build, test, and run
+Megablocks with ROCm support in a Docker environment. To learn more about Megablocks on ROCm,
+including its use cases and recommendations, as well as hardware and software compatibility,
+see :doc:`rocm:compatibility/ml-compatibility/megablocks-compatibility`.
.. note::
@@ -30,16 +26,14 @@ To install Megablocks on ROCm, you have the following options:
* :ref:`using-docker-with-megablocks-pre-installed` **(recommended)**
* :ref:`build-megablocks-rocm-docker-image`
-
.. _using-docker-with-megablocks-pre-installed:
-Using a prebuilt Docker image with Megablocks pre-installed
+Use a prebuilt Docker image with Megablocks pre-installed
--------------------------------------------------------------------------------------
Docker is the recommended method to set up a Megablocks environment, and it avoids potential installation issues.
The tested, prebuilt image includes Megablocks, PyTorch, ROCm, and other dependencies.
-
1. Pull the Docker image
.. code-block:: bash
@@ -60,7 +54,6 @@ The tested, prebuilt image includes Megablocks, PyTorch, ROCm, and other depende
--ipc=host --shm-size 16G \
rocm/megablocks:megablocks-0.7.0_rocm6.3.0_ubuntu24.04_py3.12_pytorch2.4.0
-
.. _build-megablocks-rocm-docker-image:
Build your own Docker image
@@ -80,14 +73,13 @@ A Dockerfile is provided in the `https://github.com/ROCm/megablocks `_
diff --git a/docs/install/3rd-party/pytorch-install.rst b/docs/install/3rd-party/pytorch-install.rst
--- a/docs/install/3rd-party/pytorch-install.rst
+++ b/docs/install/3rd-party/pytorch-install.rst
-`PyTorch `__ is an open-source tensor library designed for deep learning. PyTorch on
-ROCm provides mixed-precision and large-scale training using our
-`MIOpen `_ and
-`RCCL `_ libraries.
+`PyTorch `__ is an open-source tensor library designed for deep learning.
+PyTorch on ROCm provides mixed-precision and large-scale training using the AMD `MIOpen `_
+and `RCCL `_ libraries.
+
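+As a quick sanity check (a minimal sketch, assuming a working ROCm build of PyTorch),
+note that ROCm builds expose the familiar ``torch.cuda`` API, backed by HIP:
+
+.. code-block:: python
+
+   import torch
+
+   print(torch.cuda.is_available())       # True on a working ROCm setup
+   print(torch.cuda.get_device_name(0))   # reports the detected AMD GPU
+
+   # Mixed precision uses the same autocast API as other accelerator backends.
+   x = torch.randn(8, 8, device="cuda")
+   with torch.autocast(device_type="cuda", dtype=torch.float16):
+       y = x @ x
+   print(y.dtype)                         # torch.float16
+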
+This topic covers setup instructions and the necessary files to build, test, and run
+PyTorch with ROCm support in a Docker environment. To learn more about PyTorch on ROCm,
+including its use cases and recommendations, as well as hardware and software compatibility,
+see :doc:`rocm:compatibility/ml-compatibility/pytorch-compatibility`.
+
+Install PyTorch
+======================================================================================
To install PyTorch for ROCm, you have the following options:
-* :ref:`using-docker-with-pytorch-pre-installed` (recommended)
+* :ref:`using-docker-with-pytorch-pre-installed` **(recommended)**
* :ref:`pytorch-docker-support`
@@ -21,28 +28,14 @@ To install PyTorch for ROCm, you have the following options:
* :ref:`using-pytorch-upstream-docker-image`
-.. |br| raw:: html
-
-
-
-For hardware, software, and third-party framework compatibility between ROCm and PyTorch, see the following resources:
-
-* :ref:`system-requirements`
-
-* :doc:`rocm:compatibility/ml-compatibility/pytorch-compatibility`
-
.. _using-docker-with-pytorch-pre-installed:
-Using a Docker image with PyTorch pre-installed
-===============================================
-
-To install ROCm on bare metal, follow :doc:`/install/install-overview`. The recommended option to
-get a PyTorch environment is through Docker.
+Use a prebuilt Docker image with PyTorch pre-installed
+--------------------------------------------------------------------------------------
-Using Docker provides portability and access to a prebuilt Docker image that
-has been rigorously tested within AMD. This can also save compilation time and
-should perform as tested and mitigate potential installation issues. See
-:ref:`pytorch-docker-support`
+The recommended way to set up a PyTorch environment is with Docker, as it avoids potential installation issues.
+The tested, prebuilt image includes PyTorch, ROCm, and other dependencies. See :ref:`pytorch-docker-support`.
+To install ROCm on bare metal, follow :doc:`/install/install-overview`.
1. Download the latest public `PyTorch Docker image `_.
@@ -85,7 +78,7 @@ should perform as tested and mitigate potential installation issues. See
.. _pytorch-docker-support:
Docker image support
---------------------
+--------------------------------------------------------------------------------------
AMD validates and publishes ready-made `PyTorch `_ images
with ROCm backends on Docker Hub. The following Docker image tags and associated inventories are
@@ -288,8 +281,8 @@ validated for ROCm 7.0.0.
.. _install_pytorch_wheels:
.. _using-wheels-package:
-Using a wheels package
-======================
+Use a wheels package
+--------------------------------------------------------------------------------------
PyTorch supports the ROCm platform by providing tested wheels packages. To access this feature, go
to `pytorch.org/get-started/locally/ `_. For the correct
@@ -405,8 +398,8 @@ wheels command, you must select **Linux**, **Python**, **pip**, and **ROCm** in
.. _using-pytorch-rocm-docker-image:
.. _building-pytorch-from-source:
-Building your own PyTorch from source
-=====================================
+Build PyTorch from source
+--------------------------------------------------------------------------------------
Use the ``rocm/pytorch:latest`` image, uninstall the preinstalled PyTorch
package, and rebuild PyTorch from source. This ensures compatibility with your
@@ -473,8 +466,8 @@ specific ROCm version, GPU architecture, and project requirements.
.. _using-pytorch-upstream-docker-image:
-Using the PyTorch upstream Dockerfile
-=====================================
+Use the PyTorch upstream Dockerfile
+--------------------------------------------------------------------------------------
If you don't want to use a prebuilt base Docker image, you can build a custom base Docker image
using scripts from the PyTorch repository. This uses a standard Docker image from operating system
@@ -578,8 +571,8 @@ maintainers and installs all the required dependencies, including:
.. _test-pytorch-installation:
-Testing the PyTorch installation
-================================
+Test the PyTorch installation
+======================================================================================
You can use PyTorch unit tests to validate your PyTorch installation. If you used a
**prebuilt PyTorch Docker image from AMD ROCm Docker Hub** or installed an
@@ -639,8 +632,8 @@ If you want to manually run unit tests to validate your PyTorch installation ful
You can replace ``test_nn.py`` with any other test set.
-Running a basic PyTorch example
-===============================
+Run a PyTorch example
+======================================================================================
The PyTorch examples repository provides basic examples that exercise the functionality of your
framework.
@@ -653,7 +646,7 @@ Two of our favorite testing databases are:
**visual object recognition**.
MNIST PyTorch example
----------------------
+--------------------------------------------------------------------------------------
1. Clone the PyTorch examples repository.
@@ -685,7 +678,7 @@ MNIST PyTorch example
Test set: Average loss: 0.0252, Accuracy: 9921/10000 (99%)
ImageNet PyTorch example
----------------------------------------------------------------------------------------------------------
+-----------------------------------------------------------------------------------------
1. Clone the PyTorch examples repository (if you didn't already do this in the preceding MNIST
example).
@@ -709,7 +702,7 @@ ImageNet PyTorch example
.. _troubleshooting-pytorch:
Troubleshooting
-===============
+======================================================================================
* What to do if you get the following error when trying to run PyTorch:
diff --git a/docs/install/3rd-party/ray-install.rst b/docs/install/3rd-party/ray-install.rst
index 69bc64a6..53867e13 100644
--- a/docs/install/3rd-party/ray-install.rst
+++ b/docs/install/3rd-party/ray-install.rst
@@ -1,25 +1,20 @@
.. meta::
- :description: Ray on ROCm
- :keywords: installation instructions, building, Ray, AMD, ROCm
+ :description: Install Ray on ROCm
+ :keywords: installation, docker, deep learning, Ray, AMD, ROCm
********************************************************************************
Ray on ROCm installation
********************************************************************************
-Ray is a unified framework for scaling AI and Python applications from your laptop
-to a full cluster, without changing your code. Ray consists of `a core distributed
+Ray is a unified framework for scaling AI and Python applications, consisting of `a core distributed
runtime `_ and a set of
`AI libraries `_ for
simplifying machine learning computations.
-Ray is a general-purpose framework that runs many types of workloads efficiently.
-Any Python application can be scaled with Ray, without extra infrastructure.
-
-For hardware, software, and third-party framework compatibility between ROCm and Ray,
-see the following resources:
-
-* :ref:`system-requirements`
-* :doc:`rocm:compatibility/ml-compatibility/ray-compatibility`
+This topic covers setup instructions and the necessary files to build, test, and run
+Ray with ROCm support in a Docker environment. To learn more about Ray on ROCm,
+including its use cases and recommendations, as well as hardware and software compatibility,
+see :doc:`rocm:compatibility/ml-compatibility/ray-compatibility`.
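+
+As a minimal sketch (assuming Ray is installed), the core API turns plain Python
+functions into distributed tasks:
+
+.. code-block:: python
+
+   import ray
+
+   ray.init()  # start a local Ray runtime
+
+   @ray.remote
+   def square(x):
+       return x * x
+
+   # Tasks execute in parallel; the same code scales out to a cluster.
+   futures = [square.remote(i) for i in range(4)]
+   print(ray.get(futures))  # [0, 1, 4, 9]
+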
.. note::
@@ -37,7 +32,7 @@ To install Ray on ROCm, you have the following options:
.. _using-docker-with-ray-pre-installed:
-Using a prebuilt Docker image with Ray pre-installed
+Use a prebuilt Docker image with Ray pre-installed
--------------------------------------------------------------------------------------
Docker is the recommended method to set up a Ray environment, and it avoids potential installation issues.
@@ -126,7 +121,7 @@ If you prefer to use the ROCm Ubuntu image or already have a ROCm Ubuntu contain
.. _install-rocm-ray-bare-metal:
-Install Ray on bare metal or a custom container
+Install Ray on bare-metal or a custom container
--------------------------------------------------------------------------------------
Follow these steps if you prefer to install ROCm manually on your host system or in a custom container.
diff --git a/docs/install/3rd-party/stanford-megatron-lm-install.rst b/docs/install/3rd-party/stanford-megatron-lm-install.rst
index 508e7da5..fc85b45c 100644
--- a/docs/install/3rd-party/stanford-megatron-lm-install.rst
+++ b/docs/install/3rd-party/stanford-megatron-lm-install.rst
@@ -1,18 +1,19 @@
.. meta::
:description: Install Stanford Megatron-LM on ROCm
- :keywords: installation, docker, Megatron-LM, AMD, ROCm
+ :keywords: installation, docker, Megatron-LM, deep learning, AMD, ROCm
********************************************************************************
-Stanford Megatron-LM on ROCm
+Stanford Megatron-LM on ROCm installation
********************************************************************************
-Stanford Megatron-LM is a large-scale language model training framework developed by `NVIDIA `_. It is
-designed to train massive transformer-based language models efficiently by model and data parallelism.
+`Stanford Megatron-LM `__ is a large
+language model (LLM) training framework designed to train massive transformer-based models
+efficiently through model and data parallelism.
-For hardware, software, and third-party framework compatibility between ROCm and Stanford-Megatron-LM, see:
-
-* :ref:`system-requirements`
-* :doc:`rocm:compatibility/ml-compatibility/stanford-megatron-lm-compatibility`
+This topic covers setup instructions and the necessary files to build, test, and run
+Stanford Megatron-LM with ROCm support in a Docker environment. To learn more about Stanford Megatron-LM
+on ROCm, including its use cases and recommendations, as well as hardware and software compatibility,
+see :doc:`rocm:compatibility/ml-compatibility/stanford-megatron-lm-compatibility`.
.. note::
@@ -34,7 +35,7 @@ Use a prebuilt Docker image with Stanford Megatron-LM pre-installed
The recommended way to set up a Stanford Megatron-LM environment and avoid potential installation issues is with Docker.
The tested, prebuilt image includes Stanford Megatron-LM, PyTorch, ROCm, and other dependencies.
-Prebuilt Docker images with Stanford Megatron-LM configured for ROCm 6.3.0 are available on `Docker Hub `_.
+Prebuilt Docker images with Stanford Megatron-LM configured for ROCm 6.3.0 are available on `Docker Hub `_.
1. Pull the Docker image:
diff --git a/docs/install/3rd-party/taichi-install.rst b/docs/install/3rd-party/taichi-install.rst
index 9616883f..f3bf35be 100644
--- a/docs/install/3rd-party/taichi-install.rst
+++ b/docs/install/3rd-party/taichi-install.rst
@@ -1,21 +1,18 @@
.. meta::
:description: Install Taichi on ROCm
- :keywords: installation, docker, Taichi, AMD, ROCm
+ :keywords: installation, docker, Taichi, deep learning, AMD, ROCm
********************************************************************************
-Taichi on ROCm
+Taichi on ROCm installation
********************************************************************************
`Taichi `_ is an open-source, imperative, and parallel
programming language designed for high-performance numerical computation.
-Embedded in Python, it leverages just-in-time (JIT) compilation frameworks such as LLVM to accelerate
-compute-intensive Python code by compiling it to native GPU or CPU instructions.
-For hardware, software, and third-party framework compatibility between ROCm and Taichi,
-see the following resources:
-
-* :ref:`system-requirements`
-* :doc:`rocm:compatibility/ml-compatibility/taichi-compatibility`
+This topic covers setup instructions and the necessary files to build, test, and run
+Taichi with ROCm support in a Docker environment. To learn more about Taichi
+on ROCm, including its use cases and recommendations, as well as hardware and software compatibility,
+see :doc:`rocm:compatibility/ml-compatibility/taichi-compatibility`.
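+
+As a minimal sketch (assuming a working Taichi install), kernels are written in
+Python syntax and their outermost loops are parallelized on the selected backend:
+
+.. code-block:: python
+
+   import taichi as ti
+
+   ti.init(arch=ti.gpu)  # falls back to CPU if no supported GPU is found
+
+   n = 8
+   f = ti.field(dtype=ti.f32, shape=n)
+
+   @ti.kernel
+   def fill():
+       for i in range(n):  # outermost loop runs in parallel
+           f[i] = i * i
+
+   fill()
+   print(f.to_numpy())
+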
.. note::
diff --git a/docs/install/3rd-party/tensorflow-install.rst b/docs/install/3rd-party/tensorflow-install.rst
index 2e955b27..6f2c80bd 100644
--- a/docs/install/3rd-party/tensorflow-install.rst
+++ b/docs/install/3rd-party/tensorflow-install.rst
@@ -1,36 +1,23 @@
.. meta::
- :description: Installing TensorFlow for ROCm
- :keywords: installation instructions, TensorFlow, AMD, ROCm
+ :description: Install TensorFlow on ROCm
+ :keywords: installation, docker, TensorFlow, deep learning, AMD, ROCm
-******************
-TensorFlow on ROCm
-******************
+********************************************************************************
+TensorFlow on ROCm installation
+********************************************************************************
`TensorFlow `__ is an open-source library for solving machine learning,
-deep learning, and AI problems. It can solve many
-problems across different sectors and industries, but primarily focuses on
-neural network training and inference. It is one of the most popular and
-in-demand frameworks and is very active in open-source contribution and
-development.
+deep learning, and AI problems.
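+
+As a quick sanity check (a minimal sketch, assuming a working ROCm build of
+TensorFlow), AMD GPUs show up as ordinary TensorFlow GPU devices:
+
+.. code-block:: python
+
+   import tensorflow as tf
+
+   # Lists detected GPUs; empty output means no usable device was found.
+   print(tf.config.list_physical_devices("GPU"))
+
+   # A small computation to confirm kernels dispatch correctly.
+   x = tf.random.normal((4, 4))
+   print(tf.reduce_sum(x @ tf.transpose(x)))
+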
-To install TensorFlow for ROCm, you have the following options:
-
-* :ref:`install-tensorflow-prebuilt-docker` (recommended)
-
- * :ref:`tensorflow-docker-support`
-
-* :ref:`install-tensorflow-wheels`
-
-For hardware, software, and third-party framework compatibility between ROCm and TensorFlow, see the following resources:
-
-* :ref:`system-requirements`
-
-* :doc:`rocm:compatibility/ml-compatibility/tensorflow-compatibility`
+This topic covers setup instructions and the necessary files to build, test, and run
+TensorFlow with ROCm support in a Docker environment. To learn more about TensorFlow
+on ROCm, including its use cases and recommendations, as well as hardware and software compatibility,
+see :doc:`rocm:compatibility/ml-compatibility/tensorflow-compatibility`.
.. note::
- As of ROCm 6.1, ``tensorflow-rocm`` packages are found at ``__.
- Prior to ROCm 6.1, packages were found at ``_.
+ As of ROCm 6.1.0, ``tensorflow-rocm`` packages are found at ``__.
+ Prior to ROCm 6.1.0, packages were found at ``_.
.. _install-tensorflow-versions:
@@ -51,12 +38,23 @@ For hardware, software, and third-party framework compatibility between ROCm and
* - 6.0.x
- 2.14.0, 2.13.1, 2.12.1
-.. _install-tensorflow-prebuilt-docker:
-
.. _install-tensorflow-options:
-Using a Docker image with TensorFlow pre-installed
-==================================================
+Install TensorFlow
+================================================================================
+
+To install TensorFlow for ROCm, you have the following options:
+
+* :ref:`install-tensorflow-prebuilt-docker` **(recommended)**
+
+ * :ref:`tensorflow-docker-support`
+
+* :ref:`install-tensorflow-wheels`
+
+.. _install-tensorflow-prebuilt-docker:
+
+Use a prebuilt Docker image with TensorFlow pre-installed
+--------------------------------------------------------------------------------
To install ROCm on bare metal, follow
:doc:`/install/install-overview`. The recommended option to
@@ -90,14 +88,10 @@ Follow these steps:
--security-opt seccomp=unconfined \
rocm/tensorflow:latest
-.. |hr| raw:: html
-
-
-
.. _tensorflow-docker-support:
Docker image support
---------------------
+--------------------------------------------------------------------------------
AMD validates and publishes ready-made TensorFlow images with ROCm backends on
Docker Hub. The following Docker image tags and associated inventories are
@@ -264,8 +258,8 @@ validated for ROCm 7.0.0.
.. _install-tensorflow-wheels:
-Using a wheels package
-======================
+Use a wheels package
+--------------------------------------------------------------------------------
To install TensorFlow using the wheels package, use the following command.
@@ -284,8 +278,8 @@ To install TensorFlow using the wheels package, use the following command.
.. _test-tensorflow-installation:
-Testing the TensorFlow installation
-===================================
+Test the TensorFlow installation
+================================================================================
To test the installation of TensorFlow, run the container as specified in
:ref:`Installing TensorFlow `. Ensure you have access to the Python
@@ -295,8 +289,8 @@ shell in the Docker container.
python -c 'import tensorflow' 2> /dev/null && echo 'Success' || echo 'Failure'
-Running a basic TensorFlow example
-==================================
+Run a TensorFlow example
+================================================================================
To quickly validate your TensorFlow environment, run a basic TensorFlow example.
diff --git a/docs/install/3rd-party/verl-install.rst b/docs/install/3rd-party/verl-install.rst
index bc3c67fc..31becc67 100644
--- a/docs/install/3rd-party/verl-install.rst
+++ b/docs/install/3rd-party/verl-install.rst
@@ -3,17 +3,16 @@
:keywords: installation, docker, verl, AMD, ROCm
********************************************************************************
-verl on ROCm
+verl on ROCm installation
********************************************************************************
-Volcano Engine Reinforcement Learning for LLMs (verl) is a reinforcement learning framework designed for large language models (LLMs).
-See the `verl documentation `_ for more information about verl.
+Volcano Engine Reinforcement Learning for LLMs `(verl) `__
+is a reinforcement learning framework designed for large language models (LLMs).
-For hardware, software, and third-party framework compatibility between ROCm and verl,
-see the following resources:
-
-* :ref:`system-requirements`
-* :doc:`rocm:compatibility/ml-compatibility/verl-compatibility`
+This topic covers setup instructions and the necessary files to build, test, and run
+verl with ROCm support in a Docker environment. To learn more about verl
+on ROCm, including its use cases and recommendations, as well as hardware and software compatibility,
+see :doc:`rocm:compatibility/ml-compatibility/verl-compatibility`.
.. note::
@@ -71,7 +70,7 @@ Build your own Docker image
.. code-block:: bash
cd verl
- docker build -f docker/Dockerfile.rocm -t my-rocm-verl .
+      docker build -f docker/Dockerfile.rocm -t my-rocm-verl .
3. Launch and connect to the container