24 changes: 10 additions & 14 deletions docs/install/3rd-party/dgl-install.rst
Original file line number Diff line number Diff line change
@@ -1,26 +1,22 @@
.. meta::
:description: Install Deep Graph Library (DGL) on ROCm
:keywords: installation, docker, DGL, AMD, ROCm
:keywords: installation, docker, DGL, deep learning, AMD, ROCm

********************************************************************************
DGL on ROCm
DGL on ROCm installation
********************************************************************************

Deep Graph Library `(DGL) <https://www.dgl.ai/>`_ is an easy-to-use, high-performance and scalable
Python package for deep learning on graphs. DGL is framework agnostic, meaning
if a deep graph model is a component in an end-to-end application, the rest of
the logic is implemented using PyTorch.
Python package for deep learning on graphs.

.. |br| raw:: html
This topic covers setup instructions and the necessary files to build, test, and run
DGL with ROCm support in a Docker environment. To learn more about DGL on ROCm,
including its use cases and recommendations, as well as hardware and software compatibility,
see :doc:`rocm:compatibility/ml-compatibility/dgl-compatibility`.

<br/>
.. note::

For hardware, software, and third-party framework compatibility between ROCm and DGL,
see the following resources:

* :ref:`system-requirements`

* :doc:`rocm:compatibility/ml-compatibility/dgl-compatibility`
DGL is supported on ROCm 6.4.0.

Install DGL
================================================================================
@@ -73,7 +69,7 @@ Docker image support

AMD validates and publishes ready-made `DGL Docker images <https://hub.docker.com/r/rocm/dgl>`_
with ROCm backends on Docker Hub. The following Docker image tags and associated inventories are
validated for ROCm 6.4.
validated for ROCm 6.4.0.

.. tab-set::

7 changes: 4 additions & 3 deletions docs/install/3rd-party/flashinfer-install.rst
@@ -1,6 +1,6 @@
.. meta::
:description: Installing FlashInfer for ROCm
:keywords: installation, docker, FlashInfer, AMD, ROCm
:description: Install FlashInfer on ROCm
:keywords: installation, docker, FlashInfer, deep learning, AMD, ROCm

********************************************************************************
FlashInfer on ROCm installation
@@ -10,7 +10,8 @@ FlashInfer on ROCm installation
for Large Language Models (LLMs) that provides high-performance implementations of graphics
processing unit (GPU) kernels.

This topic covers installation. To learn more about FlashInfer on ROCm,
This topic covers setup instructions and the necessary files to build, test, and run
FlashInfer with ROCm support in a Docker environment. To learn more about FlashInfer on ROCm,
including its use cases and recommendations, as well as hardware and software compatibility,
see :doc:`rocm:compatibility/ml-compatibility/flashinfer-compatibility`.

70 changes: 44 additions & 26 deletions docs/install/3rd-party/jax-install.rst
@@ -1,23 +1,36 @@
.. meta::
:description: JAX on ROCm
:keywords: installation instructions, building, JAX, AMD, ROCm
:description: Install JAX on ROCm
:keywords: installation, docker, JAX, deep learning, AMD, ROCm

***********
JAX on ROCm
***********
*************************************************************************************
JAX on ROCm installation
*************************************************************************************

This directory provides setup instructions and necessary files to build, test, and run JAX with ROCm support in a Docker environment, suitable for both runtime and CI workflows. Explore the following methods to use or build JAX on ROCm.
`JAX <https://docs.jax.dev/en/latest/notebooks/thinking_in_jax.html>`__ is a library
for array-oriented numerical computation (similar to NumPy), with automatic differentiation
and just-in-time (JIT) compilation to enable high-performance machine learning research.
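
As a quick illustration of these two features (this sketch is not part of the install steps and runs on any JAX backend, including CPU), the following differentiates and JIT-compiles a simple function:

```python
import jax
import jax.numpy as jnp

# f(x) = sum(x**2); its gradient with respect to x is 2*x.
def f(x):
    return jnp.sum(x ** 2)

# Compose automatic differentiation with JIT compilation.
grad_f = jax.jit(jax.grad(f))

print(grad_f(jnp.array([1.0, 2.0, 3.0])))  # [2. 4. 6.]
```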

For hardware, software, and third-party framework compatibility between ROCm and JAX, see the following resources:
This topic covers setup instructions and the necessary files to build, test, and run
JAX with ROCm support in a Docker environment. To learn more about JAX on ROCm,
including its use cases and recommendations, as well as hardware and software compatibility,
see :doc:`rocm:compatibility/ml-compatibility/jax-compatibility`.

* :ref:`system-requirements`
Install JAX on ROCm
======================================================================================

* :doc:`rocm:compatibility/ml-compatibility/jax-compatibility`
To install JAX on ROCm, you have the following options:

Using a prebuilt Docker image
===========================================
* :ref:`using-docker-with-jax-pre-installed` **(recommended)**
* :ref:`build-jax-rocm-docker-image`
* :ref:`install-jax-rocm-custom-container`
* :ref:`build-jax-from-source`

The ROCm JAX team provides prebuilt Docker images, which is the simplest way to use JAX on ROCm. These images are available on Docker Hub and come with JAX configured for ROCm.
.. _using-docker-with-jax-pre-installed:

Use a prebuilt Docker image with JAX preinstalled
--------------------------------------------------------------------------------------
The ROCm JAX team provides prebuilt Docker images, which are the simplest way to use JAX on ROCm.
These images are available on Docker Hub and come with JAX configured for ROCm.

1. To pull the latest ROCm JAX Docker image, run:

@@ -29,7 +42,7 @@ The ROCm JAX team provides prebuilt Docker images, which is the simplest way to

For specific versions of JAX, review the periodically pushed Docker images at `ROCm JAX on
Docker Hub <https://hub.docker.com/r/rocm/jax/tags>`_.

2. Once the image is downloaded, launch a container using the following command:

.. code-block:: bash
@@ -51,13 +64,13 @@ The ROCm JAX team provides prebuilt Docker images, which is the simplest way to

* The ``--shm-size`` parameter allocates shared memory for the container. Adjust it based on your system's resources if needed.
* Replace ``$(pwd)`` with the absolute path to the directory you want to mount inside the container.

3. Verify the installation of ROCm JAX. See :ref:`jax-verify-installation`.

.. _jax-docker-support:

Docker image support
--------------------
--------------------------------------------------------------------------------------

AMD validates and publishes ready-made JAX images with ROCm backends on Docker
Hub. The following Docker image tags and associated inventories are validated
@@ -97,8 +110,10 @@ For ``jax-community`` images, see `rocm/jax-community <https://hub.docker.com/r/
on `Docker Hub
<https://hub.docker.com/layers/rocm/jax/rocm7.0-jax0.6.0-py3.10/images/sha256-a75d73f696926c42434fa8910037e104b2d46211cf72a7f66f052a75e61f18f5>`__.

Using a ROCm base Docker image and installing JAX
=================================================
.. _build-jax-rocm-docker-image:

Use a ROCm base Docker image to install JAX
--------------------------------------------------------------------------------------

If you prefer to use the ROCm Ubuntu image or already have a ROCm Ubuntu container, follow these steps to install JAX in the container.

@@ -154,8 +169,10 @@ If you prefer to use the ROCm Ubuntu image or already have a ROCm Ubuntu contain

6. Verify the installation of ROCm JAX. See :ref:`jax-verify-installation`.

.. _install-jax-rocm-custom-container:

Install JAX on bare-metal or a custom container
===============================================
--------------------------------------------------------------------------------------

Follow these steps if you prefer to install ROCm manually on your host system or in a custom container.

@@ -230,8 +247,10 @@ Follow these steps if you prefer to install ROCm manually on your host system or

[0 1 2 3 4]

Build ROCm JAX from source
==========================
.. _build-jax-from-source:

Build JAX from source
--------------------------------------------------------------------------------------

Follow these steps to build JAX with ROCm support from source.

@@ -280,16 +299,15 @@ Follow these steps to build JAX with ROCm support from source.
source .venv/bin/activate
python3 setup.py develop --user && python3 -m pip install dist/*.whl

Simplified build script
-----------------------
.. tip::

For a streamlined build process, consider using the ``jax/build/rocm/dev_build_rocm.py`` script. See
`<https://github.com/rocm/jax/tree/main/build/rocm>`__ for more information.
For a streamlined build process, consider using the ``jax/build/rocm/dev_build_rocm.py`` script. See
`<https://github.com/rocm/jax/tree/main/build/rocm>`__ for more information.

.. _jax-verify-installation:

Testing your JAX installation with ROCm
=======================================
Test the JAX installation
======================================================================================

After launching the container, test whether JAX detects ROCm devices as expected:
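A minimal sketch of such a check follows (the exact snippet in the guide may differ). On a working ROCm install the device list reports the AMD GPUs; on a machine without GPUs, JAX falls back to CPU:

```python
import jax

# List the devices JAX can see; on a ROCm system these are the AMD GPUs.
devices = jax.devices()
print(devices)

# At least one backend device should always be present.
assert len(devices) >= 1
```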

10 changes: 5 additions & 5 deletions docs/install/3rd-party/llama-cpp-install.rst
@@ -1,17 +1,17 @@
.. meta::
:description: Installing llama.cpp for ROCm
:keywords: installation instructions, llama.cpp, AMD, ROCm, GGML
:description: Install llama.cpp on ROCm
:keywords: installation, llama.cpp, docker, deep learning, AMD, ROCm, GGML

********************************************************************************
llama.cpp on ROCm installation
********************************************************************************

`llama.cpp <https://github.com/ggml-org/llama.cpp>`__ is an open-source framework
for Large Language Model (LLM) inference that runs on both central processing units
(CPUs) and graphics processing units (GPUs). It is written in plain C/C++, providing
a simple, dependency-free setup.
(CPUs) and graphics processing units (GPUs).

This topic covers installation. To learn more about llama.cpp on ROCm,
This topic covers setup instructions and the necessary files to build, test, and run
llama.cpp with ROCm support in a Docker environment. To learn more about llama.cpp on ROCm,
including its use cases and recommendations, as well as hardware and software compatibility,
see :doc:`rocm:compatibility/ml-compatibility/llama-cpp-compatibility`.

30 changes: 11 additions & 19 deletions docs/install/3rd-party/megablocks-install.rst
@@ -1,22 +1,18 @@
.. meta::
:description: Install Megablocks on ROCm
:keywords: installation, docker, Megablocks, AMD, ROCm
:keywords: installation, docker, Megablocks, deep learning, AMD, ROCm

********************************************************************************
Megablocks on ROCm
Megablocks on ROCm installation
********************************************************************************

Megablocks is a light-weight library for mixture-of-experts (MoE) training.
The core of the system is efficient "dropless-MoE" and standard MoE layers.
Megablocks is integrated with `https://github.com/stanford-futuredata/Megatron-LM <https://github.com/stanford-futuredata/Megatron-LM>`_,
where data and pipeline parallel training of MoEs is supported.
`Megablocks <https://github.com/databricks/megablocks>`__ is a light-weight library
for mixture-of-experts `(MoE) <https://huggingface.co/blog/moe>`__ training.
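
To give a flavor of the MoE idea (a plain-Python illustration of top-1 gating, not the Megablocks API), each token is routed to the expert whose gate score is highest:

```python
# Illustrative top-1 MoE routing: one token goes to one expert.
def route_top1(gate_scores):
    """gate_scores: list of per-expert scores for one token."""
    return max(range(len(gate_scores)), key=lambda e: gate_scores[e])

# Hypothetical gate scores over three experts for two tokens.
tokens = {"tok_a": [0.1, 0.7, 0.2], "tok_b": [0.5, 0.3, 0.2]}
assignment = {tok: route_top1(scores) for tok, scores in tokens.items()}
print(assignment)  # {'tok_a': 1, 'tok_b': 0}
```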


For hardware, software, and third-party framework compatibility between ROCm and Megablocks,
see the following resources:

* :ref:`system-requirements`
* :doc:`rocm:compatibility/ml-compatibility/megablocks-compatibility`
This topic covers setup instructions and the necessary files to build, test, and run
Megablocks with ROCm support in a Docker environment. To learn more about Megablocks on ROCm,
including its use cases and recommendations, as well as hardware and software compatibility,
see :doc:`rocm:compatibility/ml-compatibility/megablocks-compatibility`.

.. note::

@@ -30,16 +26,14 @@ To install Megablocks on ROCm, you have the following options:
* :ref:`using-docker-with-megablocks-pre-installed` **(recommended)**
* :ref:`build-megablocks-rocm-docker-image`


.. _using-docker-with-megablocks-pre-installed:

Using a prebuilt Docker image with Megablocks pre-installed
Use a prebuilt Docker image with Megablocks pre-installed
--------------------------------------------------------------------------------------

Docker is the recommended method to set up a Megablocks environment, and it avoids potential installation issues.
The tested, prebuilt image includes Megablocks, PyTorch, ROCm, and other dependencies.


1. Pull the Docker image

.. code-block:: bash
@@ -60,7 +54,6 @@ The tested, prebuilt image includes Megablocks, PyTorch, ROCm, and other depende
--ipc=host --shm-size 16G \
rocm/megablocks:megablocks-0.7.0_rocm6.3.0_ubuntu24.04_py3.12_pytorch2.4.0


.. _build-megablocks-rocm-docker-image:

Build your own Docker image
@@ -80,14 +73,13 @@ A Dockerfile is provided in the `https://github.com/ROCm/megablocks <https://git
.. code-block:: bash

cd megablocks
docker build -t rocm/megablocks:megablocks-0.7.0_rocm6.3.0_ubuntu24.04_py3.12_pytorch2.4.0 .
docker build -t rocm/megablocks:megablocks-0.7.0_rocm6.3.0_ubuntu24.04_py3.12_pytorch2.4.0

3. Run the Docker container

.. code-block:: bash

docker run -it --device=/dev/kfd --device=/dev/dri --group-add video rocm/megablocks:megablocks-0.7.0_rocm6.3.0_ubuntu24.04_py3.12_pytorch2.4.0 .

docker run -it --device=/dev/kfd --device=/dev/dri --group-add video rocm/megablocks:megablocks-0.7.0_rocm6.3.0_ubuntu24.04_py3.12_pytorch2.4.0

Set up your datasets
======================================================================================