Is FP8 tensor supported in GEMM? #45

@akaitsuki-ii

Description

The docstring of GemmSm90 says:

Constraints:

  • Supported input data types: fp16, fp8 (e4m3fn, e5m2)

However, in the GemmWrapperBase.validate_tensor method, an fp8 tensor fails the assertion.

If fp8 tensors are not actually supported right now, how can I adapt the code to accept them? And how do I convert an fp8 tensor to a CuTe tensor?

Thanks a lot!
