How to configure Red Hat AI Inference Server to use multiple GPUs on the system?

Issue

  • How to configure Red Hat AI Inference Server to use multiple GPUs on the system (for example, on NVIDIA L4 GPUs)?

Environment

  • Red Hat AI Inference Server (RHAIIS)
    • 3.0
  • Nvidia L4 GPU
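
RHAIIS serves models with the vLLM engine, which spreads a single model across multiple GPUs through tensor parallelism: the number of GPUs to shard across is set by vLLM's --tensor-parallel-size option (or the tensor_parallel_size argument of the Python API). The snippet below is a minimal sketch of that setting only, assuming two L4 cards are visible to the process; the model name is a placeholder and is not part of this solution.

    # Minimal sketch: shard a model across two GPUs with vLLM tensor parallelism.
    # The model name is a placeholder; tensor_parallel_size should equal the
    # number of GPUs (here, two NVIDIA L4 cards) that vLLM should use.
    from vllm import LLM, SamplingParams

    llm = LLM(
        model="meta-llama/Llama-3.1-8B-Instruct",  # placeholder model
        tensor_parallel_size=2,                    # split the model across 2 GPUs
    )

    params = SamplingParams(max_tokens=64)
    outputs = llm.generate(["Hello from a multi-GPU deployment."], params)
    print(outputs[0].outputs[0].text)

When serving through the RHAIIS container rather than the Python API, the same effect is typically achieved by exposing all GPUs to the container (for example, podman run --device nvidia.com/gpu=all with the NVIDIA Container Toolkit's CDI configuration) and passing --tensor-parallel-size to the vLLM server command; the exact invocation depends on the deployment.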
