The Mellanox InfiniBand EDR/Ethernet 100Gb 1-port Adapter requires the Mellanox OpenSM package to achieve 100Gb speeds

Solution In Progress - Updated -

Environment

  • Red Hat Enterprise Linux (RHEL) ALT 7.5/8.0
  • Mellanox InfiniBand EDR/Ethernet 100Gb 1-port Adapter
  • HPE Apollo 70 System

Issue

  • The Mellanox InfiniBand EDR/Ethernet 100Gb 1-port Adapter shipped with the HPE Apollo 70 server can achieve its top advertised speed of 100Gb/s with the native driver included in RHEL 7.5 and 8.0. However, the Mellanox-provided OpenSM package must also be running somewhere on the same InfiniBand network topology for this throughput to be achieved. Without the Mellanox-provided OpenSM package running on the InfiniBand fabric, systems with this configuration may only achieve up to 50Gb/s line speed.
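To check which line rate a port has actually negotiated, the `ibstat` utility from the infiniband-diags package can be used. This is a minimal sketch; the device name `mlx5_0` is an assumption and may differ on your system (list available devices with `ibstat -l`).

```shell
# List InfiniBand devices present on the system (names vary per host)
ibstat -l

# Show port details for the adapter; "Rate: 100" indicates 100Gb/s EDR,
# while "Rate: 50" would indicate the reduced line speed described above.
# NOTE: "mlx5_0" is an assumed device name for illustration.
ibstat mlx5_0
```

If the reported rate is below 100, verify that an OpenSM instance is active on the fabric before investigating cabling or adapter firmware.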

Resolution

  • For the Mellanox InfiniBand card to reach 100Gb/s line speed, the Mellanox OpenSM package must be installed and configured on a system that is part of the same network as the HPE Apollo 70. For additional details, please refer to the networking section of the HPE Apollo 70 datasheet: https://psnow.ext.hpe.com/doc/PSN1010742472USEN.pdf
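As a rough sketch of what running a subnet manager involves, the commands below install and start the `opensm` package shipped in RHEL. Note this is an assumption for illustration only: the article specifically calls for the Mellanox-provided OpenSM package, which is distributed with Mellanox OFED (MLNX_OFED) and may use a different package name and installation procedure; consult the Mellanox OFED documentation for the vendor package.

```shell
# Install the in-distro OpenSM subnet manager (illustrative; the
# Mellanox-provided package from MLNX_OFED may be named differently)
yum install -y opensm

# Enable and start the subnet manager, then confirm it is running
systemctl enable --now opensm
systemctl status opensm
```

Only one master subnet manager needs to be active per fabric; any host (or a managed switch) on the same InfiniBand network can run it.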

  • Additionally, the following link contains technical details for the Mellanox InfiniBand EDR/Ethernet 100Gb 1-port Adapter: https://h20195.www2.hpe.com/v2/GetDocument.aspx?docname=a00039978enw

  • Please note that any use of the Mellanox-provided OpenSM package is supported by Red Hat under our third-party software and drivers policy, as defined at the following URL: https://access.redhat.com/articles/1067

  • Please contact your system vendor for further details.

This solution is part of Red Hat’s fast-track publication program, providing a huge library of solutions that Red Hat engineers have created while supporting our customers. To give you the knowledge you need the instant it becomes available, these articles may be presented in a raw and unedited form.