Xorg CPU consumption shoots up while using the NVIDIA driver


Environment

  • Red Hat Enterprise Linux 6
  • Red Hat Enterprise Linux 5
  • Red Hat Enterprise Linux 4

Issue

  • Xorg CPU consumption increases while using the NVIDIA proprietary driver.
  • Xorg appears to freeze with high CPU usage on many NVIDIA cards while using the NVIDIA proprietary driver.

Resolution

  • Add Option "UseEvents" "on" to the Device section of /etc/X11/xorg.conf and restart the X server for the change to take effect:

    Section "Device"
        Identifier  "Videocard0"
        Driver      "nvidia"
        VendorName  "NVIDIA Corporation"
        BoardName   "GeForce FX 5200"
        Option      "RenderAccel" "on"
        Option      "UseEvents" "on"
    EndSection

Root Cause

According to the vendor's driver documentation, this option enables the use of system events in some cases when the X driver is waiting for the hardware. Without it, the X driver can spin in a tight loop while waiting for the hardware, which drives up Xorg CPU consumption.
With this option enabled, the X driver instead registers an event handler and waits for the hardware through the poll() system call.
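The difference between the two wait strategies can be illustrated with a small, self-contained sketch (this is not NVIDIA driver code; the pipe standing in for "the hardware" and the helper names are purely illustrative assumptions). A busy-wait repeatedly checks for readiness in a tight loop and consumes CPU the whole time, while a poll()-based wait sleeps in the kernel until the descriptor signals an event:

    /* Illustrative sketch only: contrasts a busy-wait loop with a
     * poll()-based wait on a pipe standing in for "the hardware". */
    #include <poll.h>
    #include <stdio.h>
    #include <unistd.h>

    /* Busy-wait: repeatedly check readiness, consuming CPU the whole time. */
    static void busy_wait(int fd)
    {
        struct pollfd pfd = { .fd = fd, .events = POLLIN };
        while (poll(&pfd, 1, 0) == 0)   /* timeout 0 => non-blocking check   */
            ;                           /* tight loop: this burns CPU        */
    }

    /* Event-driven wait: block in the kernel until the fd becomes readable. */
    static void event_wait(int fd)
    {
        struct pollfd pfd = { .fd = fd, .events = POLLIN };
        poll(&pfd, 1, -1);              /* timeout -1 => sleep until an event */
    }

    int main(void)
    {
        int pipefd[2];
        if (pipe(pipefd) != 0)
            return 1;

        if (fork() == 0) {              /* child: "hardware" signals later    */
            sleep(1);
            write(pipefd[1], "x", 1);
            _exit(0);
        }

        /* busy_wait(pipefd[0]); */     /* would spin at high CPU for ~1 s    */
        event_wait(pipefd[0]);          /* sleeps until the "event" arrives   */
        puts("event received");
        return 0;
    }

With "UseEvents" "on", the driver takes the poll()-style path, so Xorg no longer accumulates CPU time while idle-waiting on the hardware.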

