<Vulnerability name="CVE-2026-7141">
    <DocumentDistribution xml:lang="en">Copyright © 2012 Red Hat, Inc. All rights reserved.</DocumentDistribution>
    <ThreatSeverity>Moderate</ThreatSeverity>
    <PublicDate>2026-04-27T16:45:12</PublicDate>
    <Bugzilla id="2463365" url="https://bugzilla.redhat.com/show_bug.cgi?id=2463365" xml:lang="en:us">
vllm: Uninitialized resource in KV Block Handler via has_mamba_layers function
    </Bugzilla>
    <CVSS3 status="draft">
        <CVSS3BaseScore>5.6</CVSS3BaseScore>
        <CVSS3ScoringVector>CVSS:3.1/AV:N/AC:H/PR:N/UI:N/S:U/C:L/I:L/A:L</CVSS3ScoringVector>
    </CVSS3>
    <CWE>CWE-908</CWE>
    <Details xml:lang="en:us" source="Red Hat">
A flaw was found in vllm. A remote attacker can exploit a vulnerability in the `has_mamba_layers` function within the KV Block Handler component. By performing a specific manipulation, the attacker can trigger the use of an uninitialized resource, potentially leading to information disclosure or denial of service. The attack complexity is high, making exploitation difficult.
    </Details>
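As context for the CWE-908 classification above: an uninitialized-resource flaw arises when a value is read before any code path has assigned it. The following is a purely illustrative sketch of that pattern, not vLLM's actual code; the class, field, and parameter names here are hypothetical.

```python
class KVBlockHandler:
    """Hypothetical sketch of a CWE-908 pattern: the cached flag is
    only assigned on one constructor branch."""

    def __init__(self, model_config=None):
        if model_config is not None:
            # The attribute is set only on this branch...
            self._has_mamba = "mamba" in model_config.get("layer_types", [])
        # ...on the other branch `_has_mamba` is never initialized.

    def has_mamba_layers(self):
        # Reading the attribute before it exists raises AttributeError,
        # the Python analogue of using an uninitialized resource.
        return self._has_mamba


def safe_has_mamba_layers(handler):
    # Defensive pattern: supply a default so the uninitialized
    # read path cannot be reached.
    return getattr(handler, "_has_mamba", False)
```

The usual fix for this class of bug is to initialize the attribute unconditionally in `__init__` (for example, to `False`) so every code path observes a defined value.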
    <Statement xml:lang="en:us">
This Moderate-impact vulnerability in vllm, affecting Red Hat AI Inference Server, Red Hat OpenShift AI, and Red Hat Enterprise Linux AI, involves an uninitialized resource in the `has_mamba_layers` function. Exploitation by a remote attacker could lead to information disclosure or denial of service, but the attack complexity is high, making exploitation difficult.
    </Statement>
    <Mitigation xml:lang="en:us">
Mitigation for this issue is either not available or the currently available options do not meet the Red Hat Product Security criteria comprising ease of use and deployment, applicability to widespread installation base, or stability.
    </Mitigation>
    <PackageState cpe="cpe:/a:redhat:ai_inference_server:3">
        <ProductName>Red Hat AI Inference Server</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhaiis/vllm-cpu-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:ai_inference_server:3">
        <ProductName>Red Hat AI Inference Server</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhaiis/vllm-cuda-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:ai_inference_server:3">
        <ProductName>Red Hat AI Inference Server</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhaiis/vllm-neuron-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:ai_inference_server:3">
        <ProductName>Red Hat AI Inference Server</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhaiis/vllm-rocm-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:ai_inference_server:3">
        <ProductName>Red Hat AI Inference Server</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhaiis/vllm-spyre-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:ai_inference_server:3">
        <ProductName>Red Hat AI Inference Server</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhaiis/vllm-tpu-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:enterprise_linux_ai:3">
        <ProductName>Red Hat Enterprise Linux AI (RHEL AI) 3</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhelai3/bootc-aws-cuda-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:enterprise_linux_ai:3">
        <ProductName>Red Hat Enterprise Linux AI (RHEL AI) 3</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhelai3/bootc-azure-cuda-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:enterprise_linux_ai:3">
        <ProductName>Red Hat Enterprise Linux AI (RHEL AI) 3</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhelai3/bootc-azure-rocm-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:enterprise_linux_ai:3">
        <ProductName>Red Hat Enterprise Linux AI (RHEL AI) 3</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhelai3/bootc-cuda-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:enterprise_linux_ai:3">
        <ProductName>Red Hat Enterprise Linux AI (RHEL AI) 3</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhelai3/bootc-gcp-cuda-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:enterprise_linux_ai:3">
        <ProductName>Red Hat Enterprise Linux AI (RHEL AI) 3</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhelai3/bootc-rocm-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:openshift_ai">
        <ProductName>Red Hat OpenShift AI (RHOAI)</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhoai/odh-kserve-agent-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:openshift_ai">
        <ProductName>Red Hat OpenShift AI (RHOAI)</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhoai/odh-kserve-controller-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:openshift_ai">
        <ProductName>Red Hat OpenShift AI (RHOAI)</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhoai/odh-kserve-router-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:openshift_ai">
        <ProductName>Red Hat OpenShift AI (RHOAI)</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhoai/odh-kserve-storage-initializer-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:openshift_ai">
        <ProductName>Red Hat OpenShift AI (RHOAI)</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhoai/odh-vllm-cuda-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:openshift_ai">
        <ProductName>Red Hat OpenShift AI (RHOAI)</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhoai/odh-vllm-gaudi-rhel9</PackageName>
    </PackageState>
    <PackageState cpe="cpe:/a:redhat:openshift_ai">
        <ProductName>Red Hat OpenShift AI (RHOAI)</ProductName>
        <FixState>Fix deferred</FixState>
        <PackageName>rhoai/odh-vllm-rocm-rhel9</PackageName>
    </PackageState>
    <References xml:lang="en:us">
https://www.cve.org/CVERecord?id=CVE-2026-7141
https://nvd.nist.gov/vuln/detail/CVE-2026-7141
https://github.com/AjAnubolu/vllm/commit/1ad67864c0c20f167929e64c875f5c28e1aad9fd
https://github.com/vllm-project/vllm/issues/39146
https://github.com/vllm-project/vllm/issues/39146#issue-4215090365
https://github.com/vllm-project/vllm/pull/39283
https://vuldb.com/submit/801297
https://vuldb.com/vuln/359740
https://vuldb.com/vuln/359740/cti
    </References>
</Vulnerability>