wget/curl cannot download a file larger than the RAM limit (when implementing memory limits via cgroups v2)
Issue
- Downloading a large file takes an unacceptably long time.
- A wget/curl session becomes extremely slow when the file being downloaded is larger than the cgroup v2 memory limit.
- The issue does not occur with cgroups v1.
Steps to reproduce:
- Create a test cgroup:
  mkdir /sys/fs/cgroup/testgroup
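If memory.max does not appear inside the new cgroup, the memory controller may not be delegated to child cgroups yet. A minimal check and fix, assuming the default cgroup v2 hierarchy mounted at /sys/fs/cgroup:
  # list the controllers delegated to children of the root cgroup
  cat /sys/fs/cgroup/cgroup.subtree_control
  # enable the memory controller for child cgroups if "memory" is missing
  echo "+memory" > /sys/fs/cgroup/cgroup.subtree_control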
- Associate a running process with the cgroup by writing its PID to the cgroup.procs file (here, the current shell):
  echo $$ > /sys/fs/cgroup/testgroup/cgroup.procs
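To confirm that the shell (and anything it spawns) was moved into the test cgroup, read the membership back; on cgroup v2 both of the following should point at testgroup:
  # shows the unified-hierarchy path of the current process, e.g. 0::/testgroup
  cat /proc/self/cgroup
  # lists the PIDs currently in the cgroup, including this shell's PID
  cat /sys/fs/cgroup/testgroup/cgroup.procs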
- Set a very small memory limit, for example approximately 100 MB (100000000 bytes):
  echo "100000000" > /sys/fs/cgroup/testgroup/memory.max
- Download a large file with wget from the shell inside the cgroup and observe the throughput:

Raw output:
  wget http://111.222.333.444/1GB.zip
  --2023-12-06 12:30:36--  http://111.222.333.444/1GB.zip
  Connecting to 111.222.333.444:80... connected.
  HTTP request sent, awaiting response... 200 OK
  Length: 1073741824 (1.0G) [application/zip]
  Saving to: ‘1GB.zip’

  1GB.zip    9%[======>   ]  94.79M  3.49KB/s   eta 12m 16s
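While the transfer crawls, the cgroup's own accounting files show where the memory goes; a quick way to watch it from a second shell outside the limited cgroup (same testgroup path as above):
  # total memory currently charged to the cgroup, in bytes
  cat /sys/fs/cgroup/testgroup/memory.current
  # page-cache figures; the downloaded file's cached data shows up here
  grep -E '^(inactive_file|active_file)' /sys/fs/cgroup/testgroup/memory.stat
  # event counters; the "max" count rises each time the limit is hit
  cat /sys/fs/cgroup/testgroup/memory.events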
Environment
- Red Hat Enterprise Linux (RHEL) 9.1, 9.2