curl crashes with a segmentation fault when copying a large file from an intranet sftp site using an sftp:// URL

Solution Verified

Issue

  • curl on RHEL 6 has a bug that causes a segmentation fault when a large file is copied from an intranet sftp site using an sftp:// URL.
  • The problem occurs when the file size exceeds 2 GB.
  • Steps to reproduce the problem:
    • Create a large file:
# dd if=/dev/zero of=file.txt bs=1024 count=2500000
2500000+0 records in
2500000+0 records out
2560000000 bytes (2.6 GB) copied, 8.42809 s, 304 MB/s
  • Download the file using sftp:
# curl -vu '$user:$passwd' --connect-timeout 5 sftp://remote.host/dir/file.txt > /dev/null
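The 2 GB threshold above coincides with the range of a signed 32-bit file offset (2^31 bytes), which is a plausible but unconfirmed explanation for the crash. As a quick sanity check (a sketch, not part of the official reproducer), the file produced by the dd command does cross that boundary:

```shell
# Size of the test file created by the dd command above
size=$((1024 * 2500000))      # 2560000000 bytes
# 2 GiB boundary of a signed 32-bit offset (2^31 bytes)
limit=2147483648

echo "file size:   $size bytes"
echo "2 GiB limit: $limit bytes"
if [ "$size" -gt "$limit" ]; then
    echo "file exceeds the 32-bit signed offset range"
fi
```

Any file larger than 2147483648 bytes should therefore trigger the same behavior; a smaller count value (for example, count=2000000, about 2.0 GB) would stay below the boundary.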

Environment

  • Red Hat Enterprise Linux 6
  • curl
  • libssh2-1.2.2-7.el6_1.1
