Challenges uploading large files (vmcore files)
Hello team, please consider my request:
We have a three-layer network system, and the majority of our systems have 100 GB of RAM. If a system crashes, we get a core dump of roughly 100 GB.
Challenges uploading to the Red Hat site:
1: I split the 100 GB core file into 100 x 1 GB files (a splitting sketch follows this list). Through our jump servers I can upload the pieces to veritas.com without any issues, but I am not able to upload them to the RedHat.com site.
2: Because of the above issue, I need to copy all the split files to my local system, which takes more than 10 hours.
3: Once the files are on my local system, uploading each 1 GB file to the Red Hat dropbox takes about 15 minutes.
In total, an admin/engineer has to spend more than 24 hours on this, which is not acceptable to the customer.
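For reference, a minimal sketch of the splitting step with GNU coreutils; the vmcore path and output prefix are illustrative:
# Split the ~100 GB vmcore into 1 GB pieces (prefix vmcore.part. is illustrative):
split -b 1G /var/crash/vmcore vmcore.part.
# The pieces can be rejoined later with cat:
cat vmcore.part.* > vmcore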
Please consider my concerns and do the needful.
contact me if needed details at Hanok@us.ibm.com
Responses
Dear Hanok,
Red Hat support has a dropbox.redhat.com feature; to use it, you need to add the case number to the file name.
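As a sketch of what that can look like from a shell: the case number 01234567 is hypothetical, and the anonymous FTP upload shown is only the historically documented path, so please confirm the current upload procedure with Red Hat support:
# Prefix the file name with the case number so it is routed to the case:
mv vmcore.tar.gz 01234567-vmcore.tar.gz
# Historically, dropbox.redhat.com accepted anonymous uploads to its incoming directory:
curl -T 01234567-vmcore.tar.gz ftp://dropbox.redhat.com/incoming/ --user anonymous: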
Please do not use the phrase "do the needful". In some European languages it translates roughly to "do everything to solve my problem", and it was introduced by officers who served in colonial forces.
Regards,
Jan Gerrit
Dear Hanok,
Red Hat recommends having kdump filter out pages of the dump that are unlikely to be useful when analyzing the dump file. This will likely shrink the vmcore file to something more manageable. We recommend having -d 31 on the kdump.conf core_collector line:
core_collector makedumpfile -c --message-level 1 -d 31
See "How to troubleshoot kernel crashes, hangs, or reboots with kdump on Red Hat Enterprise Linux" for more information on configuring kdump.
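After editing /etc/kdump.conf, the change only takes effect once the kdump initramfs is rebuilt; a minimal sketch on a systemd-based release (RHEL 7 and later):
# -d 31 excludes zero, cache, private cache, user, and free pages from the dump.
grep '^core_collector' /etc/kdump.conf
# Restarting the service rebuilds the kdump initramfs with the new settings:
systemctl restart kdump
systemctl is-active kdump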
If you want to filter an existing vmcore, you could use makedumpfile to do that:
makedumpfile -c -d 31 vmcore vmcore.new
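If the filtered file is still too large to move in one piece, makedumpfile can also split while it filters; a sketch with illustrative output names, assuming a makedumpfile version that supports --split:
# Filter, compress, and split in one pass; the parts can later be rejoined
# with makedumpfile --reassemble:
makedumpfile --split -c -d 31 vmcore vmcore.1 vmcore.2 vmcore.3
ls -lh vmcore vmcore.*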
Regards,
Marc
