import/export patch content between connected and disconnected Satellite

Hey All,
before I begin, here is my situation: we want to use Satellite to patch our RHEL servers across various air-gapped networks. We have a connected Satellite server that we use to download all the patch content we need. We then need to import all of that content from the connected Satellite to various air-gapped/disconnected Satellite servers.

I'm trying to figure out the best method to do this. This is what I'm thinking so far:

On the connected Satellite, I create a content view called "Master_Content". This view has all the content I want to import into my disconnected Satellite servers (currently 550 GB). I then run an export of the content view to a folder, copy the folder to an external storage device, and carry it over to my first disconnected Satellite server. There I create a new folder and copy the content into it. I log into the disconnected Satellite web UI, update the Red Hat CDN URL to point at the new folder, and upload the manifest.
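The connected-side export described above can be sketched with hammer, assuming the older `content-view version export` syntax; the export directory, version ID, and organization are placeholders you would look up on your own server:

```shell
# Sketch of the connected-side export (IDs and paths are placeholders).
EXPORT_DIR=/var/lib/pulp/katello-export   # common default export location
CV_VERSION_ID=3                           # find yours with: hammer content-view version list

# Guard so this sketch is a harmless no-op on machines without hammer installed.
if command -v hammer >/dev/null 2>&1; then
  hammer content-view version export --export-dir "$EXPORT_DIR" --id "$CV_VERSION_ID"
fi
```

The resulting export folder is what gets copied to the external storage device and carried across the air gap.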

At this point I have a local repository of content for the disconnected Satellite. The disconnected Satellite server does not have any content views yet, which gives me the flexibility to leverage content views for that network as I see fit.

I think this is the correct way to do this, but the questions I have now are:

  1. I did my first large export of content to my disconnected Satellites, and in two weeks I need to repeat the process, but I only want to export/import the new packages rather than the entire content set. Is this possible? My thought is that I would create a new version of the "Master_Content" content view with a filter to exclude any content from the previous version, then export/import that into the disconnected Satellite. But I have a follow-up question:
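One way to approximate "only content since the last export" is an erratum date filter on the content view before publishing the new version. This is a sketch, not the only approach; the filter name, date, and organization are placeholders:

```shell
# Sketch: include only errata released since the previous export
# (filter name, date, and organization are placeholders).
CV_NAME="Master_Content"
ORG="Default Organization"
SINCE="2019-06-01"   # date of the previous export

# Guard so this sketch is a harmless no-op without a Satellite server.
if command -v hammer >/dev/null 2>&1; then
  hammer content-view filter create --organization "$ORG" \
    --content-view "$CV_NAME" --name "since-last-export" \
    --type erratum --inclusion true
  hammer content-view filter rule create --organization "$ORG" \
    --content-view "$CV_NAME" --content-view-filter "since-last-export" \
    --start-date "$SINCE" --types security,bugfix,enhancement
  hammer content-view publish --organization "$ORG" --name "$CV_NAME"
fi
```

Note that an erratum date filter only catches packages shipped in errata, so it is an approximation of "everything new", not a guarantee.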

  2. The content view export folder name is based on the version, and my disconnected Satellite CDN is pointed at the original content view's folder. How would I import the new content into the original content on the disconnected Satellite? The challenge I'm seeing is that the disconnected Satellite doesn't see the exported content view as a true content view, so how do I serve two content view exports from one CDN URL?

I suspect I'm on the right track, but I feel I'm missing some information that could help. This is why I'm reaching out, I'm hoping someone can give me some perspective on how I'm handling this, if I'm on the right track, or off to left field so to speak.

Thanks in advance for any assistance.

Responses

Hello, for question 1, does the Incremental Updates section of the Content Management Guide help?

As to question 2, I have not done this so someone else might have a better idea, but:

When you originally imported the first exported CV, it became a repo on the disconnected Satellite, and by default those RPMs are in the Library. See Importing a Content View as a Red Hat Repository.

You could create Content Views (CV) from that Library if you wanted to control when the Content Hosts get updated.

If you followed the procedure Importing a Content View as a Red Hat Repository verbatim each time, I think you would end up with a new repo every time, because step 4 implies creating a new directory name, for example Export_CV/1.0/. But you do not want to create a new repo, just update the existing one.

I suspect (best to wait for others to confirm) that if you have the incremental update mounted in the same location as when you first imported content and created a repo, you could just use the repo sync function to update the existing repo. I am not sure why we did not mention this in the docs; I will search to see whether we already have a bug filed for this, and if not, I will raise one.
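If that suspicion holds, the re-sync step would be a plain `hammer repository synchronize` against the existing repo once the incremental export is mounted at the same CDN path. A sketch, with the organization, product, and repo names as placeholders:

```shell
# Sketch: with the incremental export mounted at the same CDN path as before,
# re-sync the existing repo rather than creating a new one
# (organization, product, and repo names are placeholders).
ORG="Default Organization"

# Guard so this sketch is a harmless no-op without a Satellite server.
if command -v hammer >/dev/null 2>&1; then
  hammer repository synchronize --organization "$ORG" \
    --product "Red Hat Enterprise Linux Server" \
    --name "Red Hat Enterprise Linux 7 Server RPMs x86_64 7Server"
fi
```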

Thank you

Great find! I basically have the same scenario. I need a full mirror of the CDN on my disconnected Satellite, with the ability to incrementally update that mirror.

Hello, thanks for the reply. The Incremental Updates section does help. It appears it may be more efficient to perform exports/imports of repos rather than of content views. When importing a CV into a disconnected Satellite, that Satellite doesn't really see it as a CV, just a folder of files, so it's not a real CV at that point. So I'm thinking I should export each repo (hammer repo export), import it into a generic folder, then run hammer repo sync commands to sync the content. Working at the repo layer, I could also take advantage of the incremental update procedures you linked above. Quick question though: the hammer command expects a specific repo ID for each export, and we have about two dozen repos to export/import. Is there a way to specify all the IDs in a single command, or will I have to look at creating some scripts?
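As far as I know there is no single hammer invocation that takes a list of repo IDs, but a short loop over `hammer --csv repository list` covers all of them without much scripting. A sketch, with the organization ID as a placeholder:

```shell
# Sketch: export every repository in the organization in one pass
# (organization ID is a placeholder; adjust for your server).
ORG_ID=1

list_repo_ids() {
  # Strip the CSV header and keep the first column (the repo ID).
  awk -F, 'NR>1 {print $1}'
}

# Guard so this sketch is a harmless no-op without a Satellite server.
if command -v hammer >/dev/null 2>&1; then
  hammer --csv repository list --organization-id "$ORG_ID" | list_repo_ids |
  while read -r id; do
    hammer repository export --id "$id"
  done
fi
```

The same loop shape works for the `hammer repository synchronize` step on the disconnected side.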

Would be curious whether there is a document similar to "Building a CDN mirror - A practical example using Content ISOs" but using ISS, and a follow-up on importing incremental updates to your mirror using ISS as well?
