Deploy single-model on Red Hat OpenShift AI: inference endpoint internal URL always uses 'https', not 'http'
Issue
- When deploying a single-model serving platform on Red Hat OpenShift AI, the inference endpoint's internal URL always uses 'https' rather than 'http'. Refer to the following screenshot:
Environment
- Red Hat OpenShift AI 2.19