Configuring Ansible Lightspeed Intelligent Assistant (ALIA) for Managed AAP Cloud Customers
Prerequisites: What You Must Have First
Before our Site Reliability Engineering (SRE) team can configure your intelligent assistant, you must have a supported, self-hosted LLM provider deployed and accessible. The intelligent assistant relies on this LLM to function.
- LLM Provider Requirement: You must host your own LLM provider.
Note: Your LLM provider must be configured to allow secure access from the Ansible Lightspeed deployment.
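Before opening a ticket, you may want to verify that your inference API is reachable with your token. The sketch below is a hypothetical helper, not part of the managed service: the `/v1/models` path assumes an OpenAI-compatible API, and the URL and token are placeholders you must replace.

```shell
#!/bin/sh
# check_llm_endpoint URL TOKEN
# Sends an authenticated GET to the inference API and reports whether it
# responded successfully. Hypothetical helper; adjust the path if your
# provider is not OpenAI-compatible.
check_llm_endpoint() {
    url="$1"
    token="$2"
    if curl -fsS -H "Authorization: Bearer $token" "$url" > /dev/null; then
        echo "Endpoint reachable: $url"
    else
        echo "Endpoint NOT reachable: $url" >&2
        return 1
    fi
}

# Example with placeholder values:
# check_llm_endpoint 'https://your_inference_api/v1/models' "$LLM_API_TOKEN"
```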
ALIA Configuration Process
As a managed cloud customer, the Red Hat SRE team will handle the entire setup process for you.
Here is the step-by-step process:
- Prepare for installation: Gather all the required LLM credentials.
- Create a support ticket: Notify the SRE team that you are starting the process.
- Create and store credentials: Store the required connection details in a plain-text file (see "Information Required in Text File" below).
- Share the file: Run the command shown below to upload the text file.
- Receive confirmation: Once our SRE team has received your ticket and text file and confirmed that all necessary information is present, we will confirm receipt in the support ticket.
- Implementation: The SRE team will apply the configuration to your managed cloud deployment. Once completed, they will update the support ticket, allowing you to verify the connection and begin using ALIA. Implementation will be completed within 5 business days of our confirmation.
Information Required in Text File
To prevent delays, please gather the following information from your LLM provider deployment and include it in your text file. Our SREs will use this information to configure the connection.
- LLM Model Name: The specific name of the LLM model configured on your setup (e.g., chatbot_model).
- Inference API Base URL: The full base URL for the inference API (e.g., https://your_inference_api/v1).
- API Token / Key: The API token or key required to authorize calls to your LLM's inference API.
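As a sketch, the text file could be created as follows. The filename and every value shown are placeholders; substitute the model name, base URL, and token from your own deployment.

```shell
# Create the credentials file with the three required values.
# All values below are placeholders taken from the examples above.
cat > alia_credentials.txt <<'EOF'
LLM Model Name: chatbot_model
Inference API Base URL: https://your_inference_api/v1
API Token / Key: <your_api_token>
EOF
```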
Command to share credential information:
Run this command:
curl -X PUT --upload-file <text_file_name> 'https://unique_url_provided_by_SRE_team'
The quotes around the URL are important, as the URL provided by the SRE team will contain special characters.
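If you prefer a check that the upload succeeded, the command can be wrapped as below. This is a hypothetical convenience wrapper: the filename is an example, and the URL must be the one-time URL from your support ticket. Note that `--upload-file` already implies an HTTP PUT, so `-X PUT` can be omitted.

```shell
#!/bin/sh
# share_credentials FILE URL
# Uploads FILE to the one-time URL provided by the SRE team and reports
# success or failure. -f fails on HTTP errors; -sS stays quiet except
# for error messages.
share_credentials() {
    file="$1"
    url="$2"
    if curl -fsS --upload-file "$file" "$url"; then
        echo "Upload of $file succeeded."
    else
        echo "Upload of $file failed; verify the URL from the ticket." >&2
        return 1
    fi
}

# Example with placeholder values (keep the single quotes around the URL):
# share_credentials alia_credentials.txt 'https://unique_url_provided_by_SRE_team'
```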
Requirement and Purpose:
The API Token / Key is a mandatory requirement for the Red Hat Site Reliability Engineering (SRE) team to perform the initial, one-time configuration of the Ansible Lightspeed intelligent assistant (ALIA) connection to your self-hosted LLM provider. Without this credential, the connection cannot be established, and ALIA cannot be enabled.
Security and Exposure Limitation:
We understand that customers are cautious about sharing sensitive credentials. Once used for the configuration, the token/key is securely stored and its access is strictly limited within your managed application deployment. The SRE team will not expose or utilize the token/key for any purpose other than maintaining and troubleshooting the ALIA-LLM connection as part of the managed service. Its exposure is confined solely to this specific configuration and support activity.