24 changes: 24 additions & 0 deletions config/crd/bases/aiconnect.ansible.com_ansibleaiconnects.yaml
@@ -61,6 +61,30 @@ spec:
chatbot_image_version:
description: Chatbot API container image tag or version to use
type: string
chatbot_rag_db_image:
description: Registry path to Chatbot RAG database container image to use
type: string
chatbot_rag_db_image_version:
description: Chatbot RAG database container image tag or version to use
type: string
chatbot_mcp_gateway_image:
description: Registry path to AAP Gateway MCP server container image to use
type: string
chatbot_mcp_gateway_image_version:
description: AAP Gateway MCP server container image tag or version to use
type: string
chatbot_mcp_controller_image:
description: Registry path to AAP Controller MCP server container image to use
type: string
chatbot_mcp_controller_image_version:
description: AAP Controller MCP server container image tag or version to use
type: string
chatbot_mcp_lightspeed_image:
description: Registry path to AAP Lightspeed MCP server container image to use
type: string
chatbot_mcp_lightspeed_image_version:
description: AAP Lightspeed MCP server container image tag or version to use
type: string
additional_labels:
description: Additional labels defined on the resource, which should be propagated to child resources
type: array
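For context, the new spec properties can be set on an `AnsibleAIConnect` resource roughly as follows; this is a minimal sketch in which the image repositories mirror the role defaults, while the resource name and tags are illustrative:

```yaml
apiVersion: aiconnect.ansible.com/v1alpha1
kind: AnsibleAIConnect
metadata:
  name: my-aiconnect          # illustrative resource name
spec:
  # Chatbot RAG database image override (tag is illustrative)
  chatbot_rag_db_image: quay.io/ansible/aap-rag-content
  chatbot_rag_db_image_version: latest
  # MCP server image overrides (tags are illustrative)
  chatbot_mcp_gateway_image: quay.io/ansible/ansible-mcp-gateway
  chatbot_mcp_gateway_image_version: latest
  chatbot_mcp_controller_image: quay.io/ansible/ansible-mcp-controller
  chatbot_mcp_controller_image_version: latest
  chatbot_mcp_lightspeed_image: quay.io/ansible/ansible-mcp-lightspeed
  chatbot_mcp_lightspeed_image_version: latest
```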
@@ -53,6 +53,38 @@ spec:
path: chatbot_image_version
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:advanced
- displayName: Chatbot RAG database container image
path: chatbot_rag_db_image
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:advanced
- displayName: Chatbot RAG database container image Version
path: chatbot_rag_db_image_version
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:advanced
- displayName: AAP Gateway MCP server container image
path: chatbot_mcp_gateway_image
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:advanced
- displayName: AAP Gateway MCP server container image Version
path: chatbot_mcp_gateway_image_version
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:advanced
- displayName: AAP Controller MCP server container image
path: chatbot_mcp_controller_image
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:advanced
- displayName: AAP Controller MCP server container image Version
path: chatbot_mcp_controller_image_version
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:advanced
- displayName: AAP Lightspeed MCP server container image
path: chatbot_mcp_lightspeed_image
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:advanced
- displayName: AAP Lightspeed MCP server container image Version
path: chatbot_mcp_lightspeed_image_version
x-descriptors:
- urn:alm:descriptor:com.tectonic.ui:advanced
- displayName: Additional labels defined on the resource, which should be propagated
to child resources
path: additional_labels
45 changes: 13 additions & 32 deletions docs/using-external-configuration-secrets.md
@@ -88,10 +88,8 @@ stringData:
chatbot_url: <Chatbot LLM URL>
chatbot_model: <Chatbot model name>
chatbot_token: <Chatbot LLM access token>
chatbot_llm_provider_type: <Chatbot LLM provider type>
chatbot_llm_provider_project_id: <Chatbot LLM provider project id>
chatbot_context_window_size: <Chatbot LLM context window size>
chatbot_temperature_override: <Chatbot LLM temperature parameter>
aap_gateway_url: <AAP Gateway URL>
aap_controller_url: <AAP Controller URL>
type: Opaque
```
**Required Parameters**
@@ -100,38 +98,21 @@ type: Opaque
* `chatbot_model`
* `chatbot_token`

**Optional Parameters**
**Parameter Combinations**

Both `chatbot_llm_provider_type` and `chatbot_context_window_size` are optional. If either is omitted, the
following default values will be used:
Providing `aap_gateway_url` and/or `aap_controller_url` affects how the Chatbot is provisioned.

* `chatbot_llm_provider_type`: `rhoai_vllm`
* `chatbot_context_window_size`: `128000`
If neither parameter is provided, no MCP servers will be provisioned or registered with the underlying LLM's tool runtime.

When `chatbot_llm_provider_type` is set to `watsonx`,
`chatbot_llm_provider_project_id` is required to set to your watsonx.ai project ID.
If only `aap_gateway_url` is set, the following MCP server will be provisioned:
- Ansible Lightspeed Service MCP server
  - Authentication will attempt to use the JWT token associated with the User's authenticated context.

`chatbot_temperature_override` is also optional. It is provided for
following Open AI models that do not support the default temperature setting used for
other Open AI models:

* `o1`
* `o3-mini`
* `o4-mini`

When one of these models is used, set `chatbot_temperature_override` to `null`,
which disables the default temperature setting.


**_Azure AI_ Parameters**

In case of using the _Azure AI_ model provider, please also ensure the following required parameters are properly set:
* `chatbot_llm_provider_type`: `azure_openai`
* `chatbot_url`: `<<Azure AI project serving URL>>`
* `chatbot_model`: `<<Azure AI model name>>`
* `chatbot_token`: `<<Azure AI access token>>`
* `chatbot_azure_deployment_name`: `<<Azure AI [deployment name](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource?pivots=web-portal#deploy-a-model)>>`
* `chatbot_azure_api_version`: `<<Optional - Azure AI API version>>`
If both `aap_gateway_url` and `aap_controller_url` are set, the following MCP servers will be provisioned:
- AAP Controller Service MCP server
  - Authentication will attempt to use the JWT token associated with the User's authenticated context.
- Ansible Lightspeed Service MCP server
  - Authentication will attempt to use the JWT token associated with the User's authenticated context.
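
For example, a configuration Secret that enables both MCP servers includes both URLs alongside the required parameters (metadata omitted; all values below are placeholders):

```yaml
stringData:
  chatbot_url: https://llm.example.com/v1            # placeholder LLM inference URL
  chatbot_model: granite-example                     # placeholder model name
  chatbot_token: <Chatbot LLM access token>
  aap_gateway_url: https://aap.example.com           # placeholder AAP Gateway URL
  aap_controller_url: https://aap.example.com/api/controller/  # placeholder AAP Controller URL
type: Opaque
```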

## Troubleshooting

14 changes: 13 additions & 1 deletion roles/chatbot/defaults/main.yml
@@ -7,9 +7,21 @@
api_version: 'aiconnect.ansible.com/v1alpha1'
deployment_type: 'ansible-ai-connect'

_chatbot_image: quay.io/ansible/ansible-chatbot-service
# Chatbot container image
_chatbot_image: quay.io/ansible/ansible-chatbot-stack
_chatbot_image_version: "{{ lookup('env', 'DEFAULT_CHATBOT_AI_CONNECT_VERSION') or 'latest' }}"

# Chatbot RAG database container images
_chatbot_rag_db_image: quay.io/ansible/aap-rag-content
_chatbot_rag_db_image_version: "{{ lookup('env', 'DEFAULT_CHATBOT_RAG_DB_VERSION') or 'latest' }}"

# Chatbot MCP Server container images
_chatbot_mcp_gateway_image: quay.io/ansible/ansible-mcp-gateway
_chatbot_mcp_gateway_image_version: "{{ lookup('env', 'DEFAULT_CHATBOT_MCP_GATEWAY_VERSION') or 'latest' }}"
_chatbot_mcp_controller_image: quay.io/ansible/ansible-mcp-controller
_chatbot_mcp_controller_image_version: "{{ lookup('env', 'DEFAULT_CHATBOT_MCP_CONTROLLER_VERSION') or 'latest' }}"
_chatbot_mcp_lightspeed_image: quay.io/ansible/ansible-mcp-lightspeed
_chatbot_mcp_lightspeed_image_version: "{{ lookup('env', 'DEFAULT_CHATBOT_MCP_LIGHTSPEED_VERSION') or 'latest' }}"
# ========================================


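Each of these version defaults resolves from the operator's environment and falls back to `latest`. A sketch of pinning one of them, assuming the variable is exposed on the operator's manager Deployment (the value shown is illustrative):

```yaml
# Hypothetical env entry on the operator manager Deployment;
# when the variable is unset, the role default above resolves to 'latest'.
env:
  - name: DEFAULT_CHATBOT_MCP_GATEWAY_VERSION
    value: "1.0.0"    # illustrative tag
```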
12 changes: 5 additions & 7 deletions roles/chatbot/tasks/deploy_chatbot_api.yml
@@ -1,17 +1,15 @@
---
- name: Apply Chatbot ConfigMap resources
kubernetes.core.k8s:
apply: yes
definition: "{{ lookup('template', 'chatbot.configmap.yaml.j2') }}"
wait: yes

- name: Apply Chatbot deployment resources
kubernetes.core.k8s:
apply: yes
definition: "{{ lookup('template', item + '.yaml.j2') }}"
wait: no
loop:
- 'chatbot.pvc'
- 'chatbot.service'
- 'chatbot.configmap_lightspeed_stack_config'
- 'chatbot.configmap_llama_stack_config'
- 'chatbot.configmap_system_prompt'
- 'chatbot.deployment'

- name: Check for Chatbot Pod
@@ -30,7 +28,7 @@
- "chatbot_api_pod['resources'] | length"
- "chatbot_api_pod['resources'][0]['status']['phase'] == 'Running'"
- "chatbot_api_pod['resources'][0]['status']['containerStatuses'][0]['ready'] == true"
retries: 60
retries: 30
delay: 5

- name: Set the Chatbot API Pod name as a variable.
3 changes: 3 additions & 0 deletions roles/chatbot/tasks/main.yml
@@ -8,6 +8,9 @@
- name: Set AnsibleAIConnect's Chatbot service images
ansible.builtin.include_tasks: set_images.yml

- name: Set AnsibleAIConnect and AnsibleAIConnect's Chatbot service URLs
ansible.builtin.include_tasks: set_service_urls.yml

- name: Read AnsibleAIConnect's Chatbot secret
ansible.builtin.include_tasks: read_chatbot_configuration_secret.yml

78 changes: 12 additions & 66 deletions roles/chatbot/tasks/read_chatbot_configuration_secret.yml
@@ -43,82 +43,28 @@
You must specify a 'chatbot_token' in your Secret.
when: not _chatbot_config_resource["resources"][0]["data"].chatbot_token

- name: Set Chatbot Configuration
- name: Set Chatbot Token
ansible.builtin.set_fact:
chatbot_config: '{{ _chatbot_config_resource }}'
chatbot_token: '{{ _chatbot_config_resource["resources"][0]["data"].chatbot_token | b64decode }}'
no_log: "{{ no_log }}"

- name: Set LLM provider type if it is defined in the config
- name: Set AAP Gateway URL
ansible.builtin.set_fact:
chatbot_llm_provider_type: '{{ _chatbot_config_resource["resources"][0]["data"].chatbot_llm_provider_type | b64decode }}'
_aap_gateway_url: '{{ _chatbot_config_resource["resources"][0]["data"].aap_gateway_url | b64decode }}'
no_log: "{{ no_log }}"
when: _chatbot_config_resource["resources"][0]["data"].chatbot_llm_provider_type is defined

- name: Validate watsonx.ai project ID if LLM provider type is set to "watsonx"
ansible.builtin.fail:
msg: |
You must specify a 'chatbot_llm_provider_project_id' in your Secret when 'chatbot_llm_provider_type' is set to 'watsonx'
when:
- chatbot_llm_provider_type is defined
- chatbot_llm_provider_type == "watsonx"
- not _chatbot_config_resource["resources"][0]["data"].chatbot_llm_provider_project_id is defined

- name: Set watsonx.ai project ID if it is defined in the config
ansible.builtin.set_fact:
chatbot_llm_provider_project_id: '{{ _chatbot_config_resource["resources"][0]["data"].chatbot_llm_provider_project_id | b64decode }}'
no_log: "{{ no_log }}"
when: _chatbot_config_resource["resources"][0]["data"].chatbot_llm_provider_project_id is defined
- _chatbot_config_resource["resources"][0]["data"].aap_gateway_url is defined
- _chatbot_config_resource["resources"][0]["data"].aap_gateway_url | length

- name: Set context window size if it is defined in the config
- name: Set AAP Controller URL
ansible.builtin.set_fact:
chatbot_context_window_size: '{{ _chatbot_config_resource["resources"][0]["data"].chatbot_context_window_size | b64decode | int }}'
no_log: "{{ no_log }}"
when: _chatbot_config_resource["resources"][0]["data"].chatbot_context_window_size is defined

- name: Set LLM temperature parameter override if it is defined in the config
ansible.builtin.set_fact:
chatbot_temperature_override: '{{ _chatbot_config_resource["resources"][0]["data"].chatbot_temperature_override | b64decode }}'
no_log: "{{ no_log }}"
when: _chatbot_config_resource["resources"][0]["data"].chatbot_temperature_override is defined

- name: Set Azure AI deployment name if it is defined in the config
ansible.builtin.set_fact:
chatbot_azure_deployment_name: '{{ _chatbot_config_resource["resources"][0]["data"].chatbot_azure_deployment_name | b64decode }}'
_aap_controller_url: '{{ _chatbot_config_resource["resources"][0]["data"].aap_controller_url | b64decode }}'
no_log: "{{ no_log }}"
when:
- chatbot_llm_provider_type is defined
- chatbot_llm_provider_type is search("azure_openai")
- _chatbot_config_resource["resources"][0]["data"].chatbot_azure_deployment_name is defined
- _chatbot_config_resource["resources"][0]["data"].aap_controller_url is defined
- _chatbot_config_resource["resources"][0]["data"].aap_controller_url | length

- name: Set Azure AI API version if it is defined in the config
ansible.builtin.set_fact:
chatbot_azure_api_version: '{{ _chatbot_config_resource["resources"][0]["data"].chatbot_azure_api_version | b64decode }}'
no_log: "{{ no_log }}"
when:
- chatbot_llm_provider_type is defined
- chatbot_llm_provider_type is search("azure_openai")
- _chatbot_config_resource["resources"][0]["data"].chatbot_azure_api_version is defined

- name: Set Chatbot Include Fake LLMs to false if it is not defined in the config
ansible.builtin.set_fact:
chatbot_include_fake_llms: false
no_log: "{{ no_log }}"
when: not _chatbot_config_resource["resources"][0]["data"].chatbot_include_fake_llms is defined

- name: Set Chatbot Include Fake LLMs if it is defined in the config
ansible.builtin.set_fact:
chatbot_include_fake_llms: '{{ _chatbot_config_resource["resources"][0]["data"].chatbot_include_fake_llms | default(false) | b64decode }}'
no_log: "{{ no_log }}"
when: _chatbot_config_resource["resources"][0]["data"].chatbot_include_fake_llms is defined

- name: Set Chatbot Fake Streaming Chunks
ansible.builtin.set_fact:
chatbot_fake_streaming_chunks: '{{ _chatbot_config_resource["resources"][0]["data"].chatbot_fake_streaming_chunks | b64decode }}'
no_log: "{{ no_log }}"
when: _chatbot_config_resource["resources"][0]["data"].chatbot_fake_streaming_chunks is defined

- name: Set Chatbot Fake Streaming Sleep
- name: Set Chatbot Configuration
ansible.builtin.set_fact:
chatbot_fake_streaming_sleep: '{{ _chatbot_config_resource["resources"][0]["data"].chatbot_fake_streaming_sleep | b64decode }}'
chatbot_config: '{{ _chatbot_config_resource }}'
no_log: "{{ no_log }}"
when: _chatbot_config_resource["resources"][0]["data"].chatbot_fake_streaming_sleep is defined
27 changes: 25 additions & 2 deletions roles/chatbot/tasks/remove_chatbot_api.yml
@@ -1,9 +1,25 @@
---
- name: Remove Chatbot ConfigMap resources
- name: Remove Chatbot Lightspeed Stack ConfigMap resources
kubernetes.core.k8s:
state: absent
kind: ConfigMap
name: '{{ ansible_operator_meta.name }}-{{ deployment_type }}-chatbot-env-properties'
name: '{{ ansible_operator_meta.name }}-{{ deployment_type }}-lightspeed-stack-config'
namespace: '{{ ansible_operator_meta.namespace }}'
wait: yes

- name: Remove Chatbot Llama Stack ConfigMap resources
kubernetes.core.k8s:
state: absent
kind: ConfigMap
name: '{{ ansible_operator_meta.name }}-{{ deployment_type }}-llama-stack-config'
namespace: '{{ ansible_operator_meta.namespace }}'
wait: yes

- name: Remove Chatbot System Prompt ConfigMap resources
kubernetes.core.k8s:
state: absent
kind: ConfigMap
name: '{{ ansible_operator_meta.name }}-{{ deployment_type }}-system-prompt'
namespace: '{{ ansible_operator_meta.namespace }}'
wait: yes

@@ -22,3 +38,10 @@
name: '{{ ansible_operator_meta.name }}-chatbot-api'
namespace: '{{ ansible_operator_meta.namespace }}'
wait: yes

- name: Remove Chatbot PVC
kubernetes.core.k8s:
state: absent
kind: PersistentVolumeClaim
name: "{{ ansible_operator_meta.name }}-chatbot-pvc"
namespace: "{{ ansible_operator_meta.namespace }}"