@@ -0,0 +1,28 @@
:_newdoc-version: 2.18.3
:_template-generated: 2025-03-17

ifdef::context[:parent-context-of-solution-server-configurations: {context}]

:_mod-docs-content-type: ASSEMBLY

ifndef::context[]
[id="solution-server-configurations"]
endif::[]
ifdef::context[]
[id="solution-server-configurations_{context}"]
endif::[]
= Solution Server Configurations
:context: solution-server-configurations

[role=_abstract]
The Solution Server is a component that allows {mta-dl-plugin} to build a collective memory of code changes from all analyses performed in an organization. In the local context of an analysis run in Visual Studio (VS) Code, the Solution Server draws on previous patterns of how code changed to resolve issues (also called solved examples) that were similar to those in the current file, and suggests a resolution with a higher confidence level derived from those previous solutions.
The Solution Server delivers two primary benefits to users:

- **Contextual Hints:** It surfaces examples of past migration solutions—including successful user modifications and accepted fixes—offering actionable hints for difficult or previously unsolved migration problems.
- **Migration Success Metrics:** It exposes detailed success metrics for each migration rule, derived from real-world usage data. These metrics can be used by IDEs or automation tools to present users with a “confidence level”, that is, the likelihood of {mta-dl-plugin} successfully migrating a given code segment.


As {mta-dl-plugin} is an optional set of features in {ProductShortName}, you must complete the following configurations before you can access the settings necessary for AI analysis.

include::topics/developer-lightspeed/proc_tackle-llm-secret.adoc[leveloffset=+1]

include::topics/developer-lightspeed/proc_tackle-enable-dev-lightspeed.adoc[leveloffset=+1]

ifdef::parent-context-of-solution-server-configurations[:context: {parent-context-of-solution-server-configurations}]
ifndef::parent-context-of-solution-server-configurations[:!context:]

1 change: 1 addition & 0 deletions assemblies/developer-lightspeed-guide/topics
1 change: 1 addition & 0 deletions docs/developer-lightspeed-guide/assemblies
13 changes: 13 additions & 0 deletions docs/developer-lightspeed-guide/master-docinfo.xml
@@ -0,0 +1,13 @@
<title>MTA with Developer Lightspeed Guide</title>
<productname>{DocInfoProductName}</productname>
<productnumber>{DocInfoProductNumber}</productnumber>
<subtitle>Identify and resolve migration issues by analyzing your applications using {mta-dl-plugin}.</subtitle>
<abstract>
<para>
This guide describes how to use {mta-dl-plugin} to accelerate large-scale application modernization efforts across hybrid cloud environments on Red Hat OpenShift.
</para>
</abstract>
<authorgroup>
<orgname>Red Hat Customer Content Services</orgname>
</authorgroup>
<xi:include href="Common_Content/Legal_Notice.xml" xmlns:xi="http://www.w3.org/2001/XInclude" />
21 changes: 21 additions & 0 deletions docs/developer-lightspeed-guide/master.adoc
@@ -0,0 +1,21 @@
:mta:
include::topics/templates/document-attributes.adoc[]
:_mod-docs-content-type: ASSEMBLY
[id="mta-developer-lightspeed"]
= MTA with Developer Lightspeed

:toc:
:toclevels: 4
:numbered:
:imagesdir: topics/images
:context: mta-developer-lightspeed
:mta-developer-lightspeed:

//Inclusive language statement
include::topics/making-open-source-more-inclusive.adoc[]



include::assemblies/developer-lightspeed-guide/assembly_solution-server-configurations.adoc[leveloffset=+1]

:!mta-developer-lightspeed:
1 change: 1 addition & 0 deletions docs/developer-lightspeed-guide/topics
@@ -0,0 +1,63 @@
:_newdoc-version: 2.15.0
:_template-generated: 2024-2-21
:_mod-docs-content-type: PROCEDURE

[id="tackle-enable-dev-lightspeed_{context}"]
= Enabling {mta-dl-plugin} in Tackle custom resource

[role="_abstract"]
Solution Server integrates with the {ProductShortName} Hub backend component to use the database and volumes necessary to store and retrieve the solved examples.

To enable Solution Server and other AI configurations in the {ProductShortName} VS Code extension, you must modify the Tackle custom resource (CR) with additional parameters.

.Prerequisites

//the hard link must be changed to the same topic in 8.0.0 that has the `{mta-dl-plugin}-database` req.
. You deployed an additional RWO volume for `{mta-dl-plugin}-database` if you want to use {mta-dl-plugin}. See link:https://docs.redhat.com/en/documentation/migration_toolkit_for_applications/7.3/html/user_interface_guide/mta-7-installing-web-console-on-openshift_user-interface-guide#openshift-persistent-volume-requirements_user-interface-guide[Persistent volume requirements] for more information.

. You installed the {ProductShortName} operator v8.0.0.


.Procedure

. Log in to the {ocp-short} cluster and switch to the `openshift-mta` project.

. Edit the Tackle custom resource (CR) settings in the `tackle_hub.yaml` file to enable {mta-dl-plugin}, and then enter applicable values for the `kai_llm_provider` and `kai_llm_model` variables.
+
[source, yaml]
----
---
kind: Tackle
apiVersion: tackle.konveyor.io/v1alpha1
metadata:
  name: mta
  namespace: openshift-mta
spec:
  kai_solution_server_enabled: true
  kai_llm_provider: ChatIBMGenAI
  # Optional: choose a suitable model for your provider
  kai_llm_model: <model-name>
...
----
+
//How about Amazon Bedrock, Deepseek, and Azure OpenAI?
[NOTE]
====
The default large language model (LLM) provider is `ChatIBMGenAI`. You can also enter `google` or `openai` as the provider.
====
. Apply the Tackle CR by using the following command:
+
[source, terminal]
----
$ kubectl apply -f tackle_hub.yaml
----

.Verification

. Enter the following command to verify the {mta-dl-plugin} resources:
+
[source, terminal]
----
$ kubectl get deploy,svc -n openshift-mta | grep -E 'kai-(api|db|importer)'
----
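The procedure above sets `ChatIBMGenAI` as the provider. For comparison, the relevant part of a Tackle CR spec for an OpenAI-compatible provider might look like the following sketch; the base URL is an illustrative value, and `kai_llm_baseurl` is the CR variable for overriding the provider endpoint:

[source, yaml]
----
spec:
  kai_solution_server_enabled: true
  kai_llm_provider: openai
  # Illustrative values; substitute the model and endpoint for your provider
  kai_llm_model: <model-name>
  kai_llm_baseurl: https://api.openai.com/v1
----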
47 changes: 47 additions & 0 deletions docs/topics/developer-lightspeed/proc_tackle-llm-secret.adoc
@@ -0,0 +1,47 @@
:_newdoc-version: 2.15.0
:_template-generated: 2024-2-21
:_mod-docs-content-type: PROCEDURE

[id="tackle-llm-secret_{context}"]
= Configuring the model secret key

[role="_abstract"]
You must configure the Kubernetes secret for the large language model (LLM) provider in the {ocp-short} project where you installed the {ProductShortName} operator.

.Procedure

. Create a credentials secret named `kai-api-keys` in the `openshift-mta` project. The environment variables that you define in the secret are injected directly into the pod, so use the variable names that your LLM provider expects. You can use either `kubectl` or the equivalent `oc` commands.
.. For the default IBM GenAI provider `ChatIBMGenAI`, type:
+
[source, terminal]
----
$ kubectl create secret generic kai-api-keys -n openshift-mta \
  --from-literal=GENAI_KEY='<YOUR_IBM_GENAI_KEY>'
----
.. For Google as the provider, type:
+
[source, terminal]
----
$ kubectl create secret generic kai-api-keys -n openshift-mta \
  --from-literal=GEMINI_API_KEY='<YOUR_GOOGLE_API_KEY>'
----
.. For OpenAI-compatible providers, type:
+
[source, terminal]
----
$ kubectl create secret generic kai-api-keys -n openshift-mta \
  --from-literal=OPENAI_API_BASE='https://api.openai.com/v1' \
  --from-literal=OPENAI_API_KEY='<YOUR_OPENAI_KEY>'
----
+
[NOTE]
====
Instead of setting `OPENAI_API_BASE` in the secret, you can set the base URL on the Tackle CR itself with the `kai_llm_baseurl` variable.
====
. Optional: Force a reconcile so that the {ProductShortName} operator picks up the secret immediately:
+
[source, terminal]
----
$ kubectl patch tackle mta -n openshift-mta --type=merge -p \
  '{"metadata":{"annotations":{"konveyor.io/force-reconcile":"'"$(date +%s)"'"}}}'
----
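The patch string in the optional step above splices a Unix timestamp into a JSON annotation through shell quoting. If the quoting looks opaque, you can build and inspect the same patch locally without a cluster; this sketch assumes `python3` is available for JSON validation:

```shell
# Build the force-reconcile patch exactly as the kubectl command does
ts=$(date +%s)
patch='{"metadata":{"annotations":{"konveyor.io/force-reconcile":"'"$ts"'"}}}'
echo "$patch"

# Confirm the result is well-formed JSON before applying it to the cluster
echo "$patch" | python3 -m json.tool > /dev/null && echo "patch is valid JSON"
```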
6 changes: 6 additions & 0 deletions docs/topics/mta-7-installing-web-console-on-openshift.adoc
@@ -48,6 +48,12 @@ To successfully deploy, the {ProductShortName} Operator requires 2 RWO persisten
|`100Gi`
|RWX
|Maven m2 cache; required if the `rwx_supported` configuration option is set to `true`

|`{mta-dl-plugin} database`
|`5Gi`
|RWO
|{mta-dl-plugin} database required to run an AI-driven analysis.
//verify the final d/s name of this resource.
|====

[id="installing-mta-operator-and-ui_{context}"]
1 change: 1 addition & 0 deletions docs/topics/templates/document-attributes.adoc
@@ -28,6 +28,7 @@ ifdef::mta[]
:WebConsoleBookName: {WebNameTitle} Guide
:ProductVersion: 7.3.1
:PluginName: MTA plugin
:mta-dl-plugin: MTA with Developer Lightspeed
// :MavenProductVersion: 7.0.0.GA-redhat-00001
:ProductDistributionVersion: 7.3.1.GA-redhat
:ProductDistribution: mta-7.3.1.GA-cli-offline.zip