
Conversation

Pkylas007 (Collaborator)

JIRA

Version

  • 8.0.0

Preview

Signed-off-by: Prabha Kylasamiyer Sundara Rajan <[email protected]>
@Pkylas007 force-pushed the mta-5378-enable-solution-server-tackle branch from 24cfeac to cc487ac on August 24, 2025 08:12
:context: solution-server-configurations

[role=_abstract]
Solution server is a component that allows {mta-dl-plugin} to build a collective memory of code changes from all analysis performed in an organization. In the local context of an analysis run in the Visual Studio (VS) Code, the Solution Server augments previous patterns of how codes changed to resolve issues (also called solved examples) that were similar to those in the current file, and suggests a resolution that has a higher confidence level derived from previous solutions.

Suggested change:
- Solution server is a component that allows {mta-dl-plugin} to build a collective memory of code changes from all analysis performed in an organization. In the local context of an analysis run in the Visual Studio (VS) Code, the Solution Server augments previous patterns of how codes changed to resolve issues (also called solved examples) that were similar to those in the current file, and suggests a resolution that has a higher confidence level derived from previous solutions.
+ Solution server is a component that allows {mta-dl-plugin} to build a collective memory of code changes from all analysis performed in an organization. In the local context of an analysis run in the Visual Studio (VS) Code, the Solution Server augments previous patterns of how code changed to resolve issues (also called solved examples) that were similar to those in the current file, and suggests a resolution that has a higher confidence level derived from previous solutions.

Comment on lines +1 to +13
<title>IntelliJ IDEA Plugin Guide</title>
<productname>{DocInfoProductName}</productname>
<productnumber>{DocInfoProductNumber}</productnumber>
<subtitle>Identify and resolve migration issues by analyzing your applications with the {ProductName} plugin for IntelliJ IDEA.</subtitle>
<abstract>
<para>
This guide describes how to use the {ProductName} plugin for IntelliJ IDEA to accelerate large-scale application modernization efforts across hybrid cloud environments on Red Hat OpenShift.
</para>
</abstract>
<authorgroup>
<orgname>Red Hat Customer Content Services</orgname>
</authorgroup>
<xi:include href="Common_Content/Legal_Notice.xml" xmlns:xi="http://www.w3.org/2001/XInclude" />

We will have to do a rewrite.

Can we please look at this together?


.Prerequisites

//the hard link must be changed to the same topic in 8.0.0 that has the `{mta-dl-plugin}-database` req.

We could use attributes for this.

+
[source, terminal]
----
kubectl create secret generic kai-api-keys -n openshift-mta \

Does the oc equivalent work? Just wondering.


Yes.
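
For reference, a minimal sketch of the oc equivalent, assuming the same secret name and namespace as the kubectl snippet above; the literal is a placeholder for whichever environment variable the configured provider expects (see the later comments):

[source, terminal]
----
oc create secret generic kai-api-keys -n openshift-mta \
  --from-literal=<PROVIDER_ENV_VAR>='<PROVIDER_API_KEY>'
----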


@anarnold97 left a comment


Looks really good.

Needs a wee bit of polishing, but nothing that screams out for massive changes.

:context: solution-server-configurations

[role=_abstract]
Solution server is a component that allows {mta-dl-plugin} to build a collective memory of code changes from all analysis performed in an organization. In the local context of an analysis run in the Visual Studio (VS) Code, the Solution Server augments previous patterns of how codes changed to resolve issues (also called solved examples) that were similar to those in the current file, and suggests a resolution that has a higher confidence level derived from previous solutions.


Does it make sense to expand a bit about the Solution Server?

The Solution Server delivers two primary benefits to users:

- **Contextual Hints:** It surfaces examples of past migration solutions—including successful user modifications and accepted fixes—offering actionable hints for difficult or previously unsolved migration problems.
- **Migration Success Metrics:** It exposes detailed success metrics for each migration rule, derived from real-world usage data. These metrics can be used by IDEs or automation tools to present users with a “confidence level” or likelihood of MTA lightspeed successfully migrating a given code segment.


@savitharaghunathan - thanks... that looks cool enough

[source, terminal]
----
kubectl create secret generic kai-api-keys -n openshift-mta \
--from-literal=genai_key='<YOUR_IBM_GENAI_KEY>'


This key is outdated. What should go in the literal here are whatever environment variables the provider you have configured expects. For example, if you're using OpenAI models, they expect an environment variable named OPENAI_API_KEY, so you would create the secret with --from-literal=OPENAI_API_KEY='<your openai key>'. Whatever environment variables you configure in the secret like that are injected directly into the pod.
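
To illustrate, a minimal sketch of that secret for an OpenAI-backed configuration, reusing the secret name and namespace from the snippet under review (the key value is a placeholder):

[source, terminal]
----
kubectl create secret generic kai-api-keys -n openshift-mta \
  --from-literal=OPENAI_API_KEY='<YOUR_OPENAI_API_KEY>'
----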

[source, terminal]
----
kubectl create secret generic kai-api-keys -n openshift-mta \
--from-literal=google_key='<YOUR_GOOGLE_API_KEY>'


GEMINI_API_KEY for Google, I think.
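
If so, the Google example would become something like this sketch (placeholder value, same secret name and namespace as above):

[source, terminal]
----
kubectl create secret generic kai-api-keys -n openshift-mta \
  --from-literal=GEMINI_API_KEY='<YOUR_GOOGLE_API_KEY>'
----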

----
kubectl create secret generic kai-api-keys -n openshift-mta \
--from-literal=api_base='https://api.openai.com/v1' \
--from-literal=api_key='<YOUR_OPENAI_KEY>'


OPENAI_API_KEY

[source, terminal]
----
kubectl create secret generic kai-api-keys -n openshift-mta \
--from-literal=api_base='https://api.openai.com/v1' \


OPENAI_API_BASE (or it can be set on the Tackle CR itself with the kai_llm_baseurl variable, which I think would be the recommended way).
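
A rough sketch of the Tackle CR route via kubectl patch; the CR name (tackle) and the placement of kai_llm_baseurl directly under spec are assumptions here, not taken from this PR:

[source, terminal]
----
kubectl patch tackle tackle -n openshift-mta --type merge \
  -p '{"spec":{"kai_llm_baseurl":"https://api.openai.com/v1"}}'
----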

@Pkylas007 requested a review from sshveta on September 1, 2025 13:37