
New feature: LLM Selection by the user #1

Open
1hemmem wants to merge 5 commits into mahdjourOussama:master from 1hemmem:master

Conversation

@1hemmem 1hemmem commented Feb 8, 2025

In this pull request I have added the ability to choose which Large Language Model will be used across the whole workflow.
I have added 2 more LLMs, "deepseek-r1:1.5b" and "llama3.2:1b", as a starting point and tested them.

This is how the user selects the model in the UI:

[screenshot: 2025-02-07-224713_hyprshot]

And here is the result when I tested it with deepseek-r1:1.5b (the tag is special to deepseek-r1 models):

[screenshot: 2025-02-07-224652_hyprshot]

Owner

@mahdjourOussama mahdjourOussama left a comment


Good job! I will test it later and I will add the safeguards for the API.
I have also added some comments that you can take into consideration for future PRs.

Owner


@1hemmem it's not good practice to push the env file, even if the secrets are not that useful.

Owner


The best practice is to ignore the env file, because further integrations will require API keys that we may include there.
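A minimal sketch of the suggested setup, assuming the secrets live in a file named `.env` at the repo root (filenames assumed, not taken from the PR):

```gitignore
# Keep secrets out of version control
.env

# Ship a committed template with placeholder values instead
!.env.example
```

A committed `.env.example` lets contributors see which keys are expected without ever exposing real values.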


 # Define a function to answer queries
-def answer_question(question, docs) -> str:
+def answer_question(question, docs, model) -> str:
Owner


Good approach, but this will raise an error if the model value is empty or not supported. You need to ensure that the provided model is supported and that it has a default value.
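The reviewer's suggestion could be sketched with an allow-list and a fallback default. The model names come from the PR; the helper name `resolve_model` and the choice of default are assumptions, not part of the actual codebase:

```python
from typing import Optional

# Models wired into the workflow in this PR
SUPPORTED_MODELS = {"deepseek-r1:1.5b", "llama3.2:1b"}
DEFAULT_MODEL = "llama3.2:1b"  # assumed default, pick whichever fits

def resolve_model(model: Optional[str]) -> str:
    """Return a supported model name, falling back to the default.

    An empty or unknown value is replaced here, rather than raising
    later inside the LLM call.
    """
    if not model or model not in SUPPORTED_MODELS:
        return DEFAULT_MODEL
    return model
```

`answer_question` could then call `model = resolve_model(model)` first, so a missing or misspelled selection degrades gracefully instead of failing mid-request.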

"""Send a conversation to the AI model and return the response."""
try:
-        query, docs = request.question, request.docs
+        query, docs, model = request.question, request.docs, request.model
Owner


Same issue here.
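The same guard can also live on the request object itself, so `request.model` is never empty by the time the handler reads it. A minimal sketch using a dataclass (the real project may well use a Pydantic model instead; the field names follow the diff above, the default is an assumption):

```python
from dataclasses import dataclass, field

DEFAULT_MODEL = "llama3.2:1b"  # assumed default from the PR's model list

@dataclass
class QueryRequest:
    question: str
    docs: list = field(default_factory=list)
    # Optional field with a safe default, so omitting it is harmless
    model: str = DEFAULT_MODEL
```

With a framework like FastAPI, declaring the default on the request schema means clients that never send `model` keep working unchanged after this PR.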

