
Feat: Perplexity support #42

Closed

joaoGabriel55 wants to merge 18 commits into crmne:main from joaoGabriel55:feat/add-perplexity-provider

Conversation

@joaoGabriel55 joaoGabriel55 commented Mar 18, 2025

Issue

#20

Description

This PR adds support for the Perplexity API.

@joaoGabriel55 (Author)

Please let me know if anything is missing from this first implementation.

Question: How can I run tests locally?

@adenta
Copy link
Copy Markdown

adenta commented Mar 18, 2025

> How do I test?

You should be able to set API keys locally and run rspec to test the application.

@joaoGabriel55 I started a pull request here. Do you want to either tweak what I have, or incorporate my changes into your branch?
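A minimal sketch of what "set API keys locally and run rspec" can look like, assuming a standard Bundler setup; the exact environment variable names depend on each provider's configuration, and the key values here are placeholders:

```shell
# Placeholder keys -- substitute real credentials for the providers
# you want to exercise live; unset providers are skipped.
export OPENAI_API_KEY=sk-placeholder
export PERPLEXITY_API_KEY=pplx-placeholder

# Run the whole suite, or point rspec at a single spec file.
bundle exec rspec
bundle exec rspec spec/ruby_llm/chat_spec.rb
```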

@gquaresma-godaddy

@adenta let me know if there is anything to improve


adenta commented Mar 19, 2025

@gquaresma-godaddy we need tests! Want to take a crack at it?

@gquaresma-godaddy

Sure! I will work on it.

@crmne (Owner) left a comment


Looks very good. It looks like sonar has support for vision, so it would need to be added to the chat_content_spec vision test, and the chat_streaming_spec.

@crmne crmne mentioned this pull request Mar 21, 2025
@crmne crmne added the new provider New provider integration label Mar 21, 2025
@crmne crmne linked an issue Mar 23, 2025 that may be closed by this pull request

crmne commented Mar 23, 2025

Two new features to consider in your provider implementation:

  1. Model Aliases: Please add entries for your provider in aliases.json:

    "claude-3-5-sonnet": {
      "anthropic": "claude-3-5-sonnet-20241022",
      "your-provider": "your-provider-specific-id"
    }
  2. Provider Selection: Users will be able to specify your provider:

    chat = RubyLLM.chat(model: 'claude-3-5-sonnet', provider: 'your-provider')

Docs: https://rubyllm.com/guides/models#using-model-aliases
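To make the alias mechanism concrete, here is a hedged sketch of how an entry in aliases.json might resolve to a provider-specific model id. The `resolve_model` helper and the lookup logic are illustrative stand-ins, not RubyLLM's actual internals; the alias entry mirrors the example above.

```ruby
require 'json'

# Illustrative alias table, shaped like the aliases.json entry above.
ALIASES = JSON.parse(<<~JSON)
  {
    "claude-3-5-sonnet": {
      "anthropic": "claude-3-5-sonnet-20241022",
      "your-provider": "your-provider-specific-id"
    }
  }
JSON

# Resolve a user-facing alias to the provider's model id,
# falling back to the name as given when no alias exists.
def resolve_model(model, provider)
  ALIASES.dig(model, provider.to_s) || model
end

puts resolve_model('claude-3-5-sonnet', :anthropic) # claude-3-5-sonnet-20241022
puts resolve_model('sonar', :perplexity)            # sonar (no alias, passed through)
```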


crmne commented Mar 25, 2025

Added configuration requirements handling in 75f99a1

Each provider now specifies what configuration is required via a simple configuration_requirements method (you will need to implement this in your main provider file) that returns an array of config keys as symbols. The Provider module uses this to:

  1. Determine if a provider is properly configured
  2. Throw an error if you're trying to use that provider without configuration
  3. Include ready-to-paste configuration code in the error message
  4. Skip unconfigured providers during model refresh while preserving their models

Example of the new error messages:

RubyLLM::ConfigurationError: anthropic provider is not configured. Add this to your initialization:

RubyLLM.configure do |config|
  config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
end
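The pattern described above can be sketched roughly as follows. Only the `configuration_requirements` method name comes from the comment; the `Config` struct and `configured?` helper are hypothetical stand-ins for what the Provider module does, and the key value is a placeholder.

```ruby
# A provider declares which config keys it needs as an array of symbols.
module Perplexity
  module_function

  def configuration_requirements
    %i[perplexity_api_key]
  end
end

# Illustrative config object standing in for RubyLLM's configuration.
Config = Struct.new(:perplexity_api_key, keyword_init: true)

# A provider counts as configured when every required key has a value.
def configured?(provider, config)
  provider.configuration_requirements.all? { |key| !config[key].to_s.strip.empty? }
end

puts configured?(Perplexity, Config.new(perplexity_api_key: nil))        # false
puts configured?(Perplexity, Config.new(perplexity_api_key: 'pplx-123')) # true
```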


crmne commented Apr 17, 2025

@joaoGabriel55 is this still on your radar? I'd love to merge Perplexity support soon. Whenever you're ready, could you resolve the conflicts and request a review?

@joaoGabriel55 joaoGabriel55 requested a review from crmne April 17, 2025 12:32
@crmne (Owner) left a comment


Thank you for your work!

It looks generally good, I just need some changes and tests. Please pick the cheapest model to test with.

Review comment threads:

  - lib/ruby_llm/providers/perplexity.rb
  - README.md (outdated)
  - docs/installation.md (outdated)
  - lib/ruby_llm/providers/perplexity/chat.rb (outdated)
  - lib/ruby_llm/providers/perplexity/chat.rb
  - lib/ruby_llm/providers/perplexity/streaming.rb (outdated)
  - spec/spec_helper.rb
@joaoGabriel55 joaoGabriel55 requested a review from crmne April 23, 2025 22:54

crmne commented May 6, 2025

hi @joaoGabriel55 are you able to provide some VCRs for this?

@joaoGabriel55 (Author)

> hi @joaoGabriel55 are you able to provide some VCRs for this?
While running the tests, I got this error:

5) RubyLLM::Chat function calling perplexity/sonar can use tools with multi-turn streaming conversations
     Failure/Error: raise UnsupportedFunctionsError, "Model #{@model.id} doesn't support function calling"
     
     RubyLLM::UnsupportedFunctionsError:
       Model sonar doesn't support function calling
     # ./lib/ruby_llm/chat.rb:50:in 'RubyLLM::Chat#with_tool'
     # ./spec/ruby_llm/chat_tools_spec.rb:110:in 'block (4 levels) in <top (required)>'
     # ./spec/spec_helper.rb:86:in 'block (3 levels) in <top (required)>'
     # /Users/quaresma/.rvm/gems/ruby-3.4.1/gems/vcr-6.3.1/lib/vcr/util/variable_args_block_caller.rb:9:in 'VCR::VariableArgsBlockCaller#call_block'
     # /Users/quaresma/.rvm/gems/ruby-3.4.1/gems/vcr-6.3.1/lib/vcr.rb:194:in 'VCR#use_cassette'
     # ./spec/spec_helper.rb:85:in 'block (2 levels) in <top (required)>'
     # /Users/quaresma/.rvm/gems/ruby-3.4.1/gems/webmock-3.25.1/lib/webmock/rspec.rb:39:in 'block (2 levels) in <top (required)>'

Any ideas about that 😅? Here is the sonar model config from the models.json file:

  {
    "id": "sonar",
    "created_at": null,
    "display_name": "Sonar",
    "provider": "perplexity",
    "context_window": 128000,
    "max_tokens": 4096,
    "type": "chat",
    "family": "sonar",
    "supports_vision": true,
    "supports_functions": false,
    "supports_json_mode": true,
    "input_price_per_million": 1.0,
    "output_price_per_million": 1.0,
    "metadata": {
      "description": "Lightweight offering with search grounding, quicker and cheaper than Sonar Pro."
    }
  },

Comment thread spec/spec_helper.rb (outdated) — suggested change to the model array:

      { provider: :openrouter, model: 'anthropic/claude-3.5-haiku' },
    - { provider: :ollama, model: 'mistral-small3.1' }
    + { provider: :ollama, model: 'mistral-small3.1' },
    + { provider: :perplexity, model: 'gpt-4.1-nano' }
@adenta

@joaoGabriel55 I think if you remove perplexity from this array it shouldn't be considered a CHAT_MODEL and won't cause the spec failures.

The errors occur because the model doesn't support function calling, and right now the tests assume every model does.

@crmne (Owner)

@adenta no. It is a chat model so it stays in there. Plenty of models and providers have quirks that we already skip. Skipping means that when they fix their quirks we simply remove the skip statement. On top of that if you now have this specific model out of the CHAT_MODELS array, you need to duplicate all the tests only for Perplexity.


crmne commented Jun 2, 2025

Hi @joaoGabriel55 simply skip the test in that case. There are plenty of skipped tests examples because of quirks with specific models.
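A hedged sketch of what such a skip can look like. The project itself uses RSpec; minitest is used here only so the sketch is self-contained, and the CHAT_MODELS entries and capability flags are illustrative, not the project's actual list.

```ruby
require 'minitest/autorun'

# Illustrative model list mirroring the supports_functions flag
# from models.json; entries are examples, not the real array.
CHAT_MODELS = [
  { provider: :perplexity, model: 'sonar', supports_functions: false },
  { provider: :openai, model: 'gpt-4.1-nano', supports_functions: true }
].freeze

class ChatToolsTest < Minitest::Test
  CHAT_MODELS.each do |cfg|
    define_method("test_#{cfg[:provider]}_can_use_tools") do
      # Skip instead of failing for models that lack function calling.
      skip "#{cfg[:model]} doesn't support function calling" unless cfg[:supports_functions]
      assert true # real tool-calling assertions would replace this
    end
  end
end
```

When Perplexity eventually supports function calling, flipping the capability flag (or removing the skip) re-enables the shared tests without duplicating them.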

@crmne
Copy link
Copy Markdown
Owner

crmne commented Jun 10, 2025

Hey @joaoGabriel55 I feel like we're really close to the finish line here! There are only some failing tests (please make sure you have run overcommit --install before committing) and some merge conflicts.

Really looking forward to having this ship with 1.4!


crmne commented Jul 31, 2025

went ahead and implemented perplexity myself: 6f2e2a1

@crmne crmne closed this Jul 31, 2025

Labels: new provider (New provider integration)

Linked issue that may be closed by this pull request: Perplexity support

4 participants