| |Internal Provider|External Provider|
|---|---|---|
|**Description** |A provider that lives directly in the Llama Stack codebase.|A provider that is outside the Llama Stack core codebase but is still accessible and usable by Llama Stack.|
|**Benefits** |Ability to interact with the provider with minimal additional configuration or installation.|Contributors do not have to add directly to the core code to create providers accessible on Llama Stack, keeping provider-specific code separate from the core Llama Stack code.|
|**Which one is best for you?** |It's best to create an internal provider when|You should create an external provider when|
## Inference Provider Patterns
When implementing Inference providers for OpenAI-compatible APIs, Llama Stack provides several mixin classes to simplify development and ensure consistent behavior across providers.
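As a minimal sketch of the mixin pattern described above (not the actual Llama Stack classes — `OpenAICompatRequestMixin` and `ExampleProvider` here are hypothetical names for illustration), shared OpenAI-compatible request-building logic lives in a mixin, while each concrete provider supplies only its own connection details:

```python
class OpenAICompatRequestMixin:
    """Shared helpers for providers that speak an OpenAI-compatible API."""

    def chat_url(self) -> str:
        # self.base_url is supplied by the concrete provider class.
        return f"{self.base_url.rstrip('/')}/chat/completions"

    def build_chat_payload(self, model: str, messages: list[dict]) -> dict:
        # A consistent payload shape across every provider using this mixin.
        return {"model": model, "messages": messages, "stream": False}


class ExampleProvider(OpenAICompatRequestMixin):
    # Hypothetical provider: only endpoint-specific details live here.
    base_url = "https://api.example.com/v1"


provider = ExampleProvider()
print(provider.chat_url())  # → https://api.example.com/v1/chat/completions
```

Because the request-building behavior is inherited rather than reimplemented, every provider built on the mixin produces the same URL and payload shapes, which is what keeps behavior consistent across providers.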