
Commit 6e2a1b3

Add cometapi integration (#363)
* Add google-genai integration
* current README should be instead placed as a comet-api.md file
* Fix formatting of CometAPIChatGenerator reference in usage section
1 parent ec8541d commit 6e2a1b3

File tree

2 files changed, +98 -0 lines


integrations/comet-api.md

Lines changed: 98 additions & 0 deletions
@@ -0,0 +1,98 @@
---
layout: integration
name: Comet API
description: Use the Comet API for text generation models.
authors:
  - name: deepset
    socials:
      github: deepset-ai
      twitter: deepset_ai
      linkedin: https://www.linkedin.com/company/deepset-ai/
  - name: Gary Badwal
    socials:
      website: garybadwal.com
      github: garybadwal
      twitter: garybadwal_
      linkedin: https://www.linkedin.com/in/garybadwal/
pypi: https://pypi.org/project/cometapi-haystack
repo: https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/cometapi
type: Model Provider
report_issue: https://github.com/deepset-ai/haystack-core-integrations/issues
logo: /logos/cometapi.png
version: Haystack 2.0
toc: true
---

### **Table of Contents**
- [Overview](#overview)
- [Installation](#installation)
- [Usage](#usage)
- [License](#license)

## Overview

`CometAPIChatGenerator` lets you call any of the LLMs available on [Comet API](https://cometapi.com), including:

- OpenAI variants such as `gpt-5`
- Anthropic’s `claude-4.5-haiku`
- Community-hosted open-source models (Llama 2, Mixtral, etc.)

For more information on the models available via Comet API, see [the Comet API docs](https://www.cometapi.com/model/).

To follow along with this guide, you'll need a Comet API key. Add it as an environment variable named `COMET_API_KEY`.

## Installation

```bash
pip install cometapi-haystack
```

## Usage
You can use `CometAPIChatGenerator` as a standalone component, within a [pipeline](https://docs.haystack.deepset.ai/docs/pipelines), or with the [Agent component](https://docs.haystack.deepset.ai/docs/agent).

Here's an example of using it as a standalone component:

```python
import os
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.cometapi import CometAPIChatGenerator

os.environ["COMET_API_KEY"] = "YOUR_COMET_API_KEY"

client = CometAPIChatGenerator()  # defaults to gpt-4o-mini
response = client.run(
    [ChatMessage.from_user("What are Agentic Pipelines? Be brief.")]
)
print(response["replies"])
```
```bash
{'replies': [ChatMessage(_role=<ChatRole.ASSISTANT: 'assistant'>, _content=[TextContent(text='Agentic Pipelines refer to processes or frameworks that enable individuals or groups to take proactive control over their learning, decision-making, or actions in a systematic way. They emphasize agency, allowing participants to navigate pathways that reflect their interests, goals, and capabilities, often leveraging technology and resources to facilitate this empowerment. In various contexts, such as education or organizational development, Agentic Pipelines can foster greater engagement, autonomy, and outcomes.')], _name=None, _meta={'model': 'gpt-4o-mini-2024-07-18', 'index': 0, 'finish_reason': 'stop', 'usage': {'completion_tokens': 87, 'prompt_tokens': 17, 'total_tokens': 104, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}})]}
```
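
Each entry in `replies` is a `ChatMessage`. If you only want the generated text, you can read it off the first reply — a minimal sketch continuing from the snippet above, assuming the standard Haystack 2.x `ChatMessage.text` property:

```python
# Continuing from the example above: print only the assistant's text,
# without the role and metadata shown in the full repr.
print(response["replies"][0].text)
```
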

`CometAPIChatGenerator` also supports streaming responses if you pass a streaming callback:

```python
import os
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.cometapi import CometAPIChatGenerator

os.environ["COMET_API_KEY"] = "YOUR_COMET_API_KEY"

def show(chunk):  # simple streaming callback
    print(chunk.content, end="", flush=True)

client = CometAPIChatGenerator(
    model="grok-3-mini",
    streaming_callback=show,
)

response = client.run([ChatMessage.from_user("Summarize RAG in two lines.")])

print(response)
```
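
You can also drop `CometAPIChatGenerator` into a pipeline like any other chat generator. The following is a minimal sketch that wires it to a `ChatPromptBuilder`; it assumes the standard Haystack 2.x `Pipeline` and `ChatPromptBuilder` APIs, with nothing specific to this integration beyond the generator itself:

```python
import os

from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.cometapi import CometAPIChatGenerator

os.environ["COMET_API_KEY"] = "YOUR_COMET_API_KEY"

# Build the chat prompt from a Jinja template, then send it to the generator.
pipeline = Pipeline()
pipeline.add_component(
    "prompt_builder",
    ChatPromptBuilder(template=[ChatMessage.from_user("Explain {{ topic }} in one sentence.")]),
)
pipeline.add_component("llm", CometAPIChatGenerator())  # defaults to gpt-4o-mini
pipeline.connect("prompt_builder.prompt", "llm.messages")

result = pipeline.run({"prompt_builder": {"topic": "retrieval-augmented generation"}})
print(result["llm"]["replies"][0].text)
```

Similarly, here is a rough sketch of handing the generator to Haystack's `Agent` component, assuming the `Agent` and `@tool` decorator APIs available in recent Haystack releases; the tool below is a hypothetical toy function for illustration, so check the Agent docs linked above for authoritative usage:

```python
import os

from haystack.components.agents import Agent
from haystack.dataclasses import ChatMessage
from haystack.tools import tool
from haystack_integrations.components.generators.cometapi import CometAPIChatGenerator

os.environ["COMET_API_KEY"] = "YOUR_COMET_API_KEY"

@tool
def get_weather(city: str) -> str:
    """Return a short, canned weather report for the given city."""
    # Hypothetical toy tool used only to illustrate tool calling.
    return f"The weather in {city} is sunny with a light breeze."

agent = Agent(
    chat_generator=CometAPIChatGenerator(),
    tools=[get_weather],
)
agent.warm_up()

result = agent.run(messages=[ChatMessage.from_user("What's the weather like in Berlin?")])
print(result["messages"][-1].text)
```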

## License

`cometapi-haystack` is distributed under the terms of the [Apache-2.0](https://spdx.org/licenses/Apache-2.0.html) license.

logos/cometapi.png

15 KB
