
Commit cf94fd7

Add AIMLAPI workflow and update examples
Introduces a new GitHub Actions workflow for AIMLAPI integration testing. Updates example scripts to add noqa comments for print statements and refines message list construction. Modifies AIMLAPIChatGenerator to include an explicit 'openai_endpoint' parameter and clarifies a type ignore comment.
1 parent 73b3e07 commit cf94fd7

4 files changed: +93 −8 lines

.github/workflows/aimlapi.yml

Lines changed: 84 additions & 0 deletions
@@ -0,0 +1,84 @@
+# This workflow comes from https://github.com/ofek/hatch-mypyc
+# https://github.com/ofek/hatch-mypyc/blob/5a198c0ba8660494d02716cfc9d79ce4adfb1442/.github/workflows/test.yml
+name: Test / aimlapi
+
+on:
+  schedule:
+    - cron: "0 0 * * *"
+  pull_request:
+    paths:
+      - "integrations/aimlapi/**"
+      - "!integrations/aimlapi/*.md"
+      - ".github/workflows/aimlapi.yml"
+
+defaults:
+  run:
+    working-directory: integrations/aimlapi
+
+concurrency:
+  group: aimlapi-${{ github.head_ref }}
+  cancel-in-progress: true
+
+env:
+  PYTHONUNBUFFERED: "1"
+  FORCE_COLOR: "1"
+  AIMLAPI_API_KEY: ${{ secrets.AIMLAPI_API_KEY }}
+
+jobs:
+  run:
+    name: Python ${{ matrix.python-version }} on ${{ startsWith(matrix.os, 'macos-') && 'macOS' || startsWith(matrix.os, 'windows-') && 'Windows' || 'Linux' }}
+    runs-on: ${{ matrix.os }}
+    strategy:
+      fail-fast: false
+      matrix:
+        os: [ubuntu-latest, windows-latest, macos-latest]
+        python-version: ["3.9", "3.13"]
+
+    steps:
+      - name: Support longpaths
+        if: matrix.os == 'windows-latest'
+        working-directory: .
+        run: git config --system core.longpaths true
+
+      - uses: actions/checkout@v5
+
+      - name: Set up Python ${{ matrix.python-version }}
+        uses: actions/setup-python@v6
+        with:
+          python-version: ${{ matrix.python-version }}
+
+      - name: Install Hatch
+        run: pip install --upgrade hatch
+
+      - name: Lint
+        if: matrix.python-version == '3.9' && runner.os == 'Linux'
+        run: hatch run fmt-check && hatch run test:types
+
+      - name: Generate docs
+        if: matrix.python-version == '3.9' && runner.os == 'Linux'
+        run: hatch run docs
+
+      - name: Run tests
+        run: hatch run test:cov-retry
+
+      - name: Run unit tests with lowest direct dependencies
+        run: |
+          hatch run uv pip compile pyproject.toml --resolution lowest-direct --output-file requirements_lowest_direct.txt
+          hatch run uv pip install -r requirements_lowest_direct.txt
+          hatch run test:unit
+
+      # Since this integration inherits from OpenAIChatGenerator, we run ALL tests with Haystack main branch to catch regressions
+      - name: Nightly - run tests with Haystack main branch
+        if: github.event_name == 'schedule'
+        run: |
+          hatch env prune
+          hatch run uv pip install git+https://github.com/deepset-ai/haystack.git@main
+          hatch run test:all
+
+      - name: Send event to Datadog for nightly failures
+        if: failure() && github.event_name == 'schedule'
+        uses: ./.github/actions/send_failure
+        with:
+          title: |
+            Core integrations nightly tests failure: ${{ github.workflow }}
+          api-key: ${{ secrets.CORE_DATADOG_API_KEY }}

integrations/aimlapi/examples/aimlapi_basic_example.py

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ def main() -> None:
 
     reply = generator.run(messages=messages)["replies"][0]
 
-    print(f"assistant response: {reply.text}")
+    print(f"assistant response: {reply.text}")  # noqa: T201
 
 
 if __name__ == "__main__":
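
The only functional change in this example is the trailing "# noqa: T201": Ruff's T201 rule (flake8-print) reports bare print calls, and the per-line suppression keeps that check active for library code while still letting an example script write to stdout. A minimal sketch of the behaviour, assuming a Ruff configuration that enables T201 (the snippet is illustrative, not taken from this commit):

print("this call would be reported as T201 by Ruff")
print("this call is explicitly allowed on this line only")  # noqa: T201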

integrations/aimlapi/examples/aimlapi_with_tools_example.py

Lines changed: 6 additions & 6 deletions
@@ -46,7 +46,7 @@ def main() -> None:
         ChatMessage.from_user("What's the weather in Tokyo today?"),
     ]
 
-    print("Requesting a tool call from the model...")
+    print("Requesting a tool call from the model...")  # noqa: T201
     tool_request = client.run(
         messages=messages,
         tools=[weather_tool],
@@ -55,26 +55,26 @@ def main() -> None:
         },
     )["replies"][0]
 
-    print(f"assistant tool request: {tool_request}")
+    print(f"assistant tool request: {tool_request}")  # noqa: T201
 
     if not tool_request.tool_calls:
-        print("No tool call was produced by the model.")
+        print("No tool call was produced by the model.")  # noqa: T201
         return
 
     tool_messages = tool_invoker.run(messages=[tool_request])["tool_messages"]
     for tool_message in tool_messages:
         for tool_result in tool_message.tool_call_results:
-            print(f"tool output: {tool_result.result}")
+            print(f"tool output: {tool_result.result}")  # noqa: T201
 
-    follow_up_messages = messages + [tool_request, *tool_messages]
+    follow_up_messages = [*messages, tool_request, *tool_messages]
 
     final_reply = client.run(
         messages=follow_up_messages,
         tools=[weather_tool],
         generation_kwargs={"tool_choice": "none"},
     )["replies"][0]
 
-    print(f"assistant final answer: {final_reply.text}")
+    print(f"assistant final answer: {final_reply.text}")  # noqa: T201
 
 
 if __name__ == "__main__":
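
The follow_up_messages change is the other notable edit here: unpacking with [*messages, ...] produces the same flat list as concatenating messages + [...], but expresses the whole follow-up conversation as a single list literal instead of building and extending an intermediate list. A small standalone sketch of the equivalence (plain strings stand in for the ChatMessage objects used in the example):

# Hypothetical stand-ins for the ChatMessage objects in the example script.
messages = ["system prompt", "user question"]
tool_request = "assistant tool call"
tool_messages = ["tool result 1", "tool result 2"]

old_style = messages + [tool_request, *tool_messages]   # concatenation via an intermediate list
new_style = [*messages, tool_request, *tool_messages]   # single list literal with unpacking

assert old_style == new_style  # same elements in the same order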

integrations/aimlapi/src/haystack_integrations/components/generators/aimlapi/chat/chat_generator.py

Lines changed: 2 additions & 1 deletion
@@ -189,10 +189,11 @@ def _prepare_api_call(
 
         return {
             "model": self.model,
-            "messages": aimlapi_formatted_messages,
+            "messages": aimlapi_formatted_messages,  # type: ignore[arg-type] # openai expects list of specific message types
            "stream": streaming_callback is not None,
             "n": num_responses,
             **aimlapi_tools,
             "extra_body": {**generation_kwargs},
             "extra_headers": {**extra_headers},
+            "openai_endpoint": "create",
         }
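
For context on how these request kwargs end up being exercised, the example scripts in this commit drive the generator roughly as sketched below. This is an illustrative sketch only: the import path is inferred from the source layout above, the model name and constructor signature are assumptions based on AIMLAPIChatGenerator inheriting from OpenAIChatGenerator, and the API key is read from the AIMLAPI_API_KEY environment variable that the workflow above injects.

import os

from haystack.dataclasses import ChatMessage

# Assumption: the class is re-exported from the integration package; the module itself
# lives at haystack_integrations/components/generators/aimlapi/chat/chat_generator.py.
from haystack_integrations.components.generators.aimlapi import AIMLAPIChatGenerator

os.environ.setdefault("AIMLAPI_API_KEY", "<your-aimlapi-key>")  # placeholder, not a real key

# Assumption: the constructor mirrors OpenAIChatGenerator, so a model name is enough here;
# "<model-name>" is a placeholder for whichever model AIMLAPI exposes.
generator = AIMLAPIChatGenerator(model="<model-name>")

messages = [ChatMessage.from_user("What's the weather in Tokyo today?")]

# run() returns a dict whose "replies" list holds ChatMessage objects, as in the examples above.
reply = generator.run(messages=messages)["replies"][0]
print(f"assistant response: {reply.text}")  # noqa: T201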
