
Commit 52b7af4

New CI workflow and strategy for notebook testing (#56)
1 parent ad7a626 commit 52b7af4

28 files changed, +8651 -7204 lines

.github/ignore-notebooks.txt

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+01_crewai_langgraph_redis
+doc2cache_llama3_1
+semantic_caching_gemini
+01_collaborative_filtering
+05_nvidia_ai_rag_redis

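Note: the workflows below match each entry in this skip-list as a substring of the full notebook path, so one entry excludes a notebook wherever it sits under python-recipes. A minimal sketch of that check (the path shown is hypothetical; real candidates come from `find python-recipes -name '*.ipynb'`):

    # Hypothetical notebook path, used only for illustration
    nb="python-recipes/recommendation-systems/01_collaborative_filtering.ipynb"
    ignore_nb="01_collaborative_filtering"

    # Same partial-match test used by both workflows
    if [[ "$nb" == *"$ignore_nb"* ]]; then
      echo "skipping $nb"
    fi
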
.github/workflows/nightly-test.yml

Lines changed: 103 additions & 0 deletions
@@ -0,0 +1,103 @@
+name: Tests - Nightly Run
+
+on:
+  schedule:
+    - cron: "0 3 * * *" # 3 AM UTC nightly
+  workflow_dispatch:
+
+env:
+  PYTHON_VERSION: "3.11"
+
+jobs:
+  # ---------------------------------------------------------
+  # 1) Gather all notebooks (except skip-list)
+  # ---------------------------------------------------------
+  gather_all_notebooks:
+    runs-on: ubuntu-latest
+    outputs:
+      notebooks: ${{ steps.get_nbs.outputs.notebooks }}
+    steps:
+      - uses: actions/checkout@v2
+
+      - id: get_nbs
+        run: |
+          # 1) Read ignore patterns from .github/ignore-notebooks.txt
+          IGNORE_LIST=()
+          while IFS= read -r skip_nb || [ -n "$skip_nb" ]; do
+            # Skip empty lines or comment lines
+            [[ -z "$skip_nb" || "$skip_nb" =~ ^# ]] && continue
+            IGNORE_LIST+=("$skip_nb")
+          done < .github/ignore-notebooks.txt
+
+          # 2) Find all .ipynb in python-recipes (or your path)
+          NBS=$(find python-recipes -name '*.ipynb')
+
+          # 3) Filter out notebooks that match anything in IGNORE_LIST
+          FILTERED_NBS=()
+          for nb in $NBS; do
+            skip=false
+            for ignore_nb in "${IGNORE_LIST[@]}"; do
+              if [[ "$nb" == *"$ignore_nb"* ]]; then
+                skip=true
+                break
+              fi
+            done
+
+            if [ "$skip" = false ]; then
+              FILTERED_NBS+=("$nb")
+            fi
+          done
+
+          # 4) Convert the final array to compact JSON for GitHub Actions
+          NB_JSON=$(printf '%s\n' "${FILTERED_NBS[@]}" \
+            | jq -R . \
+            | jq -s -c .)
+
+          # 5) Default to an empty array if there's nothing left
+          if [ -z "$NB_JSON" ] || [ "$NB_JSON" = "[]" ]; then
+            NB_JSON="[]"
+          fi
+
+          echo "All valid notebooks: $NB_JSON"
+          echo "notebooks=$NB_JSON" >> $GITHUB_OUTPUT
+
+  # ---------------------------------------------------------
+  # 2) Test all notebooks in parallel
+  # ---------------------------------------------------------
+  test_all_notebooks:
+    needs: gather_all_notebooks
+    runs-on: ubuntu-latest
+    strategy:
+      fail-fast: false
+      matrix:
+        notebook: ${{ fromJson(needs.gather_all_notebooks.outputs.notebooks) }}
+
+    services:
+      redis:
+        image: redis/redis-stack-server:latest
+        ports:
+          - 6379:6379
+
+    steps:
+      - uses: actions/checkout@v2
+
+      # Setup Python
+      - uses: actions/setup-python@v4
+        with:
+          python-version: ${{ env.PYTHON_VERSION }}
+
+      - name: Create and activate venv
+        run: |
+          python -m venv venv
+          source venv/bin/activate
+          pip install --upgrade pip setuptools wheel
+          pip install pytest nbval
+
+      - name: Test notebook
+        env:
+          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
+          COHERE_API_KEY: ${{ secrets.COHERE_API_KEY }}
+        run: |
+          echo "Testing notebook: ${{ matrix.notebook }}"
+          source venv/bin/activate
+          pytest --nbval-lax --disable-warnings "${{ matrix.notebook }}"

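To reproduce a single matrix job from this nightly workflow on a laptop, you can start the same Redis Stack image and run nbval against one notebook. A rough sketch, assuming Docker and Python 3.11 are available locally; the notebook path and API key value are placeholders, not part of the workflow:

    # Start the same Redis service the job exposes on localhost:6379
    docker run -d --name redis-nbtest -p 6379:6379 redis/redis-stack-server:latest

    # Mirror the job's virtual environment setup
    python -m venv venv
    source venv/bin/activate
    pip install --upgrade pip setuptools wheel
    pip install pytest nbval

    # Export whichever secrets the chosen notebook needs (placeholder value shown)
    export OPENAI_API_KEY="sk-..."

    # Run one notebook the same way the matrix job does (illustrative path)
    pytest --nbval-lax --disable-warnings "python-recipes/vector-search/00_redisvl.ipynb"
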
.github/workflows/test.yml

Lines changed: 93 additions & 38 deletions
@@ -1,55 +1,110 @@
-name: Test Suite
+name: Tests - PR/Push
 
 on:
-  pull_request:
-    branches:
-      - main
   push:
-    branches:
-      - main
-  schedule:
-    - cron: '0 0 * * 1' # Runs every Monday
+    branches: [ main ]
+  pull_request:
+    branches: [ main ]
+
+env:
+  PYTHON_VERSION: "3.11"
 
 jobs:
-  test:
-    name: Python ${{ matrix.python-version }} - ${{ matrix.connection }} [redis-stack ${{matrix.redis-stack-version}}]
+  # ---------------------------------------------------------
+  # 1) Gather the changed notebooks to produce a matrix list
+  # ---------------------------------------------------------
+  gather_notebooks:
     runs-on: ubuntu-latest
+    outputs:
+      notebooks: ${{ steps.get_nbs.outputs.notebooks }}
+    steps:
+      - uses: actions/checkout@v2
+
+      - id: get_nbs
+        run: |
+          # Compare this commit/PR to 'main' and list changed .ipynb files
+          git fetch --depth=1 origin main
+          CHANGED_NOTEBOOKS=$(git diff --name-only origin/main | grep '\.ipynb$' || true)
+
+          # 1) Read ignore patterns from .github/ignore-notebooks.txt
+          IGNORE_LIST=()
+          while IFS= read -r skip_nb || [ -n "$skip_nb" ]; do
+            # Skip empty lines or comment lines
+            [[ -z "$skip_nb" || "$skip_nb" =~ ^# ]] && continue
+            IGNORE_LIST+=("$skip_nb")
+          done < .github/ignore-notebooks.txt
+
+          # 2) Filter out notebooks in CHANGED_NOTEBOOKS that match ignore patterns
+          FILTERED_NBS=()
+          for nb in $CHANGED_NOTEBOOKS; do
+            skip=false
+
+            # Check if in ignore list
+            for ignore_nb in "${IGNORE_LIST[@]}"; do
+              # Partial match:
+              if [[ "$nb" == *"$ignore_nb"* ]]; then
+                skip=true
+                break
+              fi
+            done
+
+            if [ "$skip" = false ]; then
+              FILTERED_NBS+=("$nb")
+            fi
+          done
+
+          # 3) Build a single-line JSON array
+          NB_JSON=$(printf '%s\n' "${FILTERED_NBS[@]}" \
+            | jq -R . \
+            | jq -s -c .)
+
+          # 4) Fallback to an empty array if there's nothing left
+          if [ -z "$NB_JSON" ] || [ "$NB_JSON" = "[]" ]; then
+            NB_JSON="[]"
+          fi
+
+          echo "All valid notebooks: $NB_JSON"
+
+          # 5) Write to $GITHUB_OUTPUT (modern approach instead of ::set-output)
+          echo "notebooks=$NB_JSON" >> $GITHUB_OUTPUT
 
+  # ---------------------------------------------------------
+  # 2) Test each changed notebook in parallel
+  # ---------------------------------------------------------
+  test_notebooks:
+    needs: gather_notebooks
+    runs-on: ubuntu-latest
     strategy:
       fail-fast: false
       matrix:
-        python-version: [3.11]
-        connection: ['plain']
-        redis-stack-version: ['latest']
+        notebook: ${{ fromJson(needs.gather_notebooks.outputs.notebooks) }}
 
     services:
       redis:
-        image: redis/redis-stack-server:${{matrix.redis-stack-version}}
+        image: redis/redis-stack-server:latest
         ports:
           - 6379:6379
 
     steps:
-      - uses: actions/checkout@v2
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
-        with:
-          python-version: ${{ matrix.python-version }}
-          cache: 'pip'
-
-      - name: Install dependencies
-        run: |
-          pip install --no-cache-dir -r requirements.txt
-
-      - name: Set Redis version
-        run: |
-          echo "REDIS_VERSION=${{ matrix.redis-stack-version }}" >> $GITHUB_ENV
-
-      - name: Run notebooks
-        if: matrix.connection == 'plain' && matrix.redis-stack-version == 'latest'
-        env:
-          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
-          LLAMA_CLOUD_API_KEY: ${{ secrets.LLAMA_CLOUD_API_KEY }}
-          GCP_REGION: ${{ secrets.GCP_REGION }}
-          GCP_PROJECT_ID: ${{ secrets.GCP_PROJECT_ID }}
-        run: |
-          pytest --verbose --nbval-lax python-recipes/RAG/ python-recipes/vector-search python-recipes/redis-intro python-recipes/recommendation-systems python-recipes/agents python-recipes/computer-vision --ignore python-recipes/agents/01_crewai_langgraph_redis.ipynb --ignore python-recipes/RAG/05_nvidia_ai_rag_redis.ipynb --ignore python-recipes/semantic-cache/doc2cache_llama3_1.ipynb
+      - uses: actions/checkout@v2
+
+      # Setup Python
+      - uses: actions/setup-python@v4
+        with:
+          python-version: ${{ env.PYTHON_VERSION }}
+
+      - name: Create and activate venv
+        run: |
+          python -m venv venv
+          source venv/bin/activate
+          pip install --upgrade pip setuptools wheel
+          pip install pytest nbval
+
+      - name: Test notebook
+        env:
+          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
+          COHERE_API_KEY: ${{ secrets.COHERE_API_KEY }}
+        run: |
+          echo "Testing notebook: ${{ matrix.notebook }}"
+          source venv/bin/activate
+          pytest --nbval-lax --disable-warnings "${{ matrix.notebook }}"

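Since the PR workflow only fans out over notebooks that differ from main, it can help to preview that matrix before pushing. A sketch of the same detection run locally from the repository root, assuming jq is installed and origin/main is reachable:

    # List notebooks changed relative to main, as the gather_notebooks step does
    git fetch --depth=1 origin main
    CHANGED_NOTEBOOKS=$(git diff --name-only origin/main | grep '\.ipynb$' || true)

    # Drop anything matching an entry in .github/ignore-notebooks.txt
    FILTERED=()
    for nb in $CHANGED_NOTEBOOKS; do
      skip=false
      while IFS= read -r ignore_nb || [ -n "$ignore_nb" ]; do
        [[ -z "$ignore_nb" || "$ignore_nb" =~ ^# ]] && continue
        [[ "$nb" == *"$ignore_nb"* ]] && skip=true && break
      done < .github/ignore-notebooks.txt
      [ "$skip" = false ] && FILTERED+=("$nb")
    done

    # Print the compact JSON array the workflow would hand to the matrix
    if [ ${#FILTERED[@]} -gt 0 ]; then
      printf '%s\n' "${FILTERED[@]}" | jq -R . | jq -s -c .
    else
      echo "[]"
    fi
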
README.md

Lines changed: 7 additions & 22 deletions
@@ -128,28 +128,13 @@ Redis integrates with many different players in the AI ecosystem. Here's a curat
 | [DocArray](https://docs.docarray.org/user_guide/storing/index_redis/) | DocArray Integration of Redis as a VectorDB by Jina AI |
 
 
-## Content
-- [Vector Similarity Search: From Basics to Production](https://mlops.community/vector-similarity-search-from-basics-to-production/) - Introductory blog post to VSS and Redis as a VectorDB.
-- [Improving RAG quality with RAGAs](https://redis.io/blog/get-better-rag-responses-with-ragas/)
-- [Level-up RAG with RedisVL](https://redis.io/blog/level-up-rag-apps-with-redis-vector-library/)
+# Other Helpful Resources
+
 - [Vector Databases and Large Language Models](https://youtu.be/GJDN8u3Y-T4) - Talk given at LLMs in Production Part 1 by Sam Partee.
+- [Level-up RAG with RedisVL](https://redis.io/blog/level-up-rag-apps-with-redis-vector-library/)
+- [Improving RAG quality with RAGAs](https://redis.io/blog/get-better-rag-responses-with-ragas/)
 - [Vector Databases and AI-powered Search Talk](https://www.youtube.com/watch?v=g2bNHLeKlAg) - Video "Vector Databases and AI-powered Search" given by Sam Partee at SDSC 2023.
-- [Real-Time Product Recommendations](https://jina.ai/news/real-time-product-recommendation-using-redis-and-docarray/) - Content-based recsys design with Redis and DocArray.
-- [NVIDIA Recsys with Redis](https://developer.nvidia.com/blog/offline-to-online-feature-storage-for-real-time-recommendation-systems-with-nvidia-merlin/)
-- [LabLab AI Redis Tech Page](https://lablab.ai/tech/redis)
-- [Storing and querying for embeddings with Redis](https://blog.baeke.info/2023/03/21/storing-and-querying-for-embeddings-with-redis/)
-- [Building Intelligent Apps with Redis Vector Similarity Search](https://redis.com/blog/build-intelligent-apps-redis-vector-similarity-search/)
-- [RedisDays Trading Signals](https://www.youtube.com/watch?v=_Lrbesg4DhY) - Video "Using AI to Reveal Trading Signals Buried in Corporate Filings".
-
-## Benchmarks
+- [NVIDIA RecSys with Redis](https://developer.nvidia.com/blog/offline-to-online-feature-storage-for-real-time-recommendation-systems-with-nvidia-merlin/)
 - [Benchmarking results for vector databases](https://redis.io/blog/benchmarking-results-for-vector-databases/) - Benchmarking results for vector databases, including Redis and 7 other Vector Database players.
-- [ANN Benchmarks](https://ann-benchmarks.com) - Standard ANN Benchmarks site. *Only using single Redis OSS instance/client.*
-
-## Docs
-- [Redis Vector Library Docs](https://redisvl.com)
-- [Redis Vector Database QuickStart](https://redis.io/docs/get-started/vector-database/)
-- [Redis Vector Similarity Docs](https://redis.io/docs/interact/search-and-query/advanced-concepts/vectors/) - Official Redis literature for Vector Similarity Search.
-- [Redis-py Search Docs](https://redis.readthedocs.io/en/latest/redismodules.html#redisearch-commands) - Redis-py client library docs for RediSearch.
-- [Redis-py General Docs](https://redis.readthedocs.io/en/latest/) - Redis-py client library documentation.
-- [Redis Stack](https://redis.io/docs/stack/) - Redis Stack documentation.
-- [Redis Clients](https://redis.io/docs/clients/) - Redis client list.
+- [Redis Vector Library Docs](https://docs.redisvl.com)
+- [Redis Vector Search API Docs](https://redis.io/docs/interact/search-and-query/advanced-concepts/vectors/) - Official Redis literature for Vector Similarity Search.

contributing.md

Lines changed: 0 additions & 11 deletions
@@ -11,17 +11,6 @@ Open a PR with your addition. We expect the following standards:
 3. New additions should be added to the bottom of the list (unless otherwise noted).
 4. New additions should not contain any profanity or offensive language.
 
-### What it takes to get a Star
-
-When reviewing the PR, we will determine whether a new entry gets a star!
-
-Examples that:
-- are well-documented and easy to follow
-- pertain to a new or creative use case
-- follow good coding/writing hygiene
-
-will be considered for getting a special star ⭐.
-
 ## Updating your Pull Request
 
 Sometimes, a maintainer will ask you to edit your Pull Request before it is included. This is normally due to spelling errors or because your PR didn't match the list format.
