A Retrieval-Augmented Generation (RAG) exploration project built with Ruby on Rails. This project demonstrates how to combine vector search (via Meilisearch & pgvector) with Google Gemini embeddings to build a question-answering system over custom documents.
Traditional LLM flow:
User → Prompt → AI → Answer → User
With RAG:
User → Embedding Model → Search Vector DB → Retrieved Context → AI → Grounded Answer → User
RAG grounds the AI's responses in your own documents, reducing hallucination and giving the model access to private or up-to-date knowledge it wasn't trained on.
Documents (txt/pdf)
│
▼
Gemini Embedding API (gemini-embedding-001, 768 dimensions)
│
▼
Meilisearch (document store + full-text search) ←──── User Question
│ │
└──────────── Top K hits (context) ────────────────────┘
│
▼
Gemini LLM (gemini-2.5-flash-lite)
│
▼
Final Answer
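The pipeline above can be sketched end to end in plain Ruby. Everything here is stubbed for illustration: a toy word-count "embedding" stands in for Gemini, and an in-memory array stands in for the Meilisearch index — none of these names are the project's real API.

```ruby
# End-to-end sketch of the RAG flow, entirely stubbed:
# a toy word-count "embedding" replaces the Gemini Embedding API,
# and an in-memory array replaces the Meilisearch index.
VOCAB = %w[travel days new zealand weather food].freeze
embed = ->(text) { VOCAB.map { |w| text.downcase.scan(/\b#{w}\b/).size.to_f } }

# Cosine similarity, used to rank chunks against the question.
cosine = lambda do |a, b|
  dot = a.zip(b).sum { |x, y| x * y }
  mag = Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x })
  mag.zero? ? 0.0 : dot / mag
end

chunks = [
  "Plan at least ten days to travel around New Zealand.",
  "The weather in the mountains changes quickly.",
  "Local food markets are open on weekends."
]
index = chunks.map { |c| { text: c, vector: embed.call(c) } }

question = "How many days do we need to travel in New Zealand?"
q_vec = embed.call(question)

# Retrieve the top-K chunks, then build a grounded prompt for the LLM.
top_k = index.max_by(2) { |doc| cosine.call(doc[:vector], q_vec) }
context = top_k.map { |doc| doc[:text] }.join("\n")
prompt = "Answer using only this context:\n#{context}\n\nQuestion: #{question}"
puts prompt
```

The real pipeline swaps each stub for its production counterpart (Gemini embeddings, Meilisearch retrieval, Gemini chat) without changing the shape of the flow.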
| Layer | Technology |
|---|---|
| Framework | Ruby on Rails 8.1 |
| LLM | Google Gemini 2.5 Flash Lite |
| Embeddings | Gemini Embedding 001 (768-dim) |
| Search / Retrieval | Meilisearch |
| Vector DB | pgvector (PostgreSQL extension) |
| LLM Abstraction | langchainrb |
| Background Jobs | Sidekiq + Sidekiq Cron |
| Deployment | Kamal (Docker) |
- Ruby 3.x (`cat .ruby-version`)
- PostgreSQL with the `pgvector` extension
- Meilisearch server running locally
- Google Gemini API key
```bash
git clone https://github.com/greatengineer19/RAGburnt.git
cd RAGburnt
bundle install
```

Copy `.env` and fill in your keys:

```bash
cp .env .env.local
```

```
GEMINI_API_KEY=your_gemini_api_key
MEILISEARCH_MASTER_KEY=your_meilisearch_key
DATABASE_URL=postgres://user:password@localhost:5432/rubyrag_development
```

Download and run the Meilisearch binary locally; do not commit it (it's in `.gitignore`):
```bash
# Install via Homebrew (recommended)
brew install meilisearch

# Or download a binary from https://github.com/meilisearch/meilisearch/releases
# Then run:
meilisearch --master-key your_meilisearch_key
```

Set up the database:

```bash
rails db:create db:migrate
```

Start the app:

```bash
bin/dev
```

The main RAG demo is in `script/meilisearch_rag_impl.rb`. It:
- Reads `gt_new_zealand.txt` (a travel guide document)
- Splits it into chunks
- Generates Gemini embeddings for each chunk
- Indexes them into Meilisearch
- Searches for the most relevant chunks to a question
- Sends context + question to Gemini for a grounded answer
Run it with:

```bash
rails runner script/meilisearch_rag_impl.rb
```

Example question: "How many days do we need to travel in New Zealand?"
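The "splits it into chunks" step can look roughly like this fixed-size chunker with overlap — a sketch only; the actual script may split on different boundaries (paragraphs, sentences), and the sizes here are illustrative:

```ruby
# Naive fixed-size chunker with overlap, standing in for the
# "splits it into chunks" step. Chunk size and overlap are
# illustrative, not the script's real settings.
def chunk_text(text, size: 500, overlap: 50)
  chunks = []
  start = 0
  while start < text.length
    chunks << text[start, size]
    start += size - overlap
  end
  chunks
end

sample = "New Zealand has two main islands. " * 40
chunks = chunk_text(sample)
puts "#{chunks.size} chunks, first is #{chunks.first.length} chars"
```

Overlap keeps a sentence that straddles a chunk boundary retrievable from at least one chunk.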
├── app/
│ ├── services/
│ │ └── gemini_service.rb # Gemini embedding + chat wrapper
│ └── ...
├── config/
│ └── initializers/
│ └── gemini.rb # Model name constants
├── script/
│ └── meilisearch_rag_impl.rb # Main RAG pipeline script
├── gt_new_zealand.txt # Sample document (New Zealand travel guide)
└── temp.rb # pgvector + langchainrb exploration notes
- `embed(text)` — Calls the Gemini Embedding API directly (768-dim vectors)
- `answer(query:, hits:)` — Builds context from the search hits and queries the Gemini LLM
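A minimal sketch of what such a wrapper can look like, using the public Gemini REST endpoints over `Net::HTTP`. Treat the request/response shapes as assumptions to verify against the API docs; `GeminiSketch` and `build_prompt` are illustrative names, not the project's real code:

```ruby
require "net/http"
require "json"

# Sketch of a GeminiService-style wrapper. Endpoint paths and JSON
# shapes follow the public Gemini REST API, but verify the details
# (parameter names, response keys) against the official docs.
class GeminiSketch
  BASE = "https://generativelanguage.googleapis.com/v1beta/models"

  def initialize(api_key: ENV.fetch("GEMINI_API_KEY", "test"))
    @api_key = api_key
  end

  # embed(text): request a 768-dim vector from gemini-embedding-001.
  def embed(text)
    body = {
      model: "models/gemini-embedding-001",
      content: { parts: [{ text: text }] },
      outputDimensionality: 768
    }
    post("gemini-embedding-001:embedContent", body).dig("embedding", "values")
  end

  # answer(query:, hits:): stuff the retrieved hits into the prompt,
  # then ask the chat model for a grounded answer.
  def answer(query:, hits:)
    body = { contents: [{ parts: [{ text: build_prompt(query: query, hits: hits) }] }] }
    post("gemini-2.5-flash-lite:generateContent", body)
      .dig("candidates", 0, "content", "parts", 0, "text")
  end

  # Pure helper: turn search hits into a grounded prompt.
  def build_prompt(query:, hits:)
    context = hits.map { |h| h["content"] }.join("\n---\n")
    "Answer using only the context below.\n\nContext:\n#{context}\n\nQuestion: #{query}"
  end

  private

  def post(path, body)
    uri = URI("#{BASE}/#{path}?key=#{@api_key}")
    res = Net::HTTP.post(uri, body.to_json, "Content-Type" => "application/json")
    JSON.parse(res.body)
  end
end
```

Keeping prompt construction in a pure method like `build_prompt` makes the grounding logic testable without network calls.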
- Connects to local Meilisearch instance
- Indexes documents with embeddings
- Performs full-text search and returns top-K hits
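Sketched with the meilisearch-ruby gem, these steps might look like the following. The index name and document fields are illustrative, and `to_documents` / `search_chunks` are hypothetical helpers, not the script's actual code:

```ruby
# Sketch of the Meilisearch side, assuming the meilisearch-ruby gem
# and a local server on the default port. Newer gem versions expose
# the module as Meilisearch rather than MeiliSearch.
begin
  require "meilisearch"
rescue LoadError
  # Gem not installed; the pure helper below still works.
end

# Turn (chunk, embedding) pairs into Meilisearch documents.
def to_documents(chunks, embeddings)
  chunks.each_with_index.map do |text, i|
    { "id" => i, "content" => text, "embedding" => embeddings[i] }
  end
end

# Index the documents, then run a top-K full-text search.
def search_chunks(question, documents, k: 3)
  client = MeiliSearch::Client.new(
    "http://localhost:7700",
    ENV["MEILISEARCH_MASTER_KEY"]
  )
  index = client.index("rag_chunks")
  # Note: indexing is asynchronous in Meilisearch; in practice,
  # wait for the indexing task to finish before searching.
  index.add_documents(documents)
  index.search(question, limit: k)["hits"]
end

docs = to_documents(["Plan ten days.", "Pack for rain."], [[0.1], [0.2]])
# search_chunks("How many days?", docs)  # requires a running server
```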