One CLI to search every UK property portal.
Your AI agent can now find properties, track prices, and analyze the market across Rightmove, Zoopla, ESPC, and more — with clean commands and normalized JSON output.
Why • Quick Start • Portals • Use Cases • Deduplication • Smart Search
If you're building AI agents for property search, there's a gap: UK property portals offer zero APIs.
Rightmove, Zoopla, OnTheMarket — none of them provide developer APIs. No OAuth, no REST endpoints, no webhooks. If you want your agent to search properties, track prices, or analyze the market, there's no official way to do it.
But agents need property data. Investment analysis, price tracking, market research — these are perfect agent workflows. The infrastructure just doesn't exist.
UK Property CLI closes that gap with reverse-engineered parsers that give your agent a unified command-line interface to every major UK property portal. Your agent calls python3 parsers/rightmove.py 4 and gets normalized JSON whether it's searching Rightmove, Zoopla, or ESPC.
Built for agent frameworks like OpenClaw, Pi, Claude Desktop MCP. Works with any agent that can shell out to a CLI. Your agent handles the intelligence (filtering, ranking, alerting). The CLI handles the grunt work (fetching, parsing, normalizing).
Clean data, zero dependencies, production-ready.
Trigger this CLI when users:
- 🏡 Ask about properties for sale anywhere in the UK
- 💰 Want to find investment opportunities
- 👨‍👩‍👧‍👦 Need family homes (4+ bedrooms)
- 📊 Request property market analysis
- 🔔 Want price drop alerts
- 📍 Ask "what's on the market in Manchester?" or "find homes in Edinburgh"
- 🤖 Build automated property workflows
- 🔍 Multi-portal search — ESPC, Rightmove, Zoopla in one tool
- 📊 Normalized JSON — Consistent output format across all portals
- 🚫 Area filtering — Automatically excludes undesirable areas
- 🏷️ Smart categorization — Investment vs family homes
- 🆓 Zero dependencies — Just curl + Python (2 of 3 parsers; Zoopla needs Firecrawl)
- 📈 95% UK coverage — Comprehensive market view
- 🤖 AI agent ready — Built for automation
- 📝 Beautiful docs — Examples, Block Kit templates, integration guides
The CLI provides property data. Your agent builds the intelligence.
Your agent can run on any platform (Slack, Telegram, WhatsApp, iMessage, Discord, etc.) and use these tools to build:
- 🔔 Daily briefings — Morning summaries of new listings (see example)
- 🚨 Smart alerts — Price drops, value opportunities (see example)
- 🔄 Deduplication — Merge properties across portals (see example)
- 📊 Market analysis — Area statistics, trends (see example)
- 🎯 Smart scoring — Rank by preferences (see example)
- 📅 Viewing automation — Schedule appointments (see example)
The CLI outputs JSON. Your agent formats for your platform:
- Slack → Block Kit
- Telegram → Markdown + inline buttons
- WhatsApp → Rich text + image links
- iMessage → Plain text + links
- Discord → Embeds
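Because the JSON is already normalized, the formatting layer stays thin. As an illustration, here is a hypothetical formatter for one property record; the field names (address, price, beds, url) follow the normalized schema documented later in this README, and the function names are placeholders:

```python
# Hypothetical formatters: turn one normalized property record into a
# Slack Block Kit section or a Telegram Markdown message. Field names
# follow the normalized JSON schema shown later in this README.
def to_slack_block(prop: dict) -> dict:
    return {
        "type": "section",
        "text": {
            "type": "mrkdwn",
            "text": (f"*{prop['address']}*\n"
                     f"£{prop['price']:,} · {prop['beds']} beds\n"
                     f"<{prop['url']}|View listing>"),
        },
    }

def to_telegram_markdown(prop: dict) -> str:
    return (f"*{prop['address']}*\n"
            f"£{prop['price']:,} · {prop['beds']} beds\n{prop['url']}")
```

The same record feeds every platform; only the last-mile formatter changes.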
# Clone repository
git clone https://github.com/abracadabra50/uk-property-cli.git
cd uk-property-cli
# No installation needed! Just Python 3 + curl (pre-installed on Mac/Linux)
# Search UK properties (4+ beds)
python3 parsers/rightmove.py 4 # UK-wide (1.5M listings)
python3 parsers/zoopla.py 4 # UK-wide + sold prices
python3 parsers/espc.py 4 # Edinburgh specialist
# Or fetch from all portals
./fetch.sh all 4

# Install Firecrawl CLI
npm install -g @mendable/firecrawl-cli
# Set API key
export FIRECRAWL_API_KEY=your_key_here
# Now Zoopla works
python3 parsers/zoopla.py 4

The CLI provides unified property data:
python3 parsers/<portal>.py <min_beds>

All parsers return the same JSON structure. Your agent doesn't care which portal you're using.
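The agent examples below call a fetch_properties(portal, beds) helper. It isn't part of the CLI itself; a minimal sketch using only the standard library, assuming each parser prints the normalized JSON document (with a top-level "properties" array) to stdout, might look like:

```python
import json
import subprocess

def fetch_properties(portal: str, beds: int) -> list:
    """Shell out to a parser and return its normalized property list.

    Sketch of the fetch_properties helper used in the agent examples
    below; assumes the parser prints the normalized JSON document
    (with a top-level "properties" array) to stdout.
    """
    result = subprocess.run(
        ["python3", f"parsers/{portal}.py", str(beds)],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)["properties"]
```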
python3 parsers/espc.py 4 # Edinburgh specialist
python3 parsers/rightmove.py 4 # UK-wide coverage
python3 parsers/zoopla.py 4     # Sold price data

Under the hood:
Each parser implements a common interface:
def fetch_page(beds: str) -> str:
    """Fetch search results from portal"""

def parse_properties(html: str) -> List[Property]:
    """Extract property data from HTML/JSON"""

def categorize(price: int, beds: int) -> str:
    """Tag as investment/family/other"""

The output is normalized JSON. Your agent handles the intelligence (filtering, ranking, alerts). The CLI handles the grunt work (fetching, parsing, normalizing).
The problem: Same property appears on multiple portals (Rightmove + Zoopla + ESPC).
Agent workflow:
# User: "Show me properties without duplicates"
# 1. Fetch from all portals
espc = fetch_properties('espc', beds=4)
rightmove = fetch_properties('rightmove', beds=4)
zoopla = fetch_properties('zoopla', beds=4)
all_properties = espc + rightmove + zoopla # 57 properties
# 2. Normalize addresses for matching
def normalize_address(addr):
# Remove punctuation, extra spaces, common variations
addr = addr.lower()
addr = addr.replace(',', '').replace('.', '')
addr = addr.replace(' street', ' st').replace(' road', ' rd')
addr = ' '.join(addr.split()) # Normalize whitespace
return addr
# 3. Calculate similarity between addresses
from difflib import SequenceMatcher
def addresses_match(addr1, addr2, threshold=0.85):
norm1 = normalize_address(addr1)
norm2 = normalize_address(addr2)
similarity = SequenceMatcher(None, norm1, norm2).ratio()
return similarity >= threshold
# 4. Deduplicate with confidence scores
unique_properties = []
seen_addresses = []
for prop in all_properties:
is_duplicate = False
for seen_addr in seen_addresses:
if addresses_match(prop['address'], seen_addr):
is_duplicate = True
break
if not is_duplicate:
unique_properties.append(prop)
seen_addresses.append(prop['address'])
# Result: 57 → 38 unique properties (35% were duplicates)
# 5. When duplicates found, merge data
def merge_duplicates(properties):
"""
When same property found on multiple portals,
combine best data from each source
"""
merged = []
for prop in unique_properties:
# Find all versions of this property
versions = [p for p in all_properties
if addresses_match(p['address'], prop['address'])]
# Merge data - take best from each portal
merged_prop = {
'id': prop['id'],
'address': prop['address'],
'price': min(p['price'] for p in versions if p['price'] > 0), # Lowest price
'beds': max(p['beds'] for p in versions), # Most beds listed
'baths': max(p['baths'] for p in versions), # Most baths listed
'images': list(set(sum([p['images'] for p in versions], []))), # All unique images
'portals': [p['portal'] for p in versions], # Which portals have it
'urls': {p['portal']: p['url'] for p in versions} # All URLs
}
merged.append(merged_prop)
return merged
deduplicated = merge_duplicates(all_properties)
# Show user clean results
show_properties(deduplicated) # 38 unique properties with merged dataAgent workflow:
# User: "Find investment opportunities in Edinburgh under £250k"
# 1. Search all portals
espc = fetch_properties('espc', beds=1)
rightmove = fetch_properties('rightmove', beds=1)
# 2. Filter by criteria
investments = [p for p in all_properties
if p['price'] < 250000
and p['category'] == 'investment']
# 3. Calculate metrics
for prop in investments:
rental_yield = estimate_rental_income(prop) / prop['price']
renovation_cost = estimate_renovation(prop)
prop['roi'] = (rental_yield * 12 - renovation_cost) / prop['price']
# 4. Rank by ROI
top_investments = sorted(investments, key=lambda x: x['roi'], reverse=True)
# 5. Present to user with Block Kit
show_investment_opportunities(top_investments[:5])Agent workflow:
# User: "Find 4-bed homes in Manchester, max £400k"
# 1. Search with criteria
properties = fetch_all_portals(beds=4)
# 2. Apply intelligent filtering
desired_areas = ['Didsbury', 'Chorlton', 'Altrincham', 'Sale']
good_schools = get_school_catchments(properties)
family_homes = [p for p in properties
if p['price'] <= 400000
and any(area in p['address'] for area in desired_areas)
and p['postcode'] in good_schools]
# 3. Deduplicate across portals
unique_homes = deduplicate_by_address(family_homes)
# 4. Score by preferences
for prop in unique_homes:
score = 0
if 'garden' in prop['description'].lower(): score += 10
if 'parking' in prop['description'].lower(): score += 5
if prop['area'] in ['Morningside', 'Bruntsfield']: score += 15
prop['score'] = score
# 5. Show top matches
show_properties(sorted(unique_homes, key=lambda x: x['score'], reverse=True)[:10])Agent workflow:
# Daily monitoring job at 9am

# 1. Fetch today's properties
today = fetch_all_portals(beds=4)

# 2. Load yesterday's snapshot
with open('snapshots/2026-02-15.json') as f:
    yesterday = json.load(f)

# 3. Detect price changes
price_drops = []
for t in today:
    for y in yesterday:
        if same_property(t, y):
            if t['price'] < y['price']:
                reduction = y['price'] - t['price']
                percent = (reduction / y['price']) * 100
                price_drops.append({
                    'property': t,
                    'old_price': y['price'],
                    'new_price': t['price'],
                    'reduction': reduction,
                    'percent': percent
                })

# 4. Alert on significant drops (>5%)
significant = [p for p in price_drops if p['percent'] > 5]
if significant:
    send_alert(f"🚨 {len(significant)} price drops detected!")
    for prop in significant:
        send_property_card(prop)

Agent workflow:
# User: "What's the average price for 4-bed homes in Didsbury?"
# 1. Fetch comprehensive data
all_properties = fetch_all_portals(beds=4)
# 2. Filter by area
didsbury = [p for p in all_properties
if 'Didsbury' in p['address']]
# 3. Calculate statistics
prices = [p['price'] for p in didsbury if p['price'] > 0]
analysis = {
'count': len(didsbury),
'average': sum(prices) / len(prices),
'median': sorted(prices)[len(prices)//2],
'min': min(prices),
'max': max(prices),
'per_sqft': average_per_sqft(didsbury)
}
# 4. Compare with other areas
chorlton = calculate_stats([p for p in all_properties if 'Chorlton' in p['address']])
altrincham = calculate_stats([p for p in all_properties if 'Altrincham' in p['address']])
# 5. Present comparison
show_market_analysis({
'Didsbury': analysis,
'Chorlton': chorlton,
'Altrincham': altrincham
})Agent workflow:
# Scheduled: 9am every day

# 1. Fetch latest properties
espc = fetch_properties('espc', beds=4)
rightmove = fetch_properties('rightmove', beds=4)
zoopla = fetch_properties('zoopla', beds=4)
all_properties = espc + rightmove + zoopla

# 2. Deduplicate across portals
unique_properties = deduplicate_by_address(all_properties)

# 3. Filter by user preferences
preferences = load_user_preferences()
matches = [p for p in unique_properties
           if p['price'] >= preferences['min_price']
           and p['price'] <= preferences['max_price']
           and any(area in p['address'] for area in preferences['desired_areas'])
           and p['beds'] >= preferences['min_beds']]

# 4. Check for new listings
with open('cache/yesterday.json') as f:
    yesterday_ids = {p['id'] for p in json.load(f)}
new_listings = [p for p in matches if p['id'] not in yesterday_ids]

# 5. Rank by score
for prop in new_listings:
    prop['score'] = calculate_score(prop, preferences)
top_new = sorted(new_listings, key=lambda x: x['score'], reverse=True)[:10]

# 6. Send daily briefing (format for your platform)
if top_new:
    briefing = {
        'title': '🏠 Daily Property Briefing',
        'stats': {
            'new_listings': len(new_listings),
            'total_matches': len(matches)
        },
        'properties': top_new
    }
    # Format for your platform:
    # - Slack: format_as_block_kit(briefing)
    # - Telegram: format_as_markdown(briefing)
    # - WhatsApp: format_as_rich_text(briefing)
    # - SMS: format_as_plain_text(briefing)
    send_to_platform(briefing)

# 7. Save today's snapshot
with open('cache/today.json', 'w') as f:
    json.dump(matches, f)

Agent workflow:
# Monitor for specific triggers

# 1. Price drop alerts
def check_price_drops():
    today = fetch_all_saved_properties()
    yesterday = load_snapshot('yesterday.json')
    for t in today:
        for y in yesterday:
            if same_property(t, y) and t['price'] < y['price']:
                reduction = y['price'] - t['price']
                percent = (reduction / y['price']) * 100
                if percent >= 5:  # Significant drop
                    alert = {
                        'type': 'price_drop',
                        'property': t,
                        'old_price': y['price'],
                        'new_price': t['price'],
                        'reduction': reduction,
                        'percent': percent
                    }
                    # Format for your platform
                    send_alert(alert)

# 2. New property in desired area
def check_new_premium_properties():
    premium_areas = ['Didsbury', 'Chorlton', 'Altrincham']
    new_properties = get_new_listings_today()
    for prop in new_properties:
        if any(area in prop['address'] for area in premium_areas):
            alert = {
                'type': 'new_premium',
                'property': prop,
                'area': next(a for a in premium_areas if a in prop['address'])
            }
            send_alert(alert)

# 3. Price below market average (value alert)
def check_value_opportunities():
    new_properties = get_new_listings_today()
    market_data = load_market_averages()
    for prop in new_properties:
        area = prop['area']
        avg_price = market_data[area][prop['beds']]['average']
        if prop['price'] < avg_price * 0.85:  # 15%+ below market
            alert = {
                'type': 'value_opportunity',
                'property': prop,
                'market_avg': avg_price,
                'saving': avg_price - prop['price'],
                'below_market_pct': ((avg_price - prop['price']) / avg_price) * 100
            }
            send_alert(alert)

# 4. Back on market (was withdrawn, now relisted)
def check_back_on_market():
    current = get_all_properties()
    last_week = load_snapshot('7_days_ago.json')
    yesterday = load_snapshot('yesterday.json')
    yesterday_ids = {p['id'] for p in yesterday}
    last_week_ids = {p['id'] for p in last_week}
    for prop in current:
        # Was listed last week, missing yesterday, now back
        if prop['id'] in last_week_ids and prop['id'] not in yesterday_ids:
            alert = {
                'type': 'back_on_market',
                'property': prop,
                'days_withdrawn': 7  # Calculate actual days from snapshots
            }
            send_alert(alert)

# Run all checks
check_price_drops()
check_new_premium_properties()
check_value_opportunities()
check_back_on_market()

Agent workflow:
# User: "Schedule viewings for my top 3 properties this weekend"
# 1. Get user's saved properties
saved = load_saved_properties()
# 2. Rank by score
top_3 = sorted(saved, key=lambda x: x['score'], reverse=True)[:3]
# 3. Extract agent contact info (from property page)
for prop in top_3:
agent_phone = extract_agent_contact(prop['url'])
agent_email = extract_agent_email(prop['url'])
# 4. Generate viewing request
message = f"""
Hi, I'm interested in viewing {prop['address']}.
Available times:
- Saturday 10am-4pm
- Sunday 10am-4pm
Please let me know available slots.
"""
# 5. Send request (email or call)
if agent_email:
send_email(agent_email, "Viewing Request", message)
else:
add_to_call_list(agent_phone, message)
# 6. Confirm with user
show_viewing_requests_sent(top_3)| Portal | Coverage | Listings | Status | Dependencies |
|---|---|---|---|---|
| Rightmove | UK-wide | 1.5M | ✅ Working | None |
| Zoopla | UK-wide | 750k | ✅ Working | Firecrawl |
| ESPC | Scotland | 2-3k | ✅ Working | None |
- 🇬🇧 UK: 95%+ coverage (Rightmove + Zoopla)
- 🏴󠁧󠁢󠁳󠁣󠁴󠁿 Scotland: 99% coverage (+ ESPC for Edinburgh)
- 📈 Market share: ~80% of UK listings on Rightmove, ~50% on Zoopla (the portals overlap)
The CLI provides property data. Your agent makes intelligent decisions.
# Agent logic (not CLI) - customize for your target city
area_tiers = {
    'premium': ['Didsbury', 'Chorlton', 'Altrincham', 'Hale'],  # Manchester example
    'excellent': ['Sale', 'Timperley', 'Cheadle'],
    'good': ['Stretford', 'Withington', 'Levenshulme'],
    'avoid': ['Moss Side', 'Longsight']  # Configure per city
}

def score_by_area(property):
    address = property['address'].lower()
    for area in area_tiers['premium']:
        if area.lower() in address:
            return 100  # Premium location
    for area in area_tiers['excellent']:
        if area.lower() in address:
            return 80
    for area in area_tiers['good']:
        if area.lower() in address:
            return 60
    for area in area_tiers['avoid']:
        if area.lower() in address:
            return 0  # Auto-reject
    return 40  # Unknown area - proceed with caution

# Calculate rental yield
def calculate_rental_yield(property, market='uk_average'):
    """
    UK rental market averages (adjust for your region):
    - 1 bed: £700-1000/month
    - 2 bed: £900-1400/month
    - 3 bed: £1200-1800/month
    - 4 bed: £1500-2500/month

    London: multiply by 1.8
    Manchester/Edinburgh: multiply by 1.2
    Regional: multiply by 0.8
    """
    monthly_rent = {
        1: 850, 2: 1100, 3: 1400, 4: 1800
    }.get(property['beds'], 1000)
    annual_rent = monthly_rent * 12
    yield_percent = (annual_rent / property['price']) * 100
    return {
        'monthly_rent': monthly_rent,
        'annual_income': annual_rent,
        'yield_percent': yield_percent,
        'rating': 'excellent' if yield_percent > 6 else 'good' if yield_percent > 4 else 'poor'
    }

# Example usage
property = {
    'price': 200000,
    'beds': 2,
    'address': '...'
}
metrics = calculate_rental_yield(property)
# Output: {'monthly_rent': 1100, 'annual_income': 13200, 'yield_percent': 6.6, 'rating': 'excellent'}

# Quick implementation - see Use Cases for complete version
from difflib import SequenceMatcher

def deduplicate(properties, threshold=0.85):
    """
    Match properties across portals by address similarity.

    Args:
        properties: List of property dicts
        threshold: Similarity score 0-1 (0.85 = 85% match)

    Returns:
        List of unique properties
    """
    def normalize(addr):
        return ' '.join(addr.lower()
                        .replace(',', '')
                        .replace('.', '')
                        .split())

    unique = []
    seen = []
    for prop in properties:
        addr = normalize(prop['address'])
        # Check if similar to any seen address
        is_duplicate = any(
            SequenceMatcher(None, addr, seen_addr).ratio() >= threshold
            for seen_addr in seen
        )
        if not is_duplicate:
            unique.append(prop)
            seen.append(addr)
    return unique

# Usage
all_properties = fetch_all_portals(beds=4)       # 57 properties
unique_properties = deduplicate(all_properties)  # 38 unique
# Deduplication typically reduces by 30-40%

Advanced: See the Deduplication use case for merging data from multiple portals (take best price, combine images, etc.).
# Score properties by value for money
def analyze_value(property, market_data):
    """
    Compare property price to area average.
    Find undervalued properties.
    """
    # Get area average price per bedroom
    area = property['area']
    beds = property['beds']
    avg_price_per_bed = market_data[area][beds]['average'] / beds
    this_price_per_bed = property['price'] / beds
    value_ratio = this_price_per_bed / avg_price_per_bed
    if value_ratio < 0.85:
        return {'rating': 'excellent', 'reason': 'Below market average by 15%+'}
    elif value_ratio < 0.95:
        return {'rating': 'good', 'reason': 'Slightly below market average'}
    elif value_ratio < 1.05:
        return {'rating': 'fair', 'reason': 'At market average'}
    else:
        return {'rating': 'poor', 'reason': 'Above market average'}

All parsers return normalized JSON:
{
  "portal": "espc",
  "fetched_at": "2026-02-16T08:30:00Z",
  "count": 8,
  "properties": [
    {
      "id": "36362802",
      "portal": "espc",
      "title": "4-bed house",
      "price": 450000,
      "price_text": "Offers Over £450,000",
      "beds": 4,
      "baths": 2,
      "property_type": "house",
      "address": "1 Buckstone Circle, Edinburgh EH10",
      "area": "Buckstone",
      "postcode": "EH10 6XB",
      "url": "https://espc.com/property/...",
      "image_url": "https://images.espc.com/...",
      "category": "family"
    }
  ]
}

Trigger this CLI when you need to:
- 🏡 Search for properties in Edinburgh or UK
- 💰 Find investment opportunities
- 👨‍👩‍👧‍👦 Look for family homes (4+ beds)
- 📊 Analyze property market data
- 🔔 Build property alert systems
- 🤖 Integrate with AI agents (Slack bots, etc.)
python3 setup.py

Guides you through configuring:
- Search criteria — Bedrooms, price range, property types
- Area preferences — Desired postcodes, areas to avoid, premium areas
- Scoring weights — How to rank properties
- Deduplication — Automatic duplicate removal (recommended: enabled)
- Daily briefings — Schedule and preferences
Presets available:
- Edinburgh (EH10, EH12, EH4, etc.)
- Manchester (M20, M21, Didsbury, etc.)
- London (SW, W, NW postcodes)
- Custom (configure manually)
Saves to preferences.json — edit manually anytime.
{
  "search": {
    "min_beds": 4,
    "max_price": 600000
  },
  "areas": {
    "desired": ["EH10", "EH12", "EH4"],
    "excluded": ["EH17", "Niddrie", "Moredun"],
    "premium": ["EH10", "EH9"]
  },
  "deduplication": {
    "enabled": true,
    "threshold": 0.85
  }
}

Built-in and automatic. You don't need to call it separately.
When you run briefing.py:
- Fetches from all portals (57 properties)
- Automatically deduplicates (→ 38 unique)
- Filters by your preferences
- Ranks and outputs
Why automatic?
- Properties appear on multiple portals (Rightmove + Zoopla + ESPC)
- Typical reduction: 30-40% duplicates
- Merges best data from each portal (lowest price, all images, all URLs)
Manual control:
# Standalone deduplication
python3 dedupe.py espc.json rightmove.json zoopla.json
# Disable in preferences
{"deduplication": {"enabled": false}}The CLI handles data. Your agent handles intelligence and presentation.
# CLI provides
python3 parsers/rightmove.py 4 # → Raw property JSON
# Your agent builds
- Daily briefings (fetch → filter → rank → format → send)
- Price alerts (compare → detect → notify)
- Deduplication (match → merge)
- Market analysis (aggregate → calculate)
# Your agent formats for your platform
- Slack: Block Kit with images and buttons
- Telegram: Markdown with inline keyboards
- WhatsApp: Rich text with media
- iMessage/SMS: Plain text with links
- Discord: Embeds with thumbnails

See Use Cases below for complete implementation examples of what your agent can build with this data.
Properties are auto-categorized:
| Category | Criteria | Use Case |
|---|---|---|
| Investment | Price < £250k | High rental yield, renovation opportunities |
| Family | 4+ bedrooms | Growing families, good areas |
| Other | Everything else | General market |
Configure area filtering based on your target city/region. The CLI provides raw data — your agent applies local knowledge.
Premium: Morningside • Bruntsfield • Marchmont • Stockbridge
Good: Colinton • Cramond • Portobello • Leith
Avoid: Moredun • Niddrie • Wester Hailes • Sighthill
Premium: Didsbury • Chorlton • Altrincham • Hale
Good: Sale • Timperley • Stretford
Avoid: Configure based on local crime/school data
Premium: Kensington • Chelsea • Hampstead • Richmond
Good: Clapham • Islington • Greenwich
Avoid: Configure based on local knowledge
Customize area filters in your agent code — not in the CLI.
| Document | Description |
|---|---|
| README.md | Main documentation (you are here) |
| PORTALS.md | Portal analysis & rankings |
| SKILL.md | AI agent integration guide |
# Get property count
python3 parsers/espc.py 4 | jq '.count'
# Get all addresses
python3 parsers/espc.py 4 | jq -r '.properties[].address'
# Properties under £300k
python3 parsers/espc.py 4 | jq '.properties[] | select(.price < 300000)'
# Family homes only
python3 parsers/espc.py 4 | jq '.properties[] | select(.category == "family")'

#!/bin/bash
CACHE_FILE="cache/espc-4bed.json"
CACHE_TTL=3600 # 1 hour
if [ -f "$CACHE_FILE" ]; then
    # stat -f %m is the macOS form; use stat -c %Y on Linux
    AGE=$(( $(date +%s) - $(stat -f %m "$CACHE_FILE") ))
    if [ $AGE -lt $CACHE_TTL ]; then
        cat "$CACHE_FILE"
        exit 0
    fi
fi

# Fetch fresh
python3 parsers/espc.py 4 | tee "$CACHE_FILE"

# Daily briefing at 9am
python3 parsers/espc.py 4 > /tmp/espc.json
python3 parsers/rightmove.py 4 > /tmp/rightmove.json
# Format as Block Kit
node format-briefing.js /tmp/*.json | slack-post

const properties = await bash(`python3 ${skillDir}/parsers/espc.py 4`);
const data = JSON.parse(properties.stdout);
data.properties.forEach(p => {
  console.log(`${p.address} - £${p.price.toLocaleString()}`);
});

tools: [
"property_search_espc",
"property_search_rightmove",
"property_search_zoopla",
"property_track_price"
]| Parser | Dependencies | Cost/Month | Status |
|---|---|---|---|
| ESPC | None (curl) | £0 | ✅ Free forever |
| Rightmove | None (curl) | £0 | ✅ Free forever |
| Zoopla | Firecrawl API | ~$1 | ✅ Working |
Daily briefing cost: ~$0.03/day = ~$1/month = ~$12/year
uk-property-cli/
├── parsers/
│ ├── espc.py # ESPC parser (Edinburgh) ✅
│ ├── rightmove.py # Rightmove parser (UK) ✅
│ └── zoopla.py # Zoopla parser (UK) ✅
├── fetch.sh # Portal dispatcher
├── cache/ # Cache directory
├── README.md # Main documentation
├── PORTALS.md # Portal analysis
└── SKILL.md # Integration guide
| Parser | Properties | Price Range | Dependencies |
|---|---|---|---|
| ESPC | 8 found | £450k - £895k | None |
| Rightmove | 25 found | £290k - £769k | None |
| Zoopla | 24 found | £260k - £1.2M | Firecrawl |
Total: 57 properties from one search
- UK: 95%+ (Rightmove + Zoopla)
- Scotland: 99% (+ ESPC)
- England/Wales: 95% (Rightmove + Zoopla)
- ESPC: ~2s
- Rightmove: ~3s
- Zoopla: ~5s (API call)
- ESPC parser (Edinburgh)
- Rightmove parser (UK)
- Zoopla parser (UK)
- Normalized JSON output
- Area filtering
- Category tagging
- OnTheMarket parser (exclusive listings)
- Price tracking database
- Alert system for new listings
- Daily briefing formatter
- Investment metrics (yield, ROI)
- S1 Homes (Scotland-wide)
- ASPC (Aberdeen)
- Sold price analysis
- Block Kit template library
- MCP server wrapper
- npm/pip package
This is a private tool, but the architecture is simple:
- Create parsers/newportal.py
- Implement fetch_page() and parse_properties()
- Return the normalized JSON format
- Done!
See PORTALS.md for portal analysis.
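A skeleton for a new parser might look like the following sketch. The URL and the parsing logic are placeholders, not a real portal; wire fetch → parse → print in a __main__ block the same way the existing parsers do:

```python
#!/usr/bin/env python3
"""Skeleton for parsers/newportal.py (illustrative placeholders only)."""
import subprocess
from datetime import datetime, timezone

def fetch_page(beds: str) -> str:
    """Fetch search results from the portal (curl keeps it dependency-free)."""
    url = f"https://newportal.example/search?min_beds={beds}"  # placeholder URL
    return subprocess.run(["curl", "-s", url],
                          capture_output=True, text=True).stdout

def parse_properties(html: str) -> list:
    """Extract listings from HTML or embedded JSON (portal-specific)."""
    return []  # real implementation returns normalized property dicts

def build_output(props: list) -> dict:
    """Wrap parsed listings in the shared normalized JSON envelope."""
    return {
        "portal": "newportal",
        "fetched_at": datetime.now(timezone.utc).isoformat(),
        "count": len(props),
        "properties": props,
    }
```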
Private - for personal use
PiBooks Property - Vertical AI agent for property search
Target market:
- Property investors (£5-100M portfolios)
- Estate agents (market intelligence)
- Property developers (opportunity scanning)
UK PropTech market: £8.4bn
See PORTALS.md for full commercial analysis.
Built with zero dependencies for maximum portability and zero costs.
Inspired by the UK property market's complexity and the need for intelligent property search.