This project scrapes property listings from a Zillow-Clone website using BeautifulSoup and requests, then uses Selenium to submit the extracted address, price, and listing link into a Google Form. The form responses are collected in a linked Google Sheets spreadsheet for easy management and analysis.
- Web Scraping: Extract data using BeautifulSoup and requests
- Data Cleaning: Format price data to a consistent structure
- Automation: Use Selenium to automatically fill and submit Google Forms
- Google Sheets Integration: Responses are automatically collected in Google Sheets
- Centralized Data Management: Enables easy reporting and analysis
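The scraping and data-cleaning steps above can be sketched as follows. The HTML snippet, CSS classes, and the `clean_price` rule are illustrative assumptions, not the Zillow-Clone site's actual markup — adjust the selectors to match the real page:

```python
# Sketch of the scraping + cleaning steps with BeautifulSoup.
# The HTML and class names below are stand-ins for the real markup.
from bs4 import BeautifulSoup

html = """
<article class="StyledPropertyCard">
  <address>123 Main St, Springfield</address>
  <span class="PropertyCardWrapper__StyledPriceLine">$1,250/mo</span>
  <a class="property-card-link" href="/homedetails/123-main-st">Details</a>
</article>
"""

def clean_price(raw: str) -> str:
    """Normalize a raw price like '$1,250+ 1bd/mo' to '$1,250'."""
    return raw.split("+")[0].split("/")[0].strip()

soup = BeautifulSoup(html, "html.parser")
addresses = [a.get_text(strip=True) for a in soup.select("address")]
prices = [clean_price(s.get_text())
          for s in soup.select("span.PropertyCardWrapper__StyledPriceLine")]
links = [a["href"] for a in soup.select("a.property-card-link")]
```

In the real script the `html` string would come from `requests.get(url).text` instead of a literal.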
- Make sure Python 3.x is installed
- Install required packages:
- selenium
- beautifulsoup4
- requests
- Download ChromeDriver
- Run the script
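The packages listed above can be installed in one command (a typical setup fragment, assuming `pip` is available):

```shell
pip install selenium beautifulsoup4 requests
```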
- When run, the script:
- Scrapes listings from Zillow-Clone
- Opens the Google Form
- Fills and submits the form automatically for each listing
- Collects responses in the Google Sheets linked to the form
- If the structure of the Google Form or Zillow-Clone website changes, XPath selectors and scraping logic may need updates
- Be mindful of request intervals to avoid overloading the target site
- Replace the Google Form URL in the code with your own form's URL before running (the link hardcoded in the script will no longer exist)
- Expand scraping to include more fields
- Add error handling and logging
- Adapt the scraper for other real estate websites
- Write data directly to Google Sheets using Google Sheets API (bypassing forms)
- Create a user-friendly interface for easier use
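The "error handling and logging" improvement above could start as a small retry wrapper around each fetch; a hedged sketch (names and retry policy are illustrative, not part of the current script):

```python
# Sketch of retry-with-logging around a single fetch call.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("zillow_scraper")

def fetch_with_retry(fetch, url, attempts=3, backoff_seconds=1.0):
    """Call fetch(url), retrying on failure and logging each attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return fetch(url)
        except Exception as exc:
            log.warning("attempt %d/%d for %s failed: %s",
                        attempt, attempts, url, exc)
            if attempt == attempts:
                raise  # out of attempts: surface the last error
            time.sleep(backoff_seconds)
```

The final `raise` re-raises the last error so callers can decide whether to skip the listing or abort the run.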
- MIT License - Feel free to use, modify, and distribute as you like.
- Feel free to reach out if you have any questions or contributions!
- [email protected]
Thank you!