
Releases: Capstone-Projects-2024-Spring/project-intelligest-smart-home

v2.0.0

06 May 03:44
94a276a


Release notes - IntelliGest Smart Home - v2.0.0

What's Changed

  • New Weather Pop Up
  • Improved User Interface
  • Updated Acceptance Tests
  • Updated Class Diagrams
  • Updated API Specifications
  • Updated Documentation

v1.2.0

30 Apr 02:20
ba12d10


Release notes - IntelliGest Smart Home - v1.2.0

What's Changed

Full Changelog: 1.1.0...1.2.0

Story

ISH-18 Locked the door

ISH-31 Unlocked the door

ISH-61 Script Loading Models onto TPU

ISH-98 Configure the lock

ISH-102 Display available entities for chosen device

ISH-104 Add detection criteria for other gestures

ISH-108 Add pop up to react page for adding devices and viewing currently connected devices

ISH-112 Make weather look pretty

ISH-114 Convert number of gestures used in total to 2. Try to make them more intuitive

ISH-115 Create popup in react for news

ISH-117 Add documentation to python methods files

v1.1.0

24 Apr 15:24
7bc9b25


Release notes - IntelliGest Smart Home - v1.1.0

Story

ISH-2 Prepare Device for moving out of Office

ISH-93 Signal to the user that the gesture was selected/detected

ISH-94 Make React page work with Flask Server

ISH-95 Add a function to list of available devices

ISH-96 Return News Stories

ISH-97 Add video feed to React page

ISH-99 configure second light

ISH-100 Signal to the user the action with the device chosen

ISH-103 Add a "restart sequence" gesture

ISH-105 Create class for video detection

ISH-107 Add intelligest smart home logo/header to react page

ISH-109 Add react feature where buttons will do their task when clicked on

ISH-110 Add tests for react page

ISH-111 Add weather to react and flask

v1.0.1 - Version 1.0.0 Hotfix

09 Apr 22:08
d869f42


Release notes - IntelliGest Smart Home - 1.0.1

Story

ISH-13 Turned on the lights

ISH-24 Turned off lights

ISH-34 Removing old device connections

ISH-75 Research Light Controlling through HA and MQTT

ISH-78 Research Hand Gesture Recognition Repository

ISH-80 Design Document Architecture Feedback

ISH-82 API Doc feedback 1

ISH-83 Build basic GUI

ISH-84 Connect weather to GUI

ISH-86 Connect light to pyqt GUI

ISH-87 Configure Lights microcontroller

ISH-88 Breadboard the light

ISH-89 Configure home assistant on the Pi

ISH-90 Figma Mockup GUI

What's New

This release introduces control of a light from the camera feed: the feed detects motion and then searches for the matching gesture. You can start the project, open the web page, and view a live feed of yourself while you gesture a thumbs up (👍) to control the light through Home Assistant.
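The flow above (hold one gesture, then the next, then trigger the action) can be sketched as a small state machine. This is an illustrative sketch, not the project's actual code: the gesture labels and the hold threshold are assumptions for the example.

```python
HOLD_FRAMES = 15  # frames a gesture must be held before it is accepted (assumed value)

class GestureSequence:
    """Accepts a fixed sequence of gestures, each held for HOLD_FRAMES frames."""

    def __init__(self, sequence):
        self.sequence = sequence  # e.g. ["index_up", "thumbs_up"]
        self.position = 0         # index of the gesture we are waiting for
        self.held = 0             # consecutive frames the current gesture was seen

    def update(self, label):
        """Feed one per-frame gesture label; return True when the sequence completes."""
        if label == self.sequence[self.position]:
            self.held += 1
            if self.held >= HOLD_FRAMES:
                self.position += 1
                self.held = 0
                if self.position == len(self.sequence):
                    self.position = 0
                    return True   # full sequence detected -> trigger the action
        else:
            self.held = 0         # gesture changed; restart the hold timer
        return False

seq = GestureSequence(["index_up", "thumbs_up"])
fired = False
for frame_label in ["index_up"] * 15 + ["thumbs_up"] * 15:
    if seq.update(frame_label):
        fired = True
print(fired)  # True
```

Resetting the hold counter whenever the label changes is what makes the "hold for a few seconds" behavior robust against single-frame misdetections.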

How to Run the Project

To run this project, you will most likely need to be in the Professor's office, as our demo will be using our Raspberry Pi device.

  1. The first step is to power on the Raspberry Pi and connect it to the Wi-Fi, screen, camera, and light. If you're using our setup, you only need to connect to the Wi-Fi and power the light from any micro-USB power source. Follow the steps outlined in "Connecting the Pi and the Light to the Same Network" below to connect.
  2. Clone the project from GitHub.
  3. Once cloned, enter the root of the project and create a virtual environment with `python -m venv ./venv/`.
  4. In the root directory, activate the virtual environment with `source venv/bin/activate`.
  5. In the Models directory, install the requirements with `pip install -r requirements.txt`. If some packages fail to install from the requirements file, install them manually with `pip install opencv-python flask requests mediapipe`.
  6. Once the Raspberry Pi is connected and configured, navigate to the app subfolder of the Project-Intelligest-Smart-Home repository.
    This can be done with the `cd` command if you have the repository open in a terminal.
  7. Once in the app subfolder, run the command `python -m flask run` to start the program. The program should take a few seconds to start up.
  8. Once the program has output the link to the flask server, open the link in a web browser and enter full-screen mode.
  9. The web page should display a live feed from the camera, and the program should be running.
  10. To test the program, make a thumbs up, thumbs down, and thumbs flat gesture to make sure the model is correctly detecting gestures.
  11. After this test, hold your index finger up for the camera until the gesture is detected as your first gesture.
  12. Next, hold a thumbs-up gesture in front of the camera. After this gesture has been held for a few seconds, the web page should show the gestures you used.
  13. Once these gestures are detected, the light should turn on through the Home Assistant API.
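Steps 2–8 above can be condensed into a short shell session. The repository URL is inferred from the releases page above; substitute your own clone URL if it differs.

```shell
# Clone the repository and enter it
git clone https://github.com/Capstone-Projects-2024-Spring/project-intelligest-smart-home.git
cd project-intelligest-smart-home

# Create and activate a virtual environment in the project root
python -m venv ./venv/
source venv/bin/activate

# Install the requirements from the Models directory
cd Models
pip install -r requirements.txt
# Fallback if some packages fail to install from the requirements file
pip install opencv-python flask requests mediapipe
cd ..

# Start the Flask server from the app subfolder
cd app
python -m flask run
```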

Connecting the Pi and the Light to the Same Network

With Home Assistant, any devices in the system must be connected to the same network as the device running Home Assistant. For demo and testing purposes, you will need to host this network yourself.
We recommend utilizing a laptop to provide "shared internet". Both macOS and Windows have this feature.

For minimal setup, configure the network as follows:

  • SSID (network name): "showthesign"
  • Password: "f334p8ofpehgb"
  • Channel: 1, 6, or 11 (if a channel must be set, it MUST be one of these, since the light communicates over the 2.4 GHz band)
  • WPA2/WPA3 Security

With this configuration, the Raspberry Pi already recognizes the connection, as do the ESPs running the light. To use your own network name and password instead, first edit `livingroom.yaml` to include your chosen SSID and password, then reflash the ESP by connecting it to your laptop and running `esphome run livingroom.yaml` in the terminal. We highly recommend simply setting up the internet share with the defaults we provide.
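For reference, the Wi-Fi portion of `livingroom.yaml` would look roughly like this. This is a sketch based on ESPHome's standard `wifi:` configuration block; the rest of the file (board, light, and API settings) is omitted and may differ in the project's actual file.

```yaml
# Sketch of the wifi section of livingroom.yaml (ESPHome).
# Replace ssid/password with your own network if you are not using
# the defaults above, then reflash with: esphome run livingroom.yaml
wifi:
  ssid: "showthesign"
  password: "f334p8ofpehgb"
```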

Once both the ESP and the main device are connected to the same network, you are good to start using the project, following the steps outlined above!
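Step 13 of the run instructions turns the light on through the Home Assistant API. A minimal sketch of such a REST call is below; the host, access token, and entity ID are placeholders, and this is not necessarily how the project's own code is structured.

```python
import json
import urllib.request

def build_light_on_request(base_url, token, entity_id):
    """Assemble the URL, headers, and body of a Home Assistant light.turn_on call."""
    return {
        "url": f"{base_url}/api/services/light/turn_on",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "json": {"entity_id": entity_id},
    }

def send(req):
    """Actually send the request (requires a running Home Assistant instance)."""
    data = json.dumps(req["json"]).encode()
    http_req = urllib.request.Request(req["url"], data=data, headers=req["headers"])
    return urllib.request.urlopen(http_req, timeout=5)

# Placeholder host, token, and entity ID -- substitute your own values.
req = build_light_on_request(
    "http://homeassistant.local:8123",
    "YOUR_LONG_LIVED_ACCESS_TOKEN",
    "light.livingroom",
)
print(req["url"])  # http://homeassistant.local:8123/api/services/light/turn_on
# send(req)  # uncomment with a real host and long-lived access token
```

Home Assistant exposes services as `POST /api/services/<domain>/<service>` with a long-lived access token in the `Authorization` header, which is what the sketch assembles.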

v1.0.0

09 Apr 02:50
c9a0469


Release notes - IntelliGest Smart Home - 1.0.0

Story

ISH-13 Turned on the lights

ISH-24 Turned off lights

ISH-34 Removing old device connections

ISH-75 Research Light Controlling through HA and MQTT

ISH-78 Research Hand Gesture Recognition Repository

ISH-80 Design Document Architecture Feedback

ISH-82 API Doc feedback 1

ISH-83 Build basic GUI

ISH-84 Connect weather to GUI

ISH-86 Connect light to pyqt GUI

ISH-87 Configure Lights microcontroller

ISH-88 Breadboard the light

ISH-89 Configure home assistant on the Pi

ISH-90 Figma Mockup GUI

How to Run the Project

To run this project, you will most likely need to be in the Professor's office, as our demo will be using our Raspberry Pi device.

  1. The first step is to power on the Raspberry Pi and connect it to the Wi-Fi, screen, camera, and light. If you're using our setup, you only need to connect to the Wi-Fi and power the light from any micro-USB power source. Follow the steps outlined in "Connecting the Pi and the Light to the Same Network" below to connect.
  2. Once the Raspberry Pi is connected and configured, navigate to the app subfolder of the Project-Intelligest-Smart-Home repository.
    This can be done with the `cd` command if you have the repository open in a terminal.
  3. Once in the app subfolder, run the command `python -m flask run` to start the program. The program should take a few seconds to start up.
  4. Once the program has output the link to the flask server, open the link in a web browser.
  5. The web page should display a live feed from the camera, and the program should be running.
  6. To test the program, make a thumbs up, thumbs down, and thumbs flat gesture to make sure the model is correctly detecting gestures.
  7. After this test, hold your index finger up for the camera until the gesture is detected as your first gesture.
  8. Next, hold a thumbs-up gesture in front of the camera. After this gesture has been held for a few seconds, the web page should show the gestures you used.
  9. Once these gestures are detected, the light should turn on through the Home Assistant API.

Connecting the Pi and the Light to the Same Network

With Home Assistant, any devices in the system must be connected to the same network as the device running Home Assistant. For demo and testing purposes, you will need to host this network yourself.
We recommend utilizing a laptop to provide "shared internet". Both macOS and Windows have this feature.

For minimal setup, configure the network as follows:

  • SSID (network name): "showthesign"
  • Password: "f334p8ofpehgb"
  • Channel: 1, 6, or 11 (if a channel must be set, it MUST be one of these, since the light communicates over the 2.4 GHz band)
  • WPA2/WPA3 Security

With this configuration, the Raspberry Pi already recognizes the connection, as do the ESPs running the light. To use your own network name and password instead, first edit `livingroom.yaml` to include your chosen SSID and password, then reflash the ESP by connecting it to your laptop and running `esphome run livingroom.yaml` in the terminal. We highly recommend simply setting up the internet share with the defaults we provide.

Once both the ESP and the main device are connected to the same network, you are good to start using the project, following the steps outlined above!