From cb21c1daec75c23048802443d23542c82dbc4cc4 Mon Sep 17 00:00:00 2001
From: Deepjyoti Paul
Date: Tue, 5 Aug 2025 12:00:50 +0530
Subject: [PATCH 1/2] Create project Greenify in Showcase for Cookbook

---
 docs/showcase/greenify.mdx | 204 +++++++++++++++++++++++++++++++++++++
 1 file changed, 204 insertions(+)
 create mode 100644 docs/showcase/greenify.mdx

diff --git a/docs/showcase/greenify.mdx b/docs/showcase/greenify.mdx
new file mode 100644
index 0000000..32293ab
--- /dev/null
+++ b/docs/showcase/greenify.mdx
@@ -0,0 +1,204 @@
+---
+title: Greenify | Localized community-driven greenification/plantation solution with AI
+description: Greenify is a mobile application designed to encourage and facilitate sustainable practices by analyzing live images and building communities via the Perplexity Sonar API.
+sidebar_position: 6
+keywords: [image processing, community, maps, expo, react native, flask, Perplexity, sonar]
+---

# Greenify

![Greenify](https://raw.githubusercontent.com/deepjyotipaulhere/greenify/master/assets/images/gallery.jpg)

The inspiration for Greenify stems from the growing need to address environmental challenges and promote sustainable living. With the rise of urbanization and technology, we wanted to create a platform that merges innovation with eco-consciousness. Greenify aims to empower individuals and communities to take actionable steps toward a greener future by leveraging technology to make sustainability accessible and engaging.

## Features

Greenify is a mobile application designed to encourage and facilitate sustainable practices. It provides users with tools and resources to:

- Participate in community-driven eco-friendly challenges and initiatives.
- Access a curated library of tips, guides, and resources for sustainable living.
- Connect with like-minded individuals through a community platform to share ideas and inspire action.

The app is designed to be user-friendly, visually appealing, and impactful, making it easier for users to integrate sustainability into their daily lives.

## Prerequisites

- Node.js 20.19.4 or later
- Python 3.10.0 or later
- Perplexity API key for Sonar integration
- Expo (SDK version 51 or later) ([Setup guide](https://docs.expo.dev/))
- Android SDK/Android Studio set up for local builds or simulator runs ([Setup guide](https://developer.android.com/about/versions/14/setup-sdk))
- Xcode installed (Mac only) for simulator runs ([Setup guide](https://developer.apple.com/documentation/safari-developer-tools/installing-xcode-and-simulators))
- An Android or iOS device with a camera for image capture

## Installation

This is an [Expo](https://expo.dev) project created with [`create-expo-app`](https://www.npmjs.com/package/create-expo-app). The project root also contains a ```/service``` folder with the Flask API that mediates between the frontend and the Perplexity API.

1. Install dependencies

   ```bash
   npm install
   ```

2. Start the app

   ```bash
   npx expo start
   ```

In the output, you'll find options to open the app in a:

- [development build](https://docs.expo.dev/develop/development-builds/introduction/)
- [Android emulator](https://docs.expo.dev/workflow/android-studio-emulator/)
- [iOS simulator](https://docs.expo.dev/workflow/ios-simulator/)
- [Expo Go](https://expo.dev/go), a limited sandbox for trying out app development with Expo

3. In another terminal, navigate to the ```/service``` folder and install dependencies
```bash
pip install -r requirements.txt
```
4. Set ```PPLX_API_KEY``` in the ```.env``` file inside the ```/service``` folder (create the ```.env``` file if it doesn't exist)
5. Run the Flask app
```bash
python app.py
```
6. To open the app on a mobile device
##### Option 1
* Install the Expo Go app from the Play Store or App Store
* Scan the QR code shown in the terminal

##### Option 2
Open a web browser on your smartphone and navigate to the URL shown in the console.

## Abstract Data Flow Diagram
![Abstract Data Flow Diagram](https://d112y698adiu2z.cloudfront.net/photos/production/software_photos/003/418/290/datas/gallery.jpg)


## Usage

After running the app in your own setup or through the hosted URL, follow these steps:

### First Step: Greenification

| | |
|------|-------------|
| ![First Step](https://raw.githubusercontent.com/deepjyotipaulhere/greenify/refs/heads/master/assets/images/ezgif-4647c5467e6dac.gif) | |

### Second Step: Community Building

| | |
|-------------|------|
| | ![Second Step](https://raw.githubusercontent.com/deepjyotipaulhere/greenify/refs/heads/master/assets/images/ezgif-439e19e10fb6f4.gif) |

## Code Explanation

Greenify was built using the [Expo](https://expo.dev) framework, which allowed us to create a cross-platform application for Android, iOS, and the web. Key technologies and tools used include:

- **Frontend**: React Native with Expo for building the user interface and ensuring a seamless user experience.
- **Backend**: A Python-based service using Flask to handle data processing and API endpoints (a minimal request-flow sketch follows this list).
- **Perplexity AI**: Using Perplexity AI's sonar-pro and sonar-deep-research models to classify the captured image, suggest plants from the image and coordinates via real-time research, and build a community by matching users with similar plant suggestions.
- **Design**: Using React Native UI Kitten for custom themes and assets, including fonts and icons, to create a visually cohesive and engaging interface.
- **File-based routing**: Leveraging Expo's file-based routing system for intuitive navigation.
- **Community features**: Implemented using React Native components and hooks for real-time interaction.
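The service snippets below show the data models and the Sonar payloads. As orientation, here is a minimal sketch of how a Flask route in `/service` could forward a captured image to the Sonar API. The endpoint name, request fields, and response handling are assumptions for illustration, not the project's actual code; only the `PPLX_API_KEY` variable and the sonar-pro payload shape come from the snippets that follow.

```python
# Hypothetical sketch only: the route name, request fields, and error handling
# are assumptions for illustration, not the actual Greenify implementation.
import os

import requests
from dotenv import load_dotenv  # python-dotenv; assumed to match how /service loads its .env
from flask import Flask, jsonify, request

load_dotenv()  # reads PPLX_API_KEY from the /service .env file
PPLX_API_KEY = os.getenv("PPLX_API_KEY")

app = Flask(__name__)


@app.post("/analyze")  # illustrative endpoint name
def analyze():
    # Expected JSON from the Expo client: {"image": <data URL>, "lat": ..., "lng": ..., "alt": ...}
    body = request.get_json()
    payload = {
        "model": "sonar-pro",
        "messages": [
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": "Analyze this image and describe the place's suitability for plant growth.",
                    },
                    {"type": "image_url", "image_url": {"url": body["image"]}},
                ],
            }
        ],
        "stream": False,
    }
    # Assumed Sonar chat-completions endpoint and OpenAI-style response shape.
    response = requests.post(
        "https://api.perplexity.ai/chat/completions",
        headers={"Authorization": f"Bearer {PPLX_API_KEY}"},
        json=payload,
        timeout=120,
    )
    response.raise_for_status()
    return jsonify({"description": response.json()["choices"][0]["message"]["content"]})


if __name__ == "__main__":
    app.run(debug=True)
```

The actual service goes further, constraining responses with the Pydantic schemas and the sonar-deep-research payload shown below.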
**Pydantic models**
```python
from pydantic import BaseModel, Field


class Plant(BaseModel):
    name: str
    image: str = Field(description="Image URL of the plant")
    description: str = Field(description="Description of the plant")
    care_instructions: str = Field(description="Care instructions for the plant")
    care_tips: str = Field(description="Care tips for the plant")
    AR_model: str = Field(description="AR model URL for the plant")


class Answer1(BaseModel):
    description: str


class Answer2(BaseModel):
    plants: list[Plant]


class Benefit(BaseModel):
    type: str = Field(description="Type of the environmental benefit")
    amount: str = Field(description="How much percentage of improvement")
    direction: bool = Field(description="True means increasing, False means decreasing")


class Group(BaseModel):
    users: list[str] = Field(
        description="List at least 2 or more users with similar plant suggestions and how they can combine the same job in terms of place, activities and plantation"
    )
    description: list[str] = Field(
        description="Short description of how these people match with each other"
    )
    benefits: list[Benefit] = Field(
        description="How this combination helps benefit the environment with parameter, percentage value"
    )


class Community(BaseModel):
    match: list[Group]
```

**Image analysis and insights about the captured image using sonar-pro and structured JSON output**
```python
payload = {
    "model": "sonar-pro",
    "messages": [
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "Analyze this image and return a short description of the place with respect to suitability for plant growth.",
                },
                {"type": "image_url", "image_url": {"url": image}},
            ],
        },
    ],
    "stream": False,
    "response_format": {
        "type": "json_schema",
        "json_schema": {"schema": Answer1.model_json_schema()},
    },
}
```

**Matching people with similar plant suggestions and interests and creating a matching community using sonar-deep-research**
```python
payload_research = {
    "model": "sonar-deep-research",
    "messages": [
        {
            "role": "system",
            "content": "You are a plant growth expert. You are given a description of a place where a user wants to grow some plants. You are also given the latitude, longitude and altitude of the user. Your task is to suggest at most 5 plants that can be grown by the user in that particular place according to the average weather.",
        },
        {
            "role": "user",
            "content": f"I am standing in a place having coordinates [{lat}, {lng}] and altitude {alt}. The place can be described as follows: {answer1} "
            "Suggest at most five suitable plants that can be grown here.",
        },
    ],
    "stream": False,
    "response_format": {
        "type": "json_schema",
        "json_schema": {"schema": Answer2.model_json_schema()},
    },
}
```


## Links

- [GitHub Repository](https://github.com/deepjyotipaulhere/greenify)
- [Live Demo](https://greenify.expo.app)

![Live Demo URL](https://raw.githubusercontent.com/deepjyotipaulhere/greenify/refs/heads/master/assets/images/frame.png)
- YouTube Demo

[![Greenify YouTube demo](https://img.youtube.com/vi/IFP0EiHqd7Y/0.jpg)](https://www.youtube.com/watch?v=IFP0EiHqd7Y)
\ No newline at end of file

From 8bbf1c79f790e6ae3fc1657da923127e6e606179 Mon Sep 17 00:00:00 2001
From: Kesku
Date: Sat, 16 Aug 2025 20:50:19 +0100
Subject: [PATCH 2/2] Unify Greenify showcase page structure to match standard format

---
 docs/showcase/greenify.mdx | 234 +++++++++++--------------------------
 1 file changed, 66 insertions(+), 168 deletions(-)

diff --git a/docs/showcase/greenify.mdx b/docs/showcase/greenify.mdx
index 32293ab..5371ac7 100644
--- a/docs/showcase/greenify.mdx
+++ b/docs/showcase/greenify.mdx
@@ -1,204 +1,102 @@
 ---
 title: Greenify | Localized community-driven greenification/plantation solution with AI
-description: Greenify is a mobile application designed to encourage and facilitate sustainable practices by analyzing live images and building communities via the Perplexity Sonar API.
-sidebar_position: 6
-keywords: [image processing, community, maps, expo, react native, flask, Perplexity, sonar]
+description: A mobile application that analyzes photos and location data to suggest suitable plants and build sustainable communities using Perplexity Sonar API
+sidebar_position: 19
+keywords: [greenify, plant recommendation, community, expo, react native, flask, perplexity, sonar, image analysis, sustainability]
 ---
 
-# Greenify
 
-![Greenify](https://raw.githubusercontent.com/deepjyotipaulhere/greenify/master/assets/images/gallery.jpg)
+**Greenify** is a mobile application designed to encourage sustainable practices by analyzing live images and building communities. Users capture photos of their space (balcony, roadside, basement, etc.) and Greenify automatically analyzes the image using Perplexity's Sonar API to suggest suitable plants for that location. The app also connects like-minded people in the locality to create communities for sustainable, economic, and social growth.
 
-The inspiration for Greenify stems from the growing need to address environmental challenges and promote sustainable living. With the rise of urbanization and technology, we wanted to create a platform that merges innovation with eco-consciousness. Greenify aims to empower individuals and communities to take actionable steps toward a greener future by leveraging technology to make sustainability accessible and engaging.
+
 
 ## Features
 
-Greenify is a mobile application designed to encourage and facilitate sustainable practices.
It provides users with tools and resources to: +* **AI-Powered Plant Analysis** using image recognition and location data to suggest suitable plants +* **Location-Based Recommendations** considering weather, sunlight, and environmental conditions +* **Community Building** connecting users with similar plant interests and sustainable goals +* **Cross-Platform Mobile App** built with Expo for iOS, Android, and web +* **Real-time Weather Integration** for accurate plant suitability assessment +* **Structured JSON Output** using Pydantic models for consistent data handling +* **AR Model Support** for enhanced plant visualization -- Participate in community-driven eco-friendly challenges and initiatives. -- Access a curated library of tips, guides, and resources for sustainable living. -- Connect with like-minded individuals through a community platform to share ideas and inspire action. - -The app is designed to be user-friendly, visually appealing, and impactful, making it easier for users to integrate sustainability into their daily lives. +## Abstract Data Flow Diagram +![Abstract Data Flow Diagram](https://d112y698adiu2z.cloudfront.net/photos/production/software_photos/003/418/290/datas/gallery.jpg) ## Prerequisites -- NodeJS 20.19.4 or later -- Python 3.10.0 or later -- Perplexity API key for Sonar integration -- Expo (SDK version 51 or later) ([Setup guide](https://docs.expo.dev/)) -- Android SDK/studio set up for local build or simulator run ([Setup guide](https://developer.android.com/about/versions/14/setup-sdk)) -- Xcode installed if using Mac and for simulator run ([Setup guide](https://developer.apple.com/documentation/safari-developer-tools/installing-xcode-and-simulators)) -- An Android/iPhone device for image capture with camera +* Node.js 20.19.4+ and npm +* Python 3.10.0+ and pip +* Expo CLI and SDK 51+ +* Perplexity API key (Sonar Pro and Sonar Deep Research) +* Android SDK/Studio or Xcode (for local builds) +* Mobile device with camera for image capture ## Installation -This is an [Expo](https://expo.dev) project created with [`create-expo-app`](https://www.npmjs.com/package/create-expo-app). Also there is a ```/service``` folder in the root directory of the project which contains the Flask API for communicating between frontend and Perplexity API. - -1. Install dependencies - - ```bash - npm install - ``` - -2. Start the app - - ```bash - npx expo start - ``` - -In the output, you'll find options to open the app in a +```bash +# Clone the repository +git clone https://github.com/deepjyotipaulhere/greenify.git +cd greenify -- [development build](https://docs.expo.dev/develop/development-builds/introduction/) -- [Android emulator](https://docs.expo.dev/workflow/android-studio-emulator/) -- [iOS simulator](https://docs.expo.dev/workflow/ios-simulator/) -- [Expo Go](https://expo.dev/go), a limited sandbox for trying out app development with Expo +# Install frontend dependencies +npm install -3. In another terminal navigate to ```/service``` folder and install dependencies -```bash +# Install backend dependencies +cd service pip install -r requirements.txt ``` -4. Set ```PPLX_API_KEY``` in ```.env``` file inside the ```/service``` folder (create ```.env``` file if doesn't exist) -4. Run Flask app -```bash -python app.py -``` -5. To open app in mobile -##### Option 1 -* Install Expo Go app from Play Store or App Store -* Scan the QR code shown in the terminal - -##### Option 2 -Open web browser in your smartphone and navigate to the URL shown in the console. 
-## Abstract Data Flow Diagram -![Abstract Data Flow Diagram](https://d112y698adiu2z.cloudfront.net/photos/production/software_photos/003/418/290/datas/gallery.jpg) +## Configuration +Create `.env` file in the `service` directory: +```ini +PPLX_API_KEY=your_perplexity_api_key_here +``` ## Usage -After running the app in your own setup or through the hosted URL, the following steps can be followed: +1. **Start Backend Service**: + ```bash + cd service + python app.py + ``` -### First Step: Greenification +2. **Start Frontend App**: + ```bash + npx expo start + ``` -| | | -|------|-------------| -| ![First Step](https://raw.githubusercontent.com/deepjyotipaulhere/greenify/refs/heads/master/assets/images/ezgif-4647c5467e6dac.gif) | | +3. **Access the App**: + - Install Expo Go app and scan QR code, or + - Open web browser on mobile and navigate to the URL shown -### Second Step: Community Building +4. **Use the App**: + - Grant camera and location permissions + - Take a photo of your space (balcony, garden, etc.) + - Receive AI-powered plant recommendations + - Connect with nearby users for community building -| | | -|-------------|------| -| | ![Second Step](https://raw.githubusercontent.com/deepjyotipaulhere/greenify/refs/heads/master/assets/images/ezgif-439e19e10fb6f4.gif) | +![demo](https://raw.githubusercontent.com/deepjyotipaulhere/greenify/refs/heads/master/assets/images/ezgif-4647c5467e6dac.gif) | ## Code Explanation -Greenify was built using the [Expo](https://expo.dev) framework, which allowed us to create a cross-platform application for Android, iOS, and the web. Key technologies and tools used include: - -- **Frontend**: React Native with Expo for building the user interface and ensuring a seamless user experience. -- **Backend**: A Python-based service using Flask to handle data processing and API endpoints. -- **Perplexity AI**: Using Perplexity AI's sonar-pro and sonar-deep-research models to classify image, plant suggestions based on image and coordinates by realtime research, creating a community by matching users of similar plant suggestions -- **Design**: Using React Native UI Kitten for custom themes and assets, including fonts and icons, to create a visually cohesive and engaging interface. -- **File-based routing**: Leveraging Expo's file-based routing system for intuitive navigation. -- **Community features**: Implemented using React Native components and hooks for real-time interaction. 
- -** Pydantic models ** -```python -from pydantic import BaseModel, Field - - -class Plant(BaseModel): - name: str - image: str = Field(description="Image URL of the plant") - description: str = Field(description="Description of the plant") - care_instructions: str = Field(description="Care instructions for the plant") - care_tips: str = Field(description="Care tips for the plant") - AR_model: str = Field(description="AR model URL for the plant") - - -class Answer1(BaseModel): - description: str - -class Answer2(BaseModel): - plants: list[Plant] - - -class Benefit(BaseModel): - type: str = Field(description="Type of the environmental benefit") - amount: str = Field(description="How much percentage of improvement") - direction: bool = Field(description="True means increasing, False means decreasing") - - -class Group(BaseModel): - users: list[str] = Field( - description="List at least 2 or more users with similar plant suggestions and how they can combine same job in term of place, activities and plantation" - ) - description: list[str] = Field( - description="Short description of how these people match with each other" - ) - benefits: list[Benefit] = Field( - description="How this combination helps benefit the environment with parameter, percentage value" - ) - - -class Community(BaseModel): - match: list[Group] -``` - -** Image Analysis and Insights about the captured image using sonar-pro and structured JSON output ** -```python -payload = { - "model": "sonar-pro", - "messages": [ - { - "role": "user", - "content": [ - { - "type": "text", - "text": f"Analyze this image and return short description of the place with respect to suitability of plant growth ", - }, - {"type": "image_url", "image_url": {"url": image}}, - ], - }, - ], - "stream": False, - "response_format": { - "type": "json_schema", - "json_schema": {"schema": Answer1.model_json_schema()}, - }, -} -``` - -** Matching people with similar plant suggestions and interests and create a matching community using sonar-deep-research ** -```python -payload_research = { - "model": "sonar-deep-research", - "messages": [ - { - "role": "system", - "content": "You are a plant growth expert. You are given a description of a place where an user want to grow some plants. You are also given latitude, longitude and altitude of the user. Your task is to suggest at most 5 plant that can be grown by the user in that particular place according to average weather.", - }, - { - "role": "user", - "content": f"I am standing in a place having coordinates [{lat}, {lng}] and altitude {alt}]. 
The place can be described as follows: {answer1}" - "Suggest at most five suitable plants that can be grown here.", - }, - ], - "stream": False, - "response_format": { - "type": "json_schema", - "json_schema": {"schema": Answer2.model_json_schema()}, - }, -} -``` - +* **Frontend**: React Native with Expo for cross-platform mobile development +* **Backend**: Python Flask API handling image processing and Perplexity API integration +* **AI Integration**: Perplexity Sonar Pro for image analysis and Sonar Deep Research for plant recommendations +* **Data Models**: Pydantic models for structured JSON output and data validation +* **Image Processing**: Real-time image analysis with location-based context +* **Community Features**: User matching based on plant suggestions and sustainable interests +* **Weather Integration**: Real-time weather data for accurate plant suitability assessment ## Links - [GitHub Repository](https://github.com/deepjyotipaulhere/greenify) -- [Live Demo](https://greenify.expo.app) - -![Live Demo URL](https://raw.githubusercontent.com/deepjyotipaulhere/greenify/refs/heads/master/assets/images/frame.png) -- Youtube Demo - -[![IMAGE ALT TEXT HERE](https://img.youtube.com/vi/IFP0EiHqd7Y/0.jpg)](https://www.youtube.com/watch?v=IFP0EiHqd7Y) \ No newline at end of file +- [Live Demo](https://greenify.expo.app) \ No newline at end of file