EcoBin is an innovative AI-powered smart waste bin aimed at addressing the challenges of waste management. In this trial implementation, an ESP32 microcontroller integrates the core hardware functionality, while the YOLOv5 object detection model runs on a connected laptop to classify waste, specifically plastic bottles, from webcam input. This approach serves as a proof-of-concept for potential future implementations in which the classification process is embedded entirely in a microcontroller such as the STM32, paired with a camera module. By exploring this transitional design, EcoBin paves the way for a scalable, standalone, and cost-effective solution for waste sorting. This research highlights the integration process, model training, hardware configuration, and testing outcomes, showcasing EcoBin’s potential to enhance waste management systems and sustainability efforts.
The rapid growth in waste production, driven by modern consumption patterns, has underscored the urgent need for effective waste management solutions. Improper sorting remains a critical issue, leading to recyclable materials being sent to landfills and exacerbating environmental degradation. EcoBin combines AI-driven object detection and edge computing technologies to create a practical and affordable waste sorting system.
In this trial phase, the system architecture leverages an ESP32 microcontroller for hardware integration while using a laptop for real-time object detection with the YOLOv5 model. Unlike similar projects such as WasteNet, which achieved high accuracy but lacked a practical implementation, EcoBin focuses on establishing a realistic pathway towards a standalone system. Future iterations will aim to embed the AI processing capabilities directly into microcontrollers such as the STM32, paired with compact camera modules, for complete independence from external computing resources.
The EcoBin trial implementation follows a structured development process:
System Design:
The trial system consists of an ESP32 microcontroller, a laptop running the YOLOv5 object detection model, and a webcam for capturing live input. The ESP32 controls the servo motors responsible for sorting waste based on the classification results provided by the laptop. This architecture bridges the gap between current capabilities and future embedded solutions.
Dataset Collection and Model Training:
A dataset of waste images was compiled and categorized into the required classes: the designated hand signs and plastic bottles. The YOLOv5 model was trained on this dataset with an 80-10-10 split for training, validation, and testing. The trained model is deployed on the laptop for real-time inference.
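The 80-10-10 split described above can be sketched as a simple shuffle-and-slice over the collected image files. This is a minimal illustration, not the project's actual data pipeline; the filenames and seed are placeholders.

```python
import random

def split_dataset(items, seed=42):
    """Shuffle and split items into train/val/test with an 80-10-10 ratio."""
    items = list(items)
    random.Random(seed).shuffle(items)  # deterministic shuffle for reproducibility
    n = len(items)
    n_train = int(n * 0.8)
    n_val = int(n * 0.1)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

# Hypothetical file list standing in for the collected waste images.
files = [f"img_{i:04d}.jpg" for i in range(1000)]
train, val, test = split_dataset(files)
# With 1000 images this yields 800 training, 100 validation, and 100 test files.
```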
Hardware Integration:
The ESP32 microcontroller communicates with the laptop via a serial connection. Based on the classification output, the ESP32 triggers servo motors to sort waste into designated bins. This trial implementation explores the feasibility of replacing the laptop with a standalone microcontroller-based solution in future iterations.
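The laptop-to-ESP32 serial link described above can be sketched as a small mapping from the classification result to a one-byte command. The command characters and port name below are assumptions for illustration, not the project's documented protocol.

```python
def command_for(label):
    """Map a classification label to a one-byte serial command.

    'O' (open/sort) and 'N' (no action) are hypothetical command bytes;
    the ESP32 firmware would read the byte and drive the sorting servo.
    """
    return b"O" if label == "bottle" else b"N"

# On the real system, the byte would be written with pyserial, e.g.:
#   import serial
#   with serial.Serial("COM3", 115200, timeout=1) as port:  # port name is an assumption
#       port.write(command_for(label))
```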
Testing and Evaluation:
The system was tested with various waste types to evaluate classification accuracy, response time, and mechanical reliability. Results were compared against target benchmarks to identify areas for improvement.
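The evaluation metrics mentioned above (classification accuracy and response time) could be tabulated from logged test runs along the following lines. The run records and values are hypothetical; the paper does not specify the logging format.

```python
def summarize(runs):
    """Compute classification accuracy and mean response time (seconds)."""
    correct = sum(1 for r in runs if r["predicted"] == r["actual"])
    mean_rt = sum(r["response_s"] for r in runs) / len(runs)
    return correct / len(runs), mean_rt

# Illustrative logged runs, not real experimental data.
runs = [
    {"predicted": "bottle", "actual": "bottle", "response_s": 0.8},
    {"predicted": "bottle", "actual": "bottle", "response_s": 0.7},
    {"predicted": "other",  "actual": "bottle", "response_s": 0.9},
    {"predicted": "other",  "actual": "other",  "response_s": 0.6},
]
accuracy, mean_rt = summarize(runs)  # accuracy 0.75, mean response 0.75 s
```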
The experiments focused on the classification and separation of plastic bottles. A webcam connected to the laptop provided real-time image input to the YOLOv5 model, which identified objects with high confidence. If a plastic bottle was detected, and a specific hand gesture (a rock sign) was identified, the ESP32 microcontroller sent a signal to open the trash can's lid. The initial setup included a single container designated for plastic bottles. The experiments simulated real-world scenarios to test detection reliability, signal communication, and mechanical performance. The implementation is available on GitHub.
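The lid-opening condition described above can be sketched as a check that both a plastic bottle and the rock hand sign appear among the YOLOv5 detections with sufficient confidence. The class names and confidence threshold are assumptions chosen for illustration.

```python
CONF_THRESHOLD = 0.5  # assumed minimum detection confidence

def should_open_lid(detections):
    """Decide whether to open the lid.

    detections: list of (class_name, confidence) tuples from the YOLOv5
    model. The lid opens only when both a bottle and the rock hand sign
    are detected above the confidence threshold.
    """
    seen = {name for name, conf in detections if conf >= CONF_THRESHOLD}
    return "bottle" in seen and "rock" in seen

# A bottle alone does not open the lid; bottle plus the rock sign does.
```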
Furthermore, the envisioned future system features two containers. In this setup, a rotating lid would direct plastic bottles to one container and other waste to the second. The experiments provided insights into the practical requirements for implementing this mechanism, including speed and accuracy improvements for waste classification and servo control.
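The envisioned rotating-lid mechanism amounts to mapping each waste class to a servo angle that tilts the lid toward the corresponding container. The angles and class names below are illustrative assumptions, not measured values from the trial.

```python
# Hypothetical servo angles for the two-container design.
BIN_ANGLES = {
    "bottle": 45,   # rotate lid toward the plastics container
    "other": 135,   # rotate lid toward the general-waste container
}

def lid_angle(label):
    """Servo angle (degrees) for a classified item; unknown classes go to 'other'."""
    return BIN_ANGLES.get(label, BIN_ANGLES["other"])
```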
The trial implementation achieved successful detection and separation of plastic bottles when paired with the specified hand gesture. The YOLOv5 model demonstrated high confidence rates for plastic bottle detection, and the ESP32 reliably controlled the servo mechanism to open the lid. However, challenges were observed, such as occasional misclassification under poor lighting conditions and false positives on items outside the training classes.
Preliminary tests of the two-container system design indicate feasibility, with the rotating lid mechanism showing potential for effective operation. Future iterations will focus on addressing detection limitations and optimizing mechanical response to ensure seamless operation in real-world conditions.
The trial implementation of EcoBin demonstrates the feasibility of integrating AI-powered waste sorting with edge devices. While the current system relies on a laptop for running the YOLOv5 model, this setup serves as a step towards fully embedded solutions. The insights gained from this trial will inform future development, focusing on deploying AI models directly on microcontrollers such as the STM32, which offers sufficient processing power to run YOLOv3, paired with compact camera modules. EcoBin’s modular and scalable design holds promise for addressing waste management challenges, enhancing recycling efforts, and contributing to a sustainable future.