Research
Fuzhou University 28th Undergraduate Scientific Research Training Program (SRTP) Innovative Training Project
“Smart Electronic Waste Community Recycling Station”
- Project Number: 28340
- College: Maynooth International Engineering College
- First-Level Discipline: Computer Science and Technology
- Second-Level Discipline: Artificial Intelligence
- Project Period: From May 2022 to May 2023
- Project Members: Yixin Chen, Miaolan Zhou, Zhipeng Wang, Yixuan Gao
- Instructor: Associate Professor Hui Qian
Project Description
This self-proposed project addresses the high labor costs and low sorting efficiency of electronic waste recycling by automatically identifying and valuing electronic waste.
As the project leader, I applied for this school-level SRTP project and took on its main responsibilities. We used the MindSpore framework to implement CNN-based image recognition and classification, achieving over 98% accuracy in identifying items such as mobile phones and keyboards. We also implemented mobile phone price prediction based on the LightGBM algorithm, optimizing its hyperparameters with random search and cross-validation (minimal sketches of both components follow the metrics below). The results showed significant improvements over previous models, with the following evaluation metrics:
- Percentage Error: 5.46%
- MAE (Mean Absolute Error): 103.87
- MSE (Mean Squared Error): 41224.03
- RMSE (Root Mean Squared Error): 203.04
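As an illustration of the classification component, below is a minimal MindSpore sketch of a small CNN in the spirit of the one described above. The layer sizes, class count, and 224x224 input assumption are illustrative, not the project's actual architecture.

```python
import mindspore.nn as nn

class WasteClassifier(nn.Cell):
    """Small CNN for e-waste image classification.

    Layer sizes and the 224x224 RGB input are illustrative assumptions.
    """
    def __init__(self, num_classes: int = 6):
        super().__init__()
        self.features = nn.SequentialCell([
            nn.Conv2d(3, 32, 3, pad_mode='same'),   # 224x224x3  -> 224x224x32
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2),  # -> 112x112x32
            nn.Conv2d(32, 64, 3, pad_mode='same'),  # -> 112x112x64
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2),  # -> 56x56x64
        ])
        self.flatten = nn.Flatten()
        self.classifier = nn.Dense(64 * 56 * 56, num_classes)

    def construct(self, x):
        return self.classifier(self.flatten(self.features(x)))
```

And here is a hedged sketch of the price-prediction side: a LightGBM regressor tuned with scikit-learn's `RandomizedSearchCV` under 5-fold cross-validation. The search space and fold count are assumptions, and synthetic data stands in for the real phone listings.

```python
import numpy as np
from lightgbm import LGBMRegressor
from sklearn.datasets import make_regression
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import RandomizedSearchCV, train_test_split

# Synthetic stand-in for the real dataset (phone specs -> resale price).
X, y = make_regression(n_samples=2000, n_features=20, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Hypothetical search space; the project's actual ranges are not specified.
param_dist = {
    "num_leaves": [31, 63, 127],
    "learning_rate": [0.01, 0.05, 0.1],
    "n_estimators": [200, 500, 1000],
    "min_child_samples": [10, 20, 40],
}

search = RandomizedSearchCV(
    LGBMRegressor(objective="regression"),
    param_distributions=param_dist,
    n_iter=30,                          # random configurations to evaluate
    cv=5,                               # 5-fold cross-validation
    scoring="neg_mean_absolute_error",
    random_state=42,
)
search.fit(X_train, y_train)

pred = search.best_estimator_.predict(X_test)
mae = mean_absolute_error(y_test, pred)
mse = mean_squared_error(y_test, pred)
rmse = np.sqrt(mse)                     # RMSE = sqrt(MSE), as in the metrics above
print(f"MAE={mae:.2f}  MSE={mse:.2f}  RMSE={rmse:.2f}")
```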
Fuzhou University 2024 Undergraduate Graduation Project
“Research on Hands-Free Method of Virtual Target Operation Based on Eye Gaze”
- College: School of Physics and Information Engineering
- Project Period: From January 2024 to June 2024
- Instructor: Yi Lin
Project Goal
The project aims to develop hands-free target manipulation based on eye movement, offering a natural mode of interaction with fast response times and a strong sense of user agency. The technology eliminates the need for handheld devices, letting users interact with virtual objects in virtual reality (VR) or augmented reality (AR) environments.
Eye-movement-based interaction greatly reduces the usage constraints of mobile scenarios and makes human-computer interaction more convenient and natural. The study focuses on hands-free target manipulation and has practical value for future interaction techniques.
Main Design Content and Features
Users in VR and AR environments currently rely on handheld controllers for target selection and manipulation, which is impractical for some user groups and in situations where holding a device is not feasible. This project develops a hands-free target manipulation method based on eye gaze, using eye tracking for fast, natural pointing.
Key components of the project include:
- Target Operation Method: Combining eye gaze (for rapid pointing) with hand gestures (for confirmation) to position and manipulate virtual targets; see the sketch after this list.
- VR Scene Design: Developing a VR warehouse scene for testing the target operation method and collecting user operation data.
- Comparison Study: Comparing the eye gaze method with other hands-free operations and traditional controller-based methods.
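To make the gaze-plus-gesture idea concrete, here is a minimal per-frame sketch of one plausible interaction loop: gaze hovers a target, and a pinch gesture grabs or releases it. The `Target`, `InteractionState`, and `step` names, and the pinch-to-grab mapping itself, are illustrative assumptions rather than the project's actual design.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Target:
    name: str

@dataclass
class InteractionState:
    hovered: Optional[Target] = None   # target currently under the gaze ray
    grabbed: Optional[Target] = None   # target currently held via pinch

def step(state: InteractionState,
         gazed: Optional[Target],
         pinch_active: bool) -> InteractionState:
    """One frame of gaze + gesture manipulation.

    Eye gaze does the fast, hands-free pointing (hover); the pinch gesture
    provides the deliberate confirmation that grabs or releases the target.
    """
    state.hovered = gazed
    if pinch_active and state.grabbed is None and state.hovered is not None:
        state.grabbed = state.hovered      # pinch while gazing: grab the target
    elif not pinch_active:
        state.grabbed = None               # pinch released: drop the target
    return state

# Example frame sequence: gaze at a crate, pinch to grab, release to drop.
crate = Target("crate")
s = InteractionState()
s = step(s, gazed=crate, pinch_active=False)   # hover only
s = step(s, gazed=crate, pinch_active=True)    # grab
assert s.grabbed is crate
s = step(s, gazed=None, pinch_active=False)    # release
assert s.grabbed is None
```

A dwell-time trigger (selecting after sustained gaze) would be an alternative confirmation mechanism; the comparison study above is where such variants would be evaluated against controller-based baselines.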