Magic Hand
A meeting assistant that turns intuitive gestures into mouse input.
Abstract
We developed a meeting assistant tool that lets the presenter control the cursor with intuitive arm movements. It supports content highlighting and laser-pointer manipulation. In addition, we built an auxiliary cube that enriches interaction among the participants.

Details & Links
This is the link to a demonstration of several functions. We used YOLOv5 to capture the presenter's body position (Fig. 2) and obtained a depth map from the RealSense camera mounted on the projector (Fig. 3).
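The depth lookup at a detected body point can be sketched as below. This is a minimal illustration, not the project's actual code: the function name `wrist_depth` and the median-window strategy are assumptions, and a synthetic NumPy array stands in for a real RealSense depth frame.

```python
import numpy as np

def wrist_depth(depth_map: np.ndarray, x: int, y: int, k: int = 3) -> float:
    """Median depth (metres) in a k-by-k window around the detected pixel.

    A small window makes the reading robust to single-pixel noise and to
    holes in the depth frame, which RealSense reports as 0.
    """
    h, w = depth_map.shape
    y0, y1 = max(0, y - k // 2), min(h, y + k // 2 + 1)
    x0, x1 = max(0, x - k // 2), min(w, x + k // 2 + 1)
    window = depth_map[y0:y1, x0:x1]
    valid = window[window > 0]          # drop invalid (zero) depth pixels
    return float(np.median(valid)) if valid.size else 0.0

# Synthetic 480x640 depth frame: arm at ~1.5 m, with one depth hole.
frame = np.full((480, 640), 1.5, dtype=np.float32)
frame[240, 320] = 0.0                   # simulated invalid pixel
print(wrist_depth(frame, 320, 240))     # -> 1.5
```

Taking a median over a small window instead of a single pixel is a common way to keep one dropped depth sample from making the cursor jump.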



We apply several filters, such as a moving-average filter, to stabilize the output. We also expand the control region from the keypoints captured by the YOLO model to the whole forearm area. As for the cube, we use an IMU and vibration modules for interactive tasks, such as picking a participant by triggering that cube's vibration motor, or holding an anonymous vote by having participants place different faces down (see Fig. 4).
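The two ideas above, smoothing the cursor and reading the cube's orientation, can be sketched as follows. This is a simplified illustration under stated assumptions: the window size, the face ordering, and the use of the raw accelerometer gravity vector are choices of this sketch, not details from the project.

```python
from collections import deque

class MovingAverage:
    """Fixed-window moving-average filter for cursor coordinates."""
    def __init__(self, window: int = 5):
        self.xs = deque(maxlen=window)
        self.ys = deque(maxlen=window)

    def update(self, x: float, y: float) -> tuple[float, float]:
        self.xs.append(x)
        self.ys.append(y)
        return sum(self.xs) / len(self.xs), sum(self.ys) / len(self.ys)

def face_down(accel: tuple[float, float, float]) -> int:
    """Index (0-5) of the cube face pointing down, from the IMU's gravity
    vector: the axis with the largest magnitude wins, and its sign picks
    between the two opposite faces. The (-X,+X,-Y,+Y,-Z,+Z) ordering is
    an assumption of this sketch."""
    axis = max(range(3), key=lambda i: abs(accel[i]))
    return axis * 2 + (1 if accel[axis] > 0 else 0)

f = MovingAverage(window=3)
f.update(100, 100)
f.update(110, 104)
print(f.update(120, 108))            # -> (110.0, 104.0)
print(face_down((0.1, -0.2, 9.8)))   # gravity along +Z -> face 5
```

The filter trades a little latency for stability: a larger window gives a steadier pointer but makes it lag behind fast arm movements.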


We use an NXP MCU board with an embedded large touch screen as the control panel for switching the pointer's function and sending tasks to the cubes (see Fig. 5).
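A panel-to-cube task could be encoded as a small fixed-size frame. The wire format below is entirely hypothetical, invented for this sketch (the project does not specify its protocol); it only illustrates how the panel might address one cube with a task code and argument.

```python
import struct

# Hypothetical 4-byte task frame: uint8 cube id, uint8 task code,
# uint16 argument, little-endian. Assumed task codes for this sketch:
# 1 = vibrate (arg: duration in ms), 2 = start vote (arg: vote id).
TASK_FMT = "<BBH"

def encode_task(cube_id: int, task: int, arg: int) -> bytes:
    return struct.pack(TASK_FMT, cube_id, task, arg)

def decode_task(frame: bytes) -> tuple[int, int, int]:
    return struct.unpack(TASK_FMT, frame)

msg = encode_task(cube_id=3, task=1, arg=500)   # vibrate cube 3 for 500 ms
print(decode_task(msg))                          # -> (3, 1, 500)
```

A fixed-size binary frame like this is easy to parse on a small MCU, since the receiver can read exactly four bytes and unpack them without any dynamic allocation.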
