Magic Hand

Meeting assistance that turns intuitive gesture into mouse input.

Abstract

We developed a meeting assistant tool that enables a presenter to control the cursor with intuitive arm movements. We implemented content highlighting and laser-pointer manipulation. In addition, we developed an auxiliary cube that enhances interaction among participants.

Fig. 1. Components of our project.

This is the link to a demonstration of several functions. We used YOLO v5 to capture body pose, as shown in Fig. 2, and obtained the depth map from the RealSense mounted on the projector, as shown in Fig. 3.

Fig. 2. The body pose is captured by the YOLO v5 model.
Fig. 3. The depth map is captured by the RealSense installed on the projector. Left: schematic figure. Right: raw captured depth map.
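To illustrate how the pose keypoints and the depth map can be combined into a cursor position, here is a minimal sketch. The function name, the use of a 3x3 homography to map camera pixels to screen pixels, and the depth lookup are assumptions for illustration, not the project's actual implementation.

```python
import numpy as np

def cursor_from_keypoint(wrist_xy, depth_map, homography):
    """Map a detected wrist pixel to a screen coordinate.

    wrist_xy:   (u, v) pixel from the pose model (hypothetical output format).
    depth_map:  HxW array of depths from the RealSense, in meters.
    homography: assumed 3x3 matrix mapping camera pixels to screen pixels.
    """
    u, v = int(round(wrist_xy[0])), int(round(wrist_xy[1]))
    depth = depth_map[v, u]  # depth at the wrist, e.g. to reject far-away people

    # Project the pixel through the homography and normalize.
    p = homography @ np.array([wrist_xy[0], wrist_xy[1], 1.0])
    screen = p[:2] / p[2]
    return screen, depth
```

With an identity homography the screen coordinate is simply the camera pixel; a real calibration would map the camera view onto the projected surface.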

We apply several filters, such as a moving average filter, to stabilize the output. We also expand the control region from the keypoints captured by the YOLO model to the area of the whole forearm. As for the cube, we use an IMU and vibration modules to support interactive tasks, such as picking a participant by enabling the vibration motor or anonymous voting by placing different faces down, as shown in Fig. 4.
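The smoothing and forearm-expansion steps above can be sketched as follows. The window size and the number of interpolated forearm points are illustrative choices, not values from the project.

```python
from collections import deque

class MovingAverage:
    """Smooth noisy 2-D keypoints over a sliding window."""

    def __init__(self, window=5):
        self.buf = deque(maxlen=window)  # oldest samples drop out automatically

    def update(self, point):
        self.buf.append(point)
        n = len(self.buf)
        return (sum(p[0] for p in self.buf) / n,
                sum(p[1] for p in self.buf) / n)

def forearm_points(elbow, wrist, n=8):
    """Expand the control region from two keypoints to points along the forearm
    by linear interpolation between the elbow and wrist."""
    return [(elbow[0] + (wrist[0] - elbow[0]) * t / (n - 1),
             elbow[1] + (wrist[1] - elbow[1]) * t / (n - 1))
            for t in range(n)]
```

The moving average trades a little latency for stability; each interpolated forearm point can then be tested against the highlighted content region.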

Fig. 4. Functions of the cube.
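For the anonymous-voting function, the face currently pointing down can be inferred from the IMU's accelerometer, since gravity dominates one axis when the cube is at rest. The face numbering below is a hypothetical convention; the project's actual mapping is not specified.

```python
def face_down(accel):
    """Return which cube face points down from a 3-axis accelerometer reading.

    accel: (ax, ay, az) in g units; at rest, gravity dominates one axis.
    Faces are numbered hypothetically: +x/-x -> 1/2, +y/-y -> 3/4, +z/-z -> 5/6.
    """
    ax, ay, az = accel
    mags = [abs(ax), abs(ay), abs(az)]
    axis = mags.index(max(mags))          # axis most aligned with gravity
    positive = (ax, ay, az)[axis] > 0     # sign tells which of the two faces
    face_table = {(0, True): 1, (0, False): 2,
                  (1, True): 3, (1, False): 4,
                  (2, True): 5, (2, False): 6}
    return face_table[(axis, positive)]
```

A real implementation would also debounce the reading, waiting until the acceleration vector is stable before registering a vote.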

We use an NXP MCU board with an embedded large touch screen as the control panel; it changes the pointer's function and sends tasks to the cubes, as shown in Fig. 5.

Fig. 5. Functions of the NXP board.
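As a sketch of how the control panel might dispatch a task such as picking a participant, the snippet below selects a cube at random and builds a command for it. The JSON message format and the `cube_ids` identifiers are pure assumptions; the actual protocol between the NXP board and the cubes is not described in this document.

```python
import json
import random

def pick_participant(cube_ids, rng=random):
    """Pick one cube at random and build a hypothetical 'vibrate' command.

    cube_ids: list of identifiers for the participants' cubes (assumed).
    The JSON payload here is an illustrative message format, not the
    project's actual wire protocol.
    """
    chosen = rng.choice(cube_ids)
    command = json.dumps({"target": chosen, "action": "vibrate"})
    return chosen, command
```

The panel would then transmit the serialized command over whatever link the cubes use, and the chosen cube would enable its vibration motor.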