Developers can now integrate the accessibility feature into their apps, allowing users to control the cursor with facial gestures or by moving their heads. For example, they can open their mouth to move the cursor or raise their eyebrows to click and drag.
First announced for desktop at last year’s Google I/O, Project Gameface uses the device’s camera and a database of facial expressions from MediaPipe’s Face Landmarks Detection API to manipulate the cursor.
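To illustrate the general idea, here is a minimal sketch of reading facial gestures from a camera frame with MediaPipe’s Face Landmarker task in Python. The model path, blendshape thresholds, and the mapping of gestures to cursor actions below are illustrative assumptions, not Gameface’s actual configuration or code.

```python
# Hypothetical sketch: detect "mouth open" and "brow raise" gestures from one
# camera frame using MediaPipe's Face Landmarker task. Thresholds and the
# gesture-to-action mapping are assumptions for illustration only.
import mediapipe as mp
from mediapipe.tasks import python as mp_tasks
from mediapipe.tasks.python import vision

# "face_landmarker.task" is the bundled model file; the path is an assumption.
options = vision.FaceLandmarkerOptions(
    base_options=mp_tasks.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,  # blendshape scores describe facial expressions
    num_faces=1,
)
landmarker = vision.FaceLandmarker.create_from_options(options)

# A single frame from the device camera, loaded from disk for simplicity.
frame = mp.Image.create_from_file("frame.png")
result = landmarker.detect(frame)

if result.face_blendshapes:
    # Each blendshape is a Category with a name (e.g. "jawOpen") and a 0-1 score.
    scores = {c.category_name: c.score for c in result.face_blendshapes[0]}
    mouth_open = scores.get("jawOpen", 0.0) > 0.4        # threshold is an assumption
    brows_raised = scores.get("browInnerUp", 0.0) > 0.5  # threshold is an assumption
    if mouth_open:
        print("gesture detected: move cursor")
    if brows_raised:
        print("gesture detected: begin click-and-drag")
```

In a real app these scores would be read continuously from the live camera stream and mapped to pointer movement, with thresholds and gesture choices tuned per user, which is the kind of customization the open-sourced project is meant to enable.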
“Through the device’s…