Google’s open-source AI tool let me play my favorite Dreamcast game with my face


Project Gameface is ready to install as a Windows app that makes gaming more accessible using only your webcam.
The homescreen for Project Gameface is so cheerful. Screenshot by Wes Davis / The Verge

While Wednesday’s Google I/O event largely hyped the company’s biggest AI initiatives, the company also announced updates to MediaPipe, the machine learning suite that powers Google Lens and Google Meet features like object tracking and recognition, gesture control, and of course, facial detection. The newest update enables app developers to, among other things, create Snapchat-like face filters and hand tracking, with the company showing off a GIF that’s definitely not a Memoji.

This update underpins a special project announced during the I/O developer keynote: an open-source accessibility application called Project Gameface, which lets you play games... with your face. During the keynote, Google played a very Wes Anderson-esque mini-documentary revealing a tragedy that prompted the company to design Gameface.

Game streamer Lance Carr, who goes by the name GimpyG and whose rare form of muscular dystrophy has left him unable to use his hands, was in the middle of streaming a Hearthstone session when a fire started in his garage. He was able to vacate before it spread to the rest of his home, but unfortunately, the gear that let him enjoy his favorite pastime — his head-tracking mouse and gaming PC — had to remain behind and was destroyed.

Replacing this stuff isn’t cheap. Head-tracking mouse gear can cost several hundred dollars, and that’s to say nothing of his gaming setup. Google’s new software aims to remove one of those costly barriers.

The company says it worked with Carr to design a piece of software that would allow anyone with a webcam to control their computer with head movement and gestures, all translated to the screen by the Windows-only software.
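To give a sense of how webcam head tracking can drive a cursor, here is a minimal, hypothetical sketch: a tracked face landmark (say, the nose tip) is compared frame to frame, and its displacement is scaled per-axis into a cursor delta, with a small deadzone to suppress jitter. The function name, parameters, and values are all illustrative assumptions, not Gameface’s actual code.

```python
# Illustrative only: map frame-to-frame head movement to a cursor delta.
# prev/curr are (x, y) landmark positions normalized to the [0, 1] range.
def cursor_delta(prev, curr, speed_x=1000.0, speed_y=1000.0, deadzone=0.002):
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    # Ignore tiny movements below the deadzone threshold (webcam jitter)
    if abs(dx) < deadzone:
        dx = 0.0
    if abs(dy) < deadzone:
        dy = 0.0
    # Separate per-axis speeds mirror the idea of tuning each direction
    return (dx * speed_x, dy * speed_y)
```

Per-axis speed settings matter here because small head rotations produce asymmetric on-screen motion, which is presumably why the app exposes a speed for each of the four directions.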

I tested it out, and it’s very cool, even in its nascent state. Installing it isn’t quite as simple as downloading an installation program — it’s on GitHub, after all. You’ll need to download the 0.3.30 release from the right side of the screen, extract the resulting .zip file, and open the app by browsing to its run.exe in a Windows file picker. Easy, see?

Once you’ve opened the app, however, the interface is refreshingly simple. The homescreen presents four options. There’s Camera, where you’ll select your webcam. Cursor Speed lets you customize the mouse speed (in each of four directions), pointer jitter, flicker reduction, and even how long a gesture needs to be held before triggering an action. Two more menus allow you to bind gestures to various mouse and keyboard buttons and other actions, including recentering the mouse, which is extremely necessary because it loses track of the center a lot.
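The hold-before-trigger setting is worth a closer look. One plausible way it could work (a hypothetical sketch — the class and names below are invented for illustration, not taken from Gameface’s source) is a per-binding counter that fires the bound action only after the gesture has persisted for a configurable number of consecutive camera frames:

```python
# Hypothetical sketch of a hold-before-trigger gesture binding.
class GestureBinding:
    def __init__(self, action, hold_frames=5):
        self.action = action            # e.g. "left_click"
        self.hold_frames = hold_frames  # frames the gesture must persist
        self._held = 0

    def update(self, gesture_active):
        """Call once per camera frame; returns the action when it fires."""
        if gesture_active:
            self._held += 1
            if self._held == self.hold_frames:
                return self.action      # fire exactly once per hold
        else:
            self._held = 0              # released early: reset the counter
        return None

# Example: "open mouth" bound to a click, requiring a 3-frame hold
binding = GestureBinding("left_click", hold_frames=3)
events = [binding.update(active)
          for active in [True, True, True, True, False, True]]
```

A threshold like this is what keeps an accidental twitch of the face from registering as a click, at the cost of a little added input latency.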

The configuration options are surprisingly robust. Screenshot: Wes Davis

I knew immediately which game I wanted to try it with: one of my all-time favorites, Rez. Rez is an on-rails shooter that debuted on the Dreamcast in the heady days of the early aughts, and it has just three in-game actions: moving a cursor; firing your weapon; and triggering the oh-crap-panic-shoot-everything special ability. (That the game’s battle to infiltrate an advanced AI is thematically appropriate for Google I/O week somehow didn’t occur to me until just now as I’m writing this.)

It took some tinkering, but after three tries with Gameface’s settings, I was able to dial in the sensitivities for all the gestures, and I dove in. The learning curve wasn’t as steep as I expected! Turning my head to move the reticle around and popping my mouth open to hold down the fire button while I selected my targets was easy (and surprisingly low latency), and recentering the cursor quickly became second nature, maybe because I had to use it so often. I wasn’t nearly as good at it as I am with a traditional keyboard and mouse setup for the Steam version, but I could see getting there.

Gameface is very neat, but currently, it’s also limited. With only six facial gestures to choose from, you run out of inputs quickly — don’t expect to play intense, convoluted FPS games with it anytime soon. But I could see combining it with something like a voice input app to achieve greater control. And who knows? Maybe Google will add more gesture recognition options down the line to give players more ways to play.
