
Ebay’s HeadGaze lets motor-impaired users navigate the site with head movements

Sophisticated head-tracking systems like the one built into the iPhone X may have been intended for AR and security purposes, but they may also turn out to be very useful for people with disabilities. A proof-of-concept app from an eBay intern shows how someone with very little motor function can navigate the site with nothing but head movements.

Muratcan Çiçek is one such person, and relies on assistive technology every day to read, work and get around. This year he was interning at eBay and decided to create a tool that would help people with motor impairments like his shop online. It turns out there are lots of general-purpose tools for accessibility, like letting a user control a cursor with their eyes or a joystick, but nothing made just for navigating a site like eBay or Amazon.

His creation, HeadGaze, relies on the iPhone X’s front-facing sensor array (via ARKit) to track the user’s head movements. Different movements correspond to different actions in a demonstration app that shows the online retailer’s daily deals: navigate through categories and products by tilting your head all the way in various directions, or tilt partway down to buy, save or share.
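The core of that mapping can be sketched in Swift, the language the project itself is written in. This is a minimal illustration, not eBay's actual implementation: the action names and angle thresholds are assumptions, and in the real app the head's pitch and roll would be read each frame from the `ARFaceAnchor` that ARKit derives from the TrueDepth camera.

```swift
import Foundation

// Illustrative sketch only — action set and thresholds are assumptions,
// not taken from the HeadGaze source.
enum HeadAction {
    case navigateLeft, navigateRight, navigateUp, navigateDown  // full tilts
    case select                                                  // partial tilt down (buy, save, share)
    case none
}

// Classify a head pose (in degrees) into a UI action. A full tilt
// (assumed ~25°) navigates between categories and products; a partial
// downward tilt (assumed ~12°) triggers a selection.
func classify(pitchDegrees: Double, rollDegrees: Double) -> HeadAction {
    let full = 25.0
    let partial = 12.0
    if rollDegrees <= -full { return .navigateLeft }
    if rollDegrees >= full { return .navigateRight }
    if pitchDegrees <= -full { return .navigateUp }
    if pitchDegrees >= full { return .navigateDown }
    if pitchDegrees >= partial { return .select }
    return .none
}
```

In practice a function like this would be called from an `ARSessionDelegate` as ARKit updates the face anchor, with some smoothing and dwell time so that incidental head motion doesn't trigger actions.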


This isn’t some huge revolution in interface design — a few apps and services do something similar, though perhaps not in as straightforward and extensible a way as this.

But it’s easy to underestimate the cognitive load created when someone has to navigate a UI that’s designed around senses or limbs they don’t have. To create something like this isn’t necessarily simple, but it’s useful and relatively straightforward, and the benefits to a person like Çiçek are substantial.

That’s probably why he made the HeadGaze project open source — you can get all the code and documentation on GitHub. It’s all in Swift and currently only works on the iPhone X, but it’s a start.

Considering this was a summer project by an intern, there’s not much of an excuse for companies with thousands of developers to not have something like it available for their apps or storefronts. And it’s not like you couldn’t think of other ways to use it. As Çiçek writes:

HeadGaze enables you to scroll and interact on your phone with only subtle head movements. Think of all the ways that this could be brought to life. Tired of trying to scroll through a recipe on your phone screen with greasy fingers while cooking? Too messy to follow the how-to manual on your cell phone while you’re tinkering with the car engine under the hood? Too cold to remove your gloves to use your phone?

He and his colleagues are also looking into actual gaze-tracking to augment the head movements, but that’s still a ways off. Maybe you can help.


Source: TechCrunch

Belen De Leon