ELECTRONICS
LeapPad Inventor Wants to Turn Your Eyes Into a Mouse

HIGHLIGHTS

  • Eye-based technology is key to improving and popularizing VR technology
  • System designed to help alleviate the nausea experienced by some VR users
  • Eyefluence is currently hammering out licensing deals with headset makers

First came the computer mouse. Then the touchscreen. Now the tech industry is looking for a new human-machine interface: this time, one that will make virtual-reality headsets as mainstream as personal computers and smartphones.

The man who invented the LeapPad tablet for kids is betting the killer app is right in front of your face: the eyes.

Jim Marggraff’s startup, Eyefluence, has developed technology that knows where people are looking and lets them manipulate objects the way we do now by clicking a mouse or tapping an icon. Besides fostering a more natural and immersive experience, the system is designed to help alleviate the nausea experienced by some VR users and enhance security with iris scans.

Marggraff says most of the big headset makers have expressed interest in licensing the technology from his Milpitas, California, company. Motorola Solutions, a leading investor, is testing the technology for emergency responders and sees possibilities in mining and medicine. “You’re able to basically interact in the virtual world simply by looking at what you want to interact with,” says X Prize Foundation founder Peter Diamandis, who advises Eyefluence and has seen the technology.

Most VR headsets use some sort of handheld device (like a game controller) or head movements to navigate. These techniques are far from ideal, requiring physical movement that quickly gets tiring, and industry heavyweights including Facebook chief Mark Zuckerberg agree that some form of eye-based technology is key to improving and popularizing VR. Gaming enthusiasts also say eye-tracking would significantly improve the playing experience by making it easier to follow objects and interact with characters.

Already, a host of companies with names like SensoMotoric Instruments, Tobii and The Eye Tribe are working on eye-tracking devices that could plug into a range of headsets. Another company called Fove says it will soon have the first headset featuring a version of the technology. But most of these are limited to scanning the iris for security purposes and recognizing where the user is looking: pointing, but not clicking.

Eyefluence takes it one step further, not only using the eyes as a cursor, but letting them select, zoom and pan: things now accomplished by clicking and double-clicking a mouse or tapping and pinching a touchscreen.

The eyes are the body’s fastest-moving organ, capable of shifting at up to 900 degrees per second, which makes Eyefluence’s interface software a much faster way to tell a computer what to do. The technology collapses a two- or even three-step process (look at an object, move a hand to it, then tap or click) into a single one: simply look. “It almost feels magical, like the system knows what you want before you tell it,” Diamandis says. “It’s almost like it’s reading your mind.”

When Marggraff announced plans to turn the human eye into a computer mouse, skeptics said it couldn’t be done. “People told me, ‘Don’t bother: you consume information with your eyes, but if you try to simultaneously use your eyes to control things, there will be a collision between controlling and directing and consuming,’” he recalls.

But Marggraff has a deserved reputation for knowing a good idea when he sees one. He came up with the LeapPad even before the iPod existed. Released in 1999, the device helped kids learn how to read, was for a time the most popular toy in the US and generated more than $1 billion (roughly Rs. 6,717 crores) in revenue within five years.

Marggraff, 58, also created the Livescribe, one of the better-known smart pens that record audio and sync it to written notes. Eyefluence co-founder Dave Stiehr, 57, helped build several medical-device companies, including one that sold automatic external defibrillators: gadgets that diagnose and correct uneven heartbeats and are now standard equipment in many movie theaters, airplanes and offices.

In 2012, Marggraff bought all the assets of Eye-Com, a research company led by neurologist William Torch and funded by the National Institutes of Health, the Department of Transportation and the Department of Defense. Torch researched everything from fatigue to blinking and had accumulated more than a decade’s worth of data on eyes as well as built technology to track and analyze the organ. Marggraff and his team used that foundation to create an interactive system that uses looks to direct a computer.

It’s not as easy as it sounds. Eyefluence’s team had to design a way for the eyes to communicate a person’s intentions to the machine that feels natural. Looking at an object for a set amount of time to “click” it, for instance, wouldn’t work: it’s too taxing on the eyes, and it demands too much active thinking, since the user has to remember to hold a stare long enough to select something, or avoid lingering too long and selecting it by accident. A successful eye-interaction method would have to be just as intuitive as moving a mouse or tapping an app.
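To see why, here is a minimal sketch of a naive dwell-time (“stare to click”) selector of the kind described above. The function, the sample format and the 0.8-second threshold are hypothetical illustrations, not anything Eyefluence has published.

```python
# Illustrative only: a naive dwell-time ("stare to click") selector.
# The names and the 0.8-second threshold are hypothetical assumptions.

DWELL_SECONDS = 0.8  # assumed dwell threshold; staring this long triggers a "click"

def dwell_select(gaze_samples, dwell_seconds=DWELL_SECONDS):
    """gaze_samples: iterable of (timestamp_s, target_id or None) pairs.
    Returns the first target the user stares at for dwell_seconds, else None."""
    current_target, dwell_start = None, None
    for t, target in gaze_samples:
        if target != current_target:          # gaze moved to a new target (or away)
            current_target, dwell_start = target, t
        elif target is not None and t - dwell_start >= dwell_seconds:
            return target                     # held long enough: treat it as a click
    return None

# Example: the user glances at icon_a, looks away, then holds icon_b long enough to select it.
samples = [(0.0, "icon_a"), (0.3, "icon_a"), (0.5, None), (0.6, "icon_b"), (1.5, "icon_b")]
print(dwell_select(samples))  # -> "icon_b"
```

The drawback the article describes falls out of this directly: every selection requires the user to consciously manage how long they stare, which is exactly the burden a natural interface is supposed to remove.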

In the end, Marggraff solved the problem by embracing aspects of how the human eye works that he’d originally seen as a limitation. Even though it feels like our eyes are providing a constant stream of visual information, there are actually interruptions to the feed when we move our eyes to look at or examine something else.

In those moments, we are essentially blind and the brain fills in the missing images. Using these kinds of movements as a foundation, Eyefluence built an eye-machine interface that acts on intentional looks and ignores incidental ones. The company declined to explain further how the software works, citing concerns about maintaining a competitive edge.
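Eyefluence won’t say how its software separates intentional looks from incidental ones, but a common baseline in eye-tracking research is velocity-threshold classification: gaze samples moving faster than a few tens of degrees per second are treated as saccades (the “blind” transitions described above) and ignored, while slower runs are grouped into fixations an interface could act on. The sketch below is a minimal illustration of that generic idea under an assumed 30-degree-per-second threshold; it is not Eyefluence’s method.

```python
# Illustrative velocity-threshold fixation classifier; not Eyefluence's actual method.
# Samples faster than SACCADE_DEG_PER_S are treated as saccades and dropped;
# slower runs are grouped into fixations that an interface could act on.
import math

SACCADE_DEG_PER_S = 30.0  # assumed threshold; saccades can peak near 900 deg/s

def classify_fixations(samples):
    """samples: list of (timestamp_s, x_deg, y_deg) gaze points in degrees of visual angle.
    Returns fixations as (start_s, end_s, mean_x_deg, mean_y_deg) tuples."""
    if not samples:
        return []
    fixations, run = [], [samples[0]]
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        velocity = math.hypot(x1 - x0, y1 - y0) / dt if dt > 0 else float("inf")
        if velocity < SACCADE_DEG_PER_S:      # slow movement: still part of the current fixation
            run.append((t1, x1, y1))
        else:                                 # fast movement: a saccade, so close out the run
            if len(run) > 1:
                fixations.append(_summarize(run))
            run = [(t1, x1, y1)]
    if len(run) > 1:
        fixations.append(_summarize(run))
    return fixations

def _summarize(run):
    ts, xs, ys = zip(*run)
    return (min(ts), max(ts), sum(xs) / len(xs), sum(ys) / len(ys))
```

An interface built on top of something like this would only ever see the fixations, which is one way to act on where a person deliberately looks while ignoring the rapid movements in between.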

Eyefluence is currently hammering out licensing deals with various headset makers to include its hardware and software in their devices, though the company declined to discuss specifics, citing non-disclosure agreements. Marggraff did say that the company is working with the “major players” and has been “overloaded with the strong amount of business partnerships and interest.”

Motorola Solutions, which led Eyefluence’s most recent fundraising round of $14 million, has been trying out the technology for more practical applications. The company, which sells equipment to first responders and other public safety and government agencies, sees Eyefluence as essential to providing police and firefighters with information via headset in a hands-free way that won’t interfere with their jobs.

“The more stress you’re under, the less cognitive ability you have to devote to menial tasks, so when you need the tech the most, that’s when you have the least capacity to extract what you need,” says Chief Technology Officer Paul Steinberg. “By using very simple and quick interactions, we can lessen the cognitive requirements to use the technology.”

Steinberg also foresees using Eyefluence’s technology in headsets for industries such as mining, where drones will probably become ubiquitous. To test the eye-tracking and interface, Motorola Solutions built a virtual command center for emergency responders using Facebook’s Oculus Rift headset, in which users can see camera feeds from field operatives, control what they’re looking at, and zoom in and out using their eyes. The next iteration could use Eyefluence technology for gaze-matching, so that a person in the command center could not only follow where field operatives are looking, but also direct them to check out other things.

The learning curve was almost nonexistent, which surprised Steinberg, who had assumed that using looks to manipulate things would present a challenge. He also says users generally retained what they’d learned. However, Motorola Solutions probably won’t have a product ready next year, largely because headsets need to improve first.

Diamandis predicts Eyefluence technology will popularize VR the way the mouse once did the personal computer: “Everyone who’s seen it has completely gotten the power.”

© 2016 Bloomberg L.P.

Tags: Eyefluence, VR Headset, Virtual Reality, VR, Wearables, Gaming, PC, Laptop

 
