We may be able to control our smart homes with just our eyes in the future.

Written by Hassan Zaka

Voice control isn’t always sufficient. Imagine being able to turn up the heat, dim the lights, and play your Ed Sheeran or Dua Lipa playlist with only your eyes and a flick of your wrist.

Perhaps one day. EyeMU, a gaze-tracking technology developed at Carnegie Mellon University in Pittsburgh, lets users control smartphone apps, including streaming music services, with their eyes and simple hand gestures. There's no need to touch the screen.

To enable commands, the Future Interfaces Group, part of the school's Human-Computer Interaction Institute, paired a gaze predictor with a smartphone's motion sensors.

In other words, look at a notification to lock onto it, then flick the phone left to dismiss it or right to respond. You can also pull the phone closer to enlarge a photo, or move it farther away to disengage gaze control. All of this leaves one hand free for other tasks, such as sipping a cappuccino. A rough sketch of how the two signals might be combined appears below.
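To make the pairing concrete, here is a minimal Python sketch of the idea described above: gaze alone only selects a target, and a second signal from the motion sensors (a flick) is required to act on it. The class, dwell time, and acceleration threshold are illustrative assumptions, not the researchers' implementation.

```python
# Minimal sketch (not the actual EyeMU code): gaze selects, a flick acts.
# Names, thresholds, and the sensor interface are illustrative assumptions.

from dataclasses import dataclass
import time

@dataclass
class GazeSample:
    x: float        # estimated gaze point on screen, in pixels
    y: float
    timestamp: float

class GazePlusGesture:
    """Lock onto the widget under a steady gaze, then let a phone flick act on it."""

    DWELL_SECONDS = 0.5      # how long the gaze must stay on one target to "lock in"
    FLICK_THRESHOLD = 6.0    # lateral acceleration (m/s^2) counted as a flick

    def __init__(self, hit_test):
        self.hit_test = hit_test          # maps (x, y) -> widget id or None
        self._candidate = None
        self._since = 0.0
        self.locked_target = None

    def on_gaze(self, sample: GazeSample):
        target = self.hit_test(sample.x, sample.y)
        if target != self._candidate:
            self._candidate, self._since = target, sample.timestamp
        elif target is not None and sample.timestamp - self._since >= self.DWELL_SECONDS:
            self.locked_target = target   # gaze alone only selects, never acts

    def on_accelerometer(self, ax: float):
        """A sharp lateral flick dismisses (left) or responds to (right) the target."""
        if self.locked_target is None:
            return None
        if ax <= -self.FLICK_THRESHOLD:
            return ("dismiss", self.locked_target)
        if ax >= self.FLICK_THRESHOLD:
            return ("respond", self.locked_target)
        return None

# Example: dwelling on a notification and then flicking left dismisses it.
ui = GazePlusGesture(hit_test=lambda x, y: "notification" if y < 200 else None)
now = time.time()
ui.on_gaze(GazeSample(120, 80, now))
ui.on_gaze(GazeSample(124, 82, now + 0.6))   # same target for >0.5 s -> locked
print(ui.on_accelerometer(-8.0))             # ('dismiss', 'notification')
```

The key design point is that neither input alone triggers an action, which is exactly why adding the second modality matters.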

Google's own effort, Look to Speak, a free app recently featured in an Oscars commercial, is an eyes-only technology designed for people with disabilities. Downloading the Android-only app is a quick way to see what a difference EyeMU's simple hand gestures could make.

“Big tech firms like Google and Apple have gotten fairly close with gaze prediction,” says Chris Harrison, head of the Future Interfaces Group, “but simply glancing at something doesn’t get you there. The significant breakthrough in this study is the addition of a second modality, such as flicking the phone left or right, in conjunction with gaze prediction. That is what makes it so effective. In retrospect, it all seems so obvious.”

Getting gaze analysis and prediction accurate enough to control a smartphone has proven difficult. As an alternative to commercial eye-tracking systems, Andy Kong, a senior computer science major at Carnegie Mellon, wrote software that uses a laptop's camera to track the user's eyes and move the on-screen cursor. EyeMU was built on that foundation.
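For a rough sense of how a webcam can drive a cursor, here is a minimal Python sketch using OpenCV and pyautogui. It is not Kong's software: it simply maps the detected eye's position in the camera frame to a screen coordinate, rather than estimating where the eye is actually looking, which is the harder problem a real gaze predictor solves.

```python
# Minimal sketch (not Kong's software): move the cursor toward the eye's
# position in the webcam frame. The Haar-cascade detector and the linear
# frame-to-screen mapping are simplifying assumptions.

import cv2
import pyautogui

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
screen_w, screen_h = pyautogui.size()

cap = cv2.VideoCapture(0)
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(eyes) > 0:
            x, y, w, h = eyes[0]
            # Map the eye's center in the camera frame to a screen coordinate.
            cx = (x + w / 2) / frame.shape[1] * screen_w
            cy = (y + h / 2) / frame.shape[0] * screen_h
            pyautogui.moveTo(cx, cy, duration=0.05)
        cv2.imshow("camera", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    cap.release()
    cv2.destroyAllWindows()
```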

“Right now, phones only respond when we ask them for something, whether through speech, taps, or button presses,” Kong explains. “Given how widely phones are used now, imagine how much more useful they would be if we could predict what the user wanted by analyzing gaze or other biometrics.”
