Using your thumbs is so last year.
There’s a new way to interact with your iPhone, and all it takes is your eyes.
While attending Apple’s WWDC as a scholarship student, Matt Moss had a chance to play around with the iOS 12 developer beta. In the process, he came to a neat realization: ARKit 2.0 opened up some unexpected possibilities.
“I saw that ARKit 2 introduced eye tracking and quickly wondered if it’s precise enough to determine where on the screen a user is looking,” he explained over Twitter direct message. “Initially, I started to build the demo to see if this level of eye tracking was even possible.”
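For the curious: the eye-tracking data Moss is referring to is exposed through ARKit's `ARFaceAnchor`, which on iOS 12 gained a `lookAtPoint` property along with per-eye transforms. A minimal sketch of reading that data might look like the following (this is not Moss's code, just an illustration of the API; mapping `lookAtPoint` onto actual screen coordinates requires projecting through the session's camera and is omitted here):

```swift
import ARKit
import SceneKit

// Minimal sketch: read ARKit 2 eye-tracking data from face anchor updates.
// Requires a device with a TrueDepth camera running an
// ARFaceTrackingConfiguration session.
class EyeTrackingDelegate: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // The point the eyes converge on, in face-anchor coordinate space.
        let lookAt = faceAnchor.lookAtPoint

        // Individual eye poses, relative to the face anchor.
        let leftEye = faceAnchor.leftEyeTransform
        let rightEye = faceAnchor.rightEyeTransform

        print("lookAtPoint: \(lookAt)")
        print("left eye: \(leftEye), right eye: \(rightEye)")
    }
}
```

Turning that anchor-space point into an on-screen gaze position, precisely enough to drive a UI, is the part Moss set out to test.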