DURHAM – Maria Gorlatova, the Nortel Networks Assistant Professor of Electrical and Computer Engineering at Duke University, has been awarded a prestigious National Science Foundation Faculty Early Career Development (CAREER) Award. The award supports outstanding young faculty members in their efforts to build a successful research enterprise.

For the next five years, the $550,000 award will support Gorlatova’s innovative work that uses nearby Internet of Things electronics to improve the performance of augmented reality (AR) devices. The research will lay the foundation for creating new experiences and abilities for AR ranging from smart homes to the operating room.

Maria Gorlatova is laying the foundations for creating new experiences and abilities for augmented reality devices

“Augmented reality can be generated with mobile phones or headsets like a HoloLens or Magic Leap, but there are currently all sorts of challenges no matter the platform,” said Gorlatova. “For example, virtual content does not stay in place like real objects, batteries drain too quickly or the devices can’t keep up computationally. We’re proposing to use external devices already in the environment to improve these issues.”

Anyone who has ever played Pokémon Go or bought new furniture with a visualization app is familiar with the concept of AR. Pokémon Go overlays characters to collect on a device’s screen to make it look like Pikachu is standing on a street corner. Several furniture companies employ visualization apps that make it look like a new couch is already sitting in your living room’s empty corner.

But these simple visualizations only scratch the surface of what AR is capable of. Imagine having virtual artwork displayed on your walls to match your mood or to help cheer you up. Distracting objects could be covered up with a hologram while you work. When cooking a new dish, AR could highlight the next ingredient or object you’ll need in the process. AR could also help improve a home’s safety. For example, when your toddler is toddling around, an AR headset could warn of dangerous objects like a hot frying pan handle sticking out.

At an art museum, imagine seeing ancient Greek statues in the vibrant colors they were originally painted in. In the medical field, imagine a surgeon about to remove a brain tumor who can “see” exactly where to cut thanks to a map created by a CAT scan virtually overlaid on the patient’s body in real time.

All of these concepts are close to becoming reality, and Gorlatova wants to give their development a shot in the arm.

“We can take advantage of devices that are already generating some awareness so that AR doesn’t have to do everything,” said Gorlatova.

For example, Gorlatova points to robotic vacuum cleaners and smart lights. A robotic vacuum cleaner has already mapped out a home’s floor space so that it can complete its daily routine, so an AR device could build from that data rather than starting from scratch. All AR devices work best with well-lit spaces, both so that they can read the room accurately and so that their displays show up vividly. So why not tie the AR device into nearby smart lights to create the perfect lighting conditions?

User preferences are also important, of course; ultimately, AR is about helping users, who have different goals and needs. Gorlatova’s research will focus not just on how to integrate data generated by existing smart devices into an AR experience, but on how to optimize that experience based on each user’s preferences and activities.

“To get to these ideal experiences, we first have to lay the foundational work, which is truly what this project is aimed toward,” said Gorlatova. “We’ll be examining how to let AR devices and users both have some control over some of the environment, creating new metrics of user experience and device performance, and converting those to quantifiable improvements. We want to create a new understanding of AR systems so that we can make them significantly more reliable than they are now.”

(C) Duke University