Walking About in Digital Twins Thanks to Augmented Reality


A new project seeks to amplify the benefits of digital twins by giving users the ability to roam around via augmented reality.

Imagine walking around in your house to see where you may have left your keys – from 3,000 miles away. Significant progress is being made toward a future in which people can don lightweight augmented reality (AR) glasses that employ artificial intelligence to explore digital twins of rooms, buildings, systems, and everyday objects.

To this end, the team behind Meta's Project Aria announced the development of Aria Digital Twin, an “egocentric” dataset captured using Aria glasses, with extensive simulated ground truth for devices, objects, and the environment.

This dataset accelerates research into a number of challenges, including 3D object detection and tracking, scene reconstruction and understanding, sim-to-real learning, and human pose prediction, while also inspiring new machine perception tasks for AR applications.

An egocentric dataset captures and maps the world from the first-person perspective of a single user. Project Aria is a research project out of Meta that developed AR glasses designed to help researchers gather information from the wearer’s perspective, contributing to the advancement of egocentric research in machine perception and AR.

See also: Digital Twins, IT/OT Convergence Drive the Industrial Internet

One of the first egocentric AR projects was announced in 2021, when the University of Bristol, as part of an international consortium of 13 universities in partnership with Facebook AI, collaborated to advance egocentric perception. Bristol researchers built what was then the world’s largest egocentric dataset using off-the-shelf, head-mounted cameras.

“Progress in the fields of artificial intelligence and augmented reality requires learning from the same data humans process to perceive the world,” according to a statement from the university. “Our eyes allow us to explore places, understand people, manipulate objects and enjoy activities — from the mundane act of opening a door to the exciting interaction of a game of football with friends.”

The ambitious project was inspired by the University of Bristol’s successful EPIC-KITCHENS dataset, which recorded the daily kitchen activities of participants in their homes. “In the not-too-distant future you could be wearing smart AR glasses that guide you through a recipe or how to fix your bike – they could even remind you where you left your keys,” said Dima Damen, a professor at the University of Bristol.

In its latest demonstration project, the Aria Digital Twin (ADT) dataset was captured in two different locations within Meta offices in North America. The ADT release contains 200 sequences of real-world activities conducted by Aria wearers in two real indoor scenes with 398 object instances (324 stationary and 74 dynamic). Each sequence includes raw data from two monochrome camera streams, one RGB camera stream, and two IMU streams, along with complete sensor calibration.
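To make the structure of one ADT sequence concrete, the sketch below models its contents in Python. The class and field names (AdtSequence, SensorCalibration, ObjectInstance) and the stream labels are hypothetical stand-ins for illustration only, not the dataset's actual tooling or schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Illustrative sketch only: names here are hypothetical, not the real
# Aria Digital Twin API. They mirror the contents of one ADT sequence
# as described in the article.

@dataclass
class SensorCalibration:
    camera_intrinsics: Dict[str, list] = field(default_factory=dict)  # per-camera lens parameters
    camera_extrinsics: Dict[str, list] = field(default_factory=dict)  # camera poses on the glasses
    imu_extrinsics: Dict[str, list] = field(default_factory=dict)     # IMU mounting transforms

@dataclass
class ObjectInstance:
    name: str
    is_dynamic: bool  # 74 of the 398 annotated instances move during recording

@dataclass
class AdtSequence:
    monochrome_streams: List[str]    # two monochrome camera streams
    rgb_stream: str                  # one RGB camera stream
    imu_streams: List[str]           # two IMU streams
    calibration: SensorCalibration   # complete sensor calibration
    objects: List[ObjectInstance]    # annotated object instances in the scene

# Example: a minimal sequence record
seq = AdtSequence(
    monochrome_streams=["mono-left", "mono-right"],
    rgb_stream="rgb",
    imu_streams=["imu-0", "imu-1"],
    calibration=SensorCalibration(),
    objects=[
        ObjectInstance("mug", is_dynamic=True),
        ObjectInstance("bookshelf", is_dynamic=False),
    ],
)
print(f"{len(seq.monochrome_streams)} mono + 1 RGB camera streams, "
      f"{len(seq.imu_streams)} IMU streams, "
      f"{len(seq.objects)} annotated objects")
```

The point of the sketch is simply that each sequence bundles synchronized sensor streams with calibration and per-object annotations, which is what enables the ground-truth evaluation tasks described below.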

The Aria researchers seek to “set a new standard for evaluation in the egocentric machine perception domain, which includes very challenging research problems such as 3D object detection and tracking, scene reconstruction and understanding, sim-to-real learning, human pose prediction – while also inspiring new machine perception tasks for augmented reality applications.”


About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and an industry analyst focusing on artificial intelligence, digital, cloud, and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
