Walking About in Digital Twins Using Augmented Reality

A new project seeks to amplify the benefits of digital twins by giving users the ability to roam around via augmented reality.

By Joe McKendrick
Aug 21, 2023

Imagine walking around in your house to see where you may have left your keys – from 3,000 miles away. Significant progress is now being made in which people will be able to don lightweight augmented reality (AR) glasses that employ artificial intelligence to explore digital twins of rooms, buildings, systems, and objects in everyday life.

To this end, researchers with Meta's Project Aria announced the development of Aria Digital Twin, an "egocentric" dataset captured using Aria glasses, with extensive simulated ground truth for devices, objects, and the environment.

This dataset accelerates research into a number of challenges, including 3D object detection and tracking, scene reconstruction and understanding, sim-to-real learning, and human pose prediction, while also inspiring new machine perception tasks for AR applications.

An egocentric dataset or network captures and maps the world from the perception of a single user. Project Aria is a Meta research project that developed AR glasses intended to help researchers gather information from the wearer's perspective, contributing to the advancement of egocentric research in machine perception and AR.

See also: Digital Twins, IT/OT Convergence Drive the Industrial Internet

One of the first egocentric AR projects was announced in 2021, when the University of Bristol, as part of an international consortium of 13 universities in partnership with Facebook AI, collaborated to advance egocentric perception. Bristol researchers built what was then the world’s largest egocentric dataset using off-the-shelf, head-mounted cameras.

“Progress in the fields of artificial intelligence and augmented reality requires learning from the same data humans process to perceive the world,” according to a statement from the university. “Our eyes allow us to explore places, understand people, manipulate objects and enjoy activities — from the mundane act of opening a door to the exciting interaction of a game of football with friends.”

The ambitious project was inspired by the University of Bristol’s successful EPIC-KITCHENS dataset, which recorded the daily kitchen activities of participants in their homes. “In the not-too-distant future you could be wearing smart AR glasses that guide you through a recipe or how to fix your bike – they could even remind you where you left your keys,” said Dima Damen, professor at Bristol.

For its latest demonstration project, the Aria Digital Twin (ADT) dataset was captured in two different locations within Meta offices in North America. The ADT release contains 200 sequences of real-world activities conducted by Aria wearers in two real indoor scenes with 398 object instances – 324 stationary and 74 dynamic. Each sequence includes raw data from two monochrome camera streams, one RGB camera stream, and two IMU streams, along with complete sensor calibration.
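For illustration, the per-sequence contents described above could be modeled with a simple record type. This is a hypothetical sketch; the field names are illustrative and are not the official ADT schema or API.

```python
from dataclasses import dataclass

@dataclass
class AdtSequence:
    """Hypothetical record for one ADT sequence's raw-data contents
    (field names are illustrative, not the official schema)."""
    sequence_id: str
    mono_camera_streams: int = 2    # two monochrome camera streams
    rgb_camera_streams: int = 1     # one RGB camera stream
    imu_streams: int = 2            # two IMU streams
    has_sensor_calibration: bool = True

# The release described above: 200 sequences captured across two
# indoor scenes, with 398 object instances (324 stationary + 74 dynamic).
sequences = [AdtSequence(sequence_id=f"seq_{i:03d}") for i in range(200)]
stationary_objects, dynamic_objects = 324, 74
total_objects = stationary_objects + dynamic_objects
```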

The Aria researchers seek to “set a new standard for evaluation in the egocentric machine perception domain, which includes very challenging research problems such as 3D object detection and tracking, scene reconstruction and understanding, sim-to-real learning, human pose prediction – while also inspiring new machine perception tasks for augmented reality applications.”

Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and an industry analyst focusing on artificial intelligence, digital, cloud, and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
