
Walking About in Digital Twins Thanks to Augmented Reality

A new project seeks to amplify the benefits of digital twins by giving users the ability to roam around via augmented reality.

Written By Joe McKendrick
Aug 21, 2023

Imagine walking around in your house to see where you may have left your keys – from 3,000 miles away. Significant progress is being made toward a future in which people can don lightweight augmented reality (AR) glasses that use artificial intelligence to explore digital twins of rooms, buildings, systems, and everyday objects.

To this end, Meta’s Project Aria announced the development of the Aria Digital Twin, an “egocentric” dataset captured using Aria glasses, with extensive simulated ground truth for devices, objects, and the environment.

This dataset accelerates research into a number of challenges, including 3D object detection and tracking, scene reconstruction and understanding, sim-to-real learning, and human pose prediction, while also inspiring new machine perception tasks for AR applications.

An egocentric dataset captures and maps a scene from the perception of a single user, rather than from fixed, third-person cameras. Project Aria is a research project out of Meta that developed AR glasses intended to help researchers gather information from the user’s perspective, contributing to the advancement of egocentric research in machine perception and AR.
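To make the idea concrete, here is a minimal, purely illustrative Python sketch of what one egocentric record might contain; the class and field names are assumptions for exposition, not part of any actual Aria API or dataset schema.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class EgocentricFrame:
    """One moment of data as seen from the wearer's own viewpoint."""
    timestamp_ns: int                        # capture time, in nanoseconds
    device_pose: Tuple[float, float, float]  # glasses position in the scene (simplified to x, y, z)
    visible_objects: List[str] = field(default_factory=list)  # objects in the wearer's view

# A conventional dataset indexes a scene from fixed cameras; an egocentric
# one indexes the user's moving viewpoint, so "what was visible when"
# becomes a query over the wearer's trajectory.
frames = [
    EgocentricFrame(0, (0.0, 0.0, 1.6), ["door", "keys"]),
    EgocentricFrame(33_000_000, (0.4, 0.0, 1.6), ["keys", "table"]),
]

# For example: the last frame in which the wearer's keys were visible.
last_seen = max(f.timestamp_ns for f in frames if "keys" in f.visible_objects)
print(f"Keys last visible at t = {last_seen} ns")
```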

See also: Digital Twins, IT/OT Convergence Drive the Industrial Internet

One of the first egocentric AR projects was announced in 2021, when the University of Bristol joined an international consortium of 13 universities, in partnership with Facebook AI, to advance egocentric perception. Bristol researchers built what was then the world’s largest egocentric dataset using off-the-shelf, head-mounted cameras.

“Progress in the fields of artificial intelligence and augmented reality requires learning from the same data humans process to perceive the world,” according to a statement from the university. “Our eyes allow us to explore places, understand people, manipulate objects and enjoy activities — from the mundane act of opening a door to the exciting interaction of a game of football with friends.”

The ambitious project was inspired by the University of Bristol’s successful EPIC-KITCHENS dataset, which recorded the daily kitchen activities of participants in their homes. “In the not-too-distant future you could be wearing smart AR glasses that guide you through a recipe or how to fix your bike – they could even remind you where you left your keys,” said Dima Damen, professor at Bristol.

In its latest demonstration project, the Aria Digital Twin (ADT) dataset was captured in two different locations within Meta offices in North America. The ADT release contains 200 sequences of real-world activities conducted by Aria wearers in two real indoor scenes with 398 object instances – 324 stationary and 74 dynamic. Each sequence includes raw data from two monochrome camera streams, one RGB camera stream, and two IMU streams, along with complete sensor calibration.
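As a rough sketch of that composition, the Python below models one ADT sequence and the quoted object counts; all type and field names here are hypothetical, chosen for illustration rather than taken from the dataset’s actual tooling.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class SensorCalibration:
    """Complete sensor calibration shipped with each sequence (assumed layout)."""
    intrinsics: Dict[str, List[float]]   # per-camera intrinsic parameters
    extrinsics: Dict[str, List[float]]   # sensor-to-device transforms

@dataclass
class ADTSequence:
    """One of the 200 recorded real-world activity sequences."""
    monochrome_streams: List[str]   # two monochrome camera streams
    rgb_stream: str                 # one RGB camera stream
    imu_streams: List[str]          # two IMU streams
    calibration: SensorCalibration  # complete sensor calibration

@dataclass
class ObjectInstance:
    name: str
    dynamic: bool   # 74 of the 398 instances move during recording

# Sanity-check the counts quoted in the article: 398 total,
# 324 stationary and 74 dynamic object instances.
objects = [ObjectInstance(f"obj_{i}", dynamic=(i < 74)) for i in range(398)]
assert sum(o.dynamic for o in objects) == 74
assert sum(not o.dynamic for o in objects) == 324
```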

The Aria researchers seek to “set a new standard for evaluation in the egocentric machine perception domain, which includes very challenging research problems such as 3D object detection and tracking, scene reconstruction and understanding, sim-to-real learning, human pose prediction – while also inspiring new machine perception tasks for augmented reality applications.”

Joe McKendrick is RTInsights Industry Editor and an industry analyst focusing on artificial intelligence, digital, cloud, and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he has served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
