Our Bodies, Our Data: Enter the Sensored Human


Wearables and other sensors designed to collect data about humans are finding new applications across a number of fields.

Increasingly, much of the data shaping our world is coming from a very nearby source: our own bodies. Call it the “new human interface,” built on embedded technologies such as AI-powered wearables, brain-sensing neurotech, and eye and movement tracking, which promise greater insight into the things that make us tick.

That’s the view of Accenture, which published research concluding that the world “is in the midst of a massive technology shift, as AI and other disruptive technologies become much more human-like and intuitive for people to use.” The result, the firm argues, is greater productivity and accuracy in the workplace.

At the center of human-friendly computing is, of course, the human. Nearly all of the executives interviewed in the study (94%) agree that human-interface technologies will help people “better understand behaviors and intentions, transforming human-machine interaction.”

The key is to better align technology with these behaviors and intentions in real time – which has implications for training and skills development. “We try to close the gap – learning new gestures, running focus group and A/B tests, taking motion sickness pills [for virtual reality], and training employees on new tech,” the Accenture authors note. “Large companies in 2022 spent about $1,689 per employee for overall training. But the fact is, when tech struggles to connect with us, it’s often because people – what they want, expect, or intend – are a black box.”

See also: How Wearables and Real-Time Data Prevent Fatigue-Related Accidents

An emerging constellation of technologies — focused on data coming directly from the body — may change that equation. The Accenture authors discuss the following developments:

  • Neurotech: “Researchers demonstrated using neural prostheses – like brain-computer interfaces – to decode speech from neural data,” the Accenture team says. “This could help patients with verbal disabilities ‘talk’ by translating attempted speech into text or generated voices. Interest in neurotech has reached enterprises, too. Researchers in Meta’s AI lab experimented with decoding speech from brain activity in 2022, using noninvasive brain recordings and an AI model to decode sentences and stories that patients heard out loud.”
  • Wearable devices: Smartwatches and fitness sensors are being leveraged “to measure subtle changes in people’s heart rate to predict their cognitive state. Researchers are exploring ways gaze-centered eye tracking in VR can grant users more control over a prosthesis. Here, following a person’s eye movements could help predict intent, like if they wanted to move in a certain direction or pick up a particular object.”
  • Empathetic vehicles: Researchers “are building more detailed ways to understand people’s intent in relation to their environments” – with an application intended to reduce human-vehicle collisions. While most crash-prevention technology simply serves to detect pedestrians, emerging technology could capture details such as “the distance between the vehicle and the pedestrian, the speed of the vehicle, and the pedestrian’s physical posture – like the angle between their thighs and calves or between their calves and the ground. Grasping a person’s posture while walking on a street can be a clue to discern where they are likely to move next, potentially making the roads safer for everyone.” (A rough sketch of how such posture angles might be computed follows this list.)
  • Human-robot collaborations: “People’s state of mind, like if they’re feeling ambitious or tired, can impact how they approach a task,” the Accenture authors state. “But while humans tend to be good at understanding these states of mind, robots aren’t.” Now, robots are being taught to identify these states so they can better assist people across a range of situations. “Researchers proposed a transfer learning system, where the robot observes someone performing a small, canonical task, then creates a preference model that it continuously updates as the robot and person interact.” Human-robot collaborations can also benefit “when robots can identify when they’ve made an error based on implicit reactions from humans they interact with – much like how people use social cues to recognize their own mistakes.” (A second sketch after this list illustrates the kind of preference-model loop described here.)
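
To make the posture idea a bit more concrete, here is a rough, hypothetical sketch (not drawn from the Accenture report or the underlying research) of how joint angles might be derived from 2-D pose keypoints and combined with vehicle distance and speed. The keypoints, thresholds, and the possible_crossing heuristic are placeholders for illustration only.

```python
import math

def angle_deg(v1, v2):
    """Angle in degrees between two 2-D vectors."""
    norm = math.hypot(*v1) * math.hypot(*v2)
    if norm == 0:
        return 0.0
    cos = (v1[0] * v2[0] + v1[1] * v2[1]) / norm
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

def leg_angles(hip, knee, ankle):
    """Thigh-calf angle at the knee, plus the calf's acute angle to the
    ground (assumed horizontal in image coordinates)."""
    thigh = (hip[0] - knee[0], hip[1] - knee[1])
    calf = (ankle[0] - knee[0], ankle[1] - knee[1])
    knee_angle = angle_deg(thigh, calf)
    calf_ground = angle_deg(calf, (1.0, 0.0))
    return knee_angle, min(calf_ground, 180.0 - calf_ground)

def possible_crossing(knee_angle, calf_ground_angle, distance_m, speed_mps):
    """Toy heuristic: a bent knee plus an inclined calf suggests mid-stride,
    and a short time for the vehicle to reach the pedestrian raises the flag.
    The thresholds are illustrative, not validated values."""
    mid_stride = knee_angle < 160 and calf_ground_angle < 80
    time_to_reach_s = distance_m / max(speed_mps, 0.1)
    return mid_stride and time_to_reach_s < 3.0

# Made-up pose keypoints (pixels) and vehicle state for demonstration.
knee_a, calf_a = leg_angles(hip=(100, 200), knee=(130, 260), ankle=(110, 320))
print(possible_crossing(knee_a, calf_a, distance_m=12.0, speed_mps=8.0))
```

In a real system, angles like these would be features feeding a trained model rather than a hand-set rule, but the geometry is the same.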

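The preference-model loop in the last bullet can likewise be sketched in a few lines. The class below is a toy illustration, not the researchers' actual system: it assumes task options are represented as small numeric feature vectors, seeds linear weights from one observed "canonical" demonstration, and nudges them after each interaction based on which option the person accepted.

```python
import numpy as np

class PreferenceModel:
    """Minimal online preference model: seeded from one demonstrated task,
    then updated continuously as the robot and person interact. The feature
    encoding, linear form, and update rule are assumptions for illustration."""

    def __init__(self, n_features, learning_rate=0.1):
        self.w = np.zeros(n_features)  # preference weights over task features
        self.lr = learning_rate

    def init_from_demo(self, demo_features):
        """Seed the weights from the steps of a single demonstrated task."""
        self.w = np.asarray(demo_features, dtype=float).mean(axis=0)

    def score(self, option_features):
        """Higher score = assistance the person is predicted to prefer."""
        return float(np.dot(option_features, self.w))

    def update(self, chosen, passed_over):
        """Move toward the option the person accepted and away from the one
        they implicitly rejected (ignored, corrected, or redone by hand)."""
        delta = np.asarray(chosen, dtype=float) - np.asarray(passed_over, dtype=float)
        self.w += self.lr * delta

# One canonical demo (rows = steps, columns = hypothetical task features),
# then a single update from an observed human choice between two options.
model = PreferenceModel(n_features=3)
model.init_from_demo([[1.0, 0.2, 0.0], [0.8, 0.4, 0.1]])
model.update(chosen=[1.0, 0.0, 0.0], passed_over=[0.0, 1.0, 0.0])
print(model.score([0.9, 0.1, 0.0]))
```
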
Such human interface technologies could play a role “in improving employee training, leading to safer workplaces with less miscommunication and fewer accidents,” the authors state. “We could build digital products that, in better understanding and engaging people, reach a wider customer base. Imagine the productivity gains when we don’t need to contort ourselves around technology, but when it can work around us instead.”


About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and an industry analyst focusing on artificial intelligence, digital, cloud, and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
