From AI World: Why Leveraging Edge Data Requires A Lot of Energy


A fireside chat at AI World discussed the issues encountered in extracting data from the edge.

One may be forgiven for assuming manufacturers are ready to crack open their various production systems and begin leveraging the data that has been trapped within. The reality can be much more complicated.

The issues encountered in extracting data from the edge were explored in a fireside chat with Joseph Etris, engineering project manager with Continental Automotive Systems, joined by John Auld, regional sales director at Zededa, Inc. I had the opportunity to moderate the session, part of the recent AI World conference in Boston.

See also: Manufacturing Leads the IoT Pack

Etris’ unit at Continental manufactures catalytic converters for the automotive sector — an energy-intensive task. In 2018, Etris and his team set out to capture, analyze and mitigate electricity usage and related emissions across the unit’s range of Siemens programmable logic controllers (PLCs), first installed in the 1990s. The challenge, Etris explained, was extracting the SCADA data from the PLCs to integrate with data emanating from the newer electrical sensors.

While many vendors claim that older industrial systems can readily give up their data in a perfect IoT world, this is far from the case, Etris relates. Legacy production systems often were not designed to be open, and many are not network-enabled at all.

Auld, who represented Siemens at the time, worked with the Continental team to extract the PLC data and provide real-time calibration. In addition, the data is uploaded every 10 minutes to a cloud repository on Amazon Web Services.
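The 10-minute upload cadence described above suggests a simple batch-and-flush pattern at the edge. The sketch below is a minimal, hypothetical illustration of that pattern — the class name, tags, and the injectable clock are assumptions for testability, and the actual cloud handoff (e.g., a write to an AWS repository) is deliberately left to the caller, since the article does not describe Continental's client code.

```python
import json
import time

UPLOAD_INTERVAL_S = 600  # 10 minutes, matching the cadence described in the article


class ReadingBuffer:
    """Accumulates PLC/sensor readings and flushes them as one JSON
    payload once the upload interval has elapsed."""

    def __init__(self, interval_s=UPLOAD_INTERVAL_S, clock=time.monotonic):
        self.interval_s = interval_s
        self.clock = clock          # injectable for testing
        self.readings = []
        self.last_flush = clock()

    def add(self, tag, value, ts):
        """Record one reading from a PLC or electrical sensor."""
        self.readings.append({"tag": tag, "value": value, "ts": ts})

    def maybe_flush(self):
        """Return a serialized payload if the interval has elapsed, else None.
        The caller would hand the payload to its cloud client (e.g., an
        upload to AWS) -- that step is environment-specific and omitted."""
        if self.clock() - self.last_flush < self.interval_s:
            return None
        payload = json.dumps({"readings": self.readings})
        self.readings = []
        self.last_flush = self.clock()
        return payload
```

In practice a scheduler or main loop would call `maybe_flush()` on each poll cycle; buffering locally and shipping in batches keeps the edge device resilient to intermittent connectivity.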

The challenges Continental encountered are symptomatic of issues seen across the manufacturing landscape, Auld explains. “There’s a lot of dark data that exists in silos within many companies.” Not only does this present technical challenges, but organizational ones as well, as multiple teams need to be on board with AI-over-IoT efforts.

Managing data storage in the AI and IoT era was another area the panelists discussed. Many organizations are struggling with growing volumes of data streaming into their systems from arrays of controllers and sensors, and often end up depositing it all into data lakes.

Manufacturers need to establish a common interface layer that can extract and read data from all generations of production systems, Auld said. In addition, consistency needs to be assured, as data from various systems may have different meanings and thresholds. The challenge for many manufacturers is they may have many separate factories, each with its own initiatives and standards.
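One common way to realize the interface layer Auld describes is the adapter pattern: each generation of production system gets its own adapter, but downstream analytics see a single normalized schema and consistent units. The sketch below is a hypothetical illustration under that assumption — the class names, register names, and scaling factors are invented for the example, and real adapters would wrap actual serial gateways or network protocols rather than fixed values.

```python
from abc import ABC, abstractmethod


class PlcAdapter(ABC):
    """Common interface layer: every system generation implements the
    same read() contract, so meanings and units stay consistent."""

    @abstractmethod
    def read(self):
        """Return readings as {tag: value} in normalized engineering units."""


class LegacySerialAdapter(PlcAdapter):
    # Hypothetical adapter for a 1990s-era, non-networked controller
    # polled through a serial gateway (stubbed here with a fixed value).
    def read(self):
        raw = {"PWR_KW_X10": 125}  # legacy register stores kW * 10
        return {"power_kw": raw["PWR_KW_X10"] / 10.0}


class ModernNetworkedAdapter(PlcAdapter):
    # Hypothetical adapter for a network-enabled PLC that already
    # reports in engineering units.
    def read(self):
        return {"power_kw": 12.5}


def collect(adapters):
    """Merge normalized readings from every generation of equipment."""
    return [adapter.read() for adapter in adapters]
```

The value of the pattern is that scaling quirks and threshold differences are resolved inside each adapter, so a fleet of factories with different equipment still emits comparable data.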


About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and industry analyst focusing on artificial intelligence, digital, cloud and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
