Safe, Secure, and Commercial Cobots for the Post-COVID Workplace

COVID-19 is driving demand for collaborative robots (Cobots), which are designed to interact with humans in a shared space or to work safely in close proximity to them.

Factories are increasingly being challenged to be nimble and adept. Instead of creating hundreds, thousands, or even millions of the same thing, the new paradigm is flexibility, allowing swift reconfiguration of equipment to support different builds. This might involve extremely sensitive information, which could result in original equipment manufacturers (OEMs) writing some or all of the associated manufacturing software themselves. The company whose robot is building a specific product needs to be able to integrate the OEM’s software and ensure that it cannot interfere with the robot’s underlying control software.

Facing a real opportunity to revolutionize certain factory workflows and generate new “as a service” revenue streams, robotics is at an extremely exciting point right now. The traditional industrial robotics sector, featuring leaders like Mitsubishi, ABB Robotics, Omron, and Fanuc, needs to adapt to the new paradigm, as it will face both new opportunities and new competition addressing diverse additional use cases that demand greater mobility, new types of sensor technology, and different price points. We are seeing new levels of dexterity in the way robots handle delicate or oddly shaped objects, and I fully envision a day when robots will be able to support all five senses. I will show that the emerging robot paradigm will feature consolidated designs with multiple functions sharing a heterogeneous multicore chip, and that these robots will, of course, also be connected. I will look at the implications this has for the design, and specifically the software design, of these robots.

Cobots address the post-COVID workplace

The emergence of Cobots, or collaborative robots, intended to interact with humans in a shared space or to work safely in close proximity to them, is an important development. Cobots stand in stark contrast to traditional industrial robots, which work autonomously with safety assured by isolation from human contact. Cobots currently represent a very small percentage of the overall robotics market, but this is an area that we (and a number of recognized analysts) believe will grow rapidly in the next five years for applications in manufacturing, healthcare, and retail.

The COVID-19 pandemic is accelerating this trend, building acceptance of increased digital transformation and automation. Environments where maintaining six feet of separation between humans is challenging or incredibly inefficient will start to deploy this technology. Large retailers like Walmart are already using robots to clean store floors, and I think we will see increased usage of oversized Roombas cleaning schools, businesses, and hospitals. With the predicted near-term slowdown in the race for self-driving vehicles, I believe many technology companies in that space will pivot toward these new use cases, bringing exciting leading-edge technology into the robotics field.

Because Cobots are placed in the vicinity of humans, this is a segment where the equipment must work all the time in a deterministic way. The safety of the human working alongside the Cobot is of paramount importance, and safety certifications and processes need to be defined and strictly observed if this market is to form successfully. There is also a lot of focus on AI to improve the user experience with these types of machines. The desire to keep these platforms viable for a long period of time means there needs to be a secure way to deliver software updates that bring new capabilities to them. For some of the larger platforms, we also see moves toward greater modularity, enabling hardware upgrades over time. Cobot manufacturers do not want to be tied to a specific hardware vendor, especially in an area as dynamic as artificial intelligence, where the coming few years will bring acquisitions, companies ceasing to trade, and changes in performance leadership rankings.

Consolidation

A key consideration in Cobot design, then, is how you consolidate the intersection of information technology (IT) with operational technology (OT). How do you blend the instructions for what the robot is configured to do (IT) with what is happening in real time in the factory (OT), especially the rapid and unpredictable actions of humans? In the factory environment, the relatively slow pace of change is understandable, since the focus is on reliability, which translates to increased uptime, reduced accidents, less scrap, and more. OT has therefore traditionally run on separate networks that connect to the IT networks only at the main console level. The desire is to push this fusion of the IT and OT worlds out onto the factory floor so that machines can make more informed decisions, more quickly.

This challenge has to be addressed in the context of delivering these systems within a cost, power, and footprint envelope that makes them commercially attractive. This implies shrinking multiple systems onto a single consolidated board (and, in an increasing number of cases, a single heterogeneous multicore chip). These systems need to run rich operating systems like Linux and Windows while also guaranteeing the real-time behavior of the it-simply-must-always-respond-this-way elements of the platform. Such systems are referred to as mixed-criticality systems. Applications must be compartmentalized to ensure that no application can cause other elements of the system to fail.
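
To make the compartmentalization idea concrete, here is a minimal sketch, using standard Linux and POSIX calls, of how a critical control task might be pinned to a dedicated core and given a real-time scheduling class so that general-purpose workloads cannot starve it. The core number and priority are illustrative and running it requires elevated privileges; a production mixed-criticality design would typically rely on a separation kernel or hypervisor rather than OS-level scheduling alone.

```c
/* Minimal sketch: pin a hard real-time control loop to a dedicated core
 * under Linux, as one illustration of compartmentalization.
 * Core number and priority are illustrative values. */
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>

int main(void)
{
    /* Reserve CPU 3 for the critical control task. */
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(3, &set);
    if (sched_setaffinity(0, sizeof(set), &set) != 0) {
        perror("sched_setaffinity");
        return EXIT_FAILURE;
    }

    /* Run under the real-time FIFO policy so best-effort (IT) workloads
     * cannot preempt the control loop. */
    struct sched_param sp = { .sched_priority = 80 };
    if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0) {
        perror("sched_setscheduler");
        return EXIT_FAILURE;
    }

    /* Lock memory so page faults cannot introduce unbounded latency. */
    if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0) {
        perror("mlockall");
        return EXIT_FAILURE;
    }

    /* ... deterministic control loop would run here ... */
    return EXIT_SUCCESS;
}
```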

Real-time systems based on a single-core processor (SCP) are well understood in the industry, which has adopted a real-time system engineering process built on the constant worst-case execution time (WCET) assumption. This states that the worst-case execution time measured for a software task when it executes alone on a single core remains the same when that task runs together with other tasks. This fundamental assumption is the foundation of schedulability analysis, which determines whether a scheduling sequence can be found that completes all tasks within their deadlines.
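
As an illustration of that analysis, the sketch below implements the classic Liu and Layland rate-monotonic utilization test, which declares a set of n periodic tasks schedulable if the summed utilization of their (assumed constant) WCETs does not exceed n(2^(1/n) − 1). The task set values are invented for the example; real schedulability analysis for a Cobot would cover many more tasks and a more exact test.

```c
/* Sketch of the rate-monotonic schedulability test (Liu & Layland):
 * n periodic tasks are schedulable if U = sum(C_i / T_i) <= n*(2^(1/n) - 1),
 * where C_i is the (assumed constant) WCET and T_i the period of task i.
 * Build with -lm. */
#include <math.h>
#include <stdio.h>

struct task {
    double wcet;    /* worst-case execution time, measured in isolation */
    double period;  /* activation period (same time unit as wcet) */
};

int rm_schedulable(const struct task *tasks, int n)
{
    double utilization = 0.0;
    for (int i = 0; i < n; i++)
        utilization += tasks[i].wcet / tasks[i].period;

    double bound = n * (pow(2.0, 1.0 / n) - 1.0);
    printf("U = %.3f, bound = %.3f\n", utilization, bound);
    return utilization <= bound;
}

int main(void)
{
    /* Illustrative task set: the numbers are made up for the example. */
    struct task set[] = { { 1.0, 4.0 }, { 2.0, 8.0 }, { 1.0, 10.0 } };
    printf("schedulable: %s\n",
           rm_schedulable(set, 3) ? "yes"
                                  : "not proven (test is sufficient, not necessary)");
    return 0;
}
```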

In the industry, the technological trend toward adopting multicore processor (MCP) systems is already well established. Many of these multicore systems have been designed with the speed and efficiency requirements of IT applications in mind and do not always respect the predictability requirements of control systems deployed in avionics, automotive, industrial automation, and similar domains. In fact, while the assumption of a constant WCET holds for single-core chips, it is NOT true for current multicore chips, due to interference across cores in accessing shared resources. Interference between cores can cause spikes in worst-case execution times in excess of 6X compared to the single-core case.
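
The effect is easy to observe. The sketch below, which assumes a Linux target with pthreads, times a memory-streaming kernel first in isolation and then while other cores hammer memory, and prints the slowdown ratio. The buffer size, pass count, and thread count are illustrative, and the measured ratio will vary widely from chip to chip.

```c
/* Minimal sketch of cross-core interference on shared resources:
 * time a memory-streaming "critical" kernel alone, then again while
 * other cores run a memory-hammering workload. Build with -lpthread. */
#define _GNU_SOURCE
#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define BUF_BYTES (8u * 1024u * 1024u)   /* larger than typical L2 caches */
#define NOISE_THREADS 3

static atomic_int stop_noise = 0;

/* Background workload that thrashes memory from other cores. */
static void *noise(void *arg)
{
    (void)arg;
    unsigned char *buf = malloc(BUF_BYTES);
    while (!atomic_load(&stop_noise))
        memset(buf, 0xA5, BUF_BYTES);
    free(buf);
    return NULL;
}

/* "Critical" kernel whose execution time we observe, in milliseconds. */
static double timed_kernel(unsigned char *buf)
{
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int pass = 0; pass < 8; pass++)
        memset(buf, pass, BUF_BYTES);
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (t1.tv_sec - t0.tv_sec) * 1e3 + (t1.tv_nsec - t0.tv_nsec) / 1e6;
}

int main(void)
{
    unsigned char *buf = malloc(BUF_BYTES);

    double alone = timed_kernel(buf);

    pthread_t tids[NOISE_THREADS];
    for (int i = 0; i < NOISE_THREADS; i++)
        pthread_create(&tids[i], NULL, noise, NULL);

    double contended = timed_kernel(buf);

    atomic_store(&stop_noise, 1);
    for (int i = 0; i < NOISE_THREADS; i++)
        pthread_join(tids[i], NULL);

    printf("alone: %.2f ms, contended: %.2f ms (%.1fx)\n",
           alone, contended, contended / alone);
    free(buf);
    return 0;
}
```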

Connected

This issue is compounded by the fact that Cobots will necessarily need to be connected. Not only is there a need to share sensor data and receive instructions, but there is also a need for security and other updates. We need to consider, then, that a security incursion may impact not just the system itself but potentially any device accessible from that network connection. Security needs to be baked in from the get-go rather than being an afterthought, and, as many attacks have shown, a complex system is only as strong as its weakest link. For systems deployed for 10+ years, the architecture must assume that no system is 100% secure. Instead, there needs to be early recognition that the system has been compromised, coupled with a way to bring it back to a known good state. And since this is not a static environment, the system must be able to securely download new software to maintain or indeed raise the bar against attack. From a software perspective, the system must be architected to guarantee that it cannot be reconfigured after boot and that no application can accidentally (or maliciously) cause the robot to fail.
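
As a simplified illustration of checking an image against a known good state, the sketch below (which assumes OpenSSL is available and uses an obviously placeholder reference digest) hashes a downloaded update file and compares it to a reference SHA-256 value delivered over a trusted channel. A real Cobot update path would verify a cryptographic signature anchored in a secure boot chain rather than a pinned hash, but the shape of the check is similar.

```c
/* Simplified "known good state" check: hash a downloaded update image
 * and compare against a reference digest. The expected digest below is
 * a placeholder. Requires OpenSSL; build with -lcrypto. */
#include <openssl/sha.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical reference digest, hex-encoded (64 characters). */
static const char *EXPECTED_SHA256_HEX =
    "0000000000000000000000000000000000000000000000000000000000000000";

int image_is_trusted(const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return 0;

    SHA256_CTX ctx;
    SHA256_Init(&ctx);

    unsigned char chunk[4096];
    size_t n;
    while ((n = fread(chunk, 1, sizeof(chunk), f)) > 0)
        SHA256_Update(&ctx, chunk, n);
    fclose(f);

    unsigned char digest[SHA256_DIGEST_LENGTH];
    SHA256_Final(digest, &ctx);

    /* Hex-encode the digest for comparison against the reference. */
    char hex[2 * SHA256_DIGEST_LENGTH + 1];
    for (int i = 0; i < SHA256_DIGEST_LENGTH; i++)
        sprintf(&hex[2 * i], "%02x", digest[i]);

    return strcmp(hex, EXPECTED_SHA256_HEX) == 0;
}

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <update-image>\n", argv[0]);
        return EXIT_FAILURE;
    }
    puts(image_is_trusted(argv[1]) ? "image matches reference digest"
                                   : "image REJECTED: digest mismatch");
    return EXIT_SUCCESS;
}
```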

Computing

Edge computing refers to shifting intelligence back nearer to where data is created. A simple definition is that the edge is the first device back from where the data is created at which multiple streams of data are processed and insight is derived from that data. This is exactly what Cobots represent. For latency, network availability, and privacy reasons, these Cobots will usually harness local on-premise computing resources rather than going to the cloud. Doing so successfully in a context where they operate alongside and around humans, and potentially undertake tasks with high safety risks, implies that their design needs to be safe and secure. In this article, I have presented six of the ‘C’s that need to be considered: Commercialization of Cobot designs, in a COVID-sensitive workplace, requires Consolidated, Connected platforms that build on Edge Computing concepts.

About Ian Ferguson

Ian Ferguson is Vice President, Marketing and Strategic Alliances at Lynx Software Technologies. He is responsible for all aspects of the outward-facing presence of Lynx Software Technologies to its customer, partner, press, and analyst communities. He is also responsible for nurturing the company’s partnership program to accelerate its engagement in the automotive, industrial, and IT infrastructure markets. Ian spent nearly eleven years at Arm, where he held roles leading teams in vertical marketing, corporate marketing, and strategic alliances. Ian is a graduate of Loughborough University (UK) with a BSc in Electrical and Electronic Engineering.
