The Key to Digital Twin Adoption: Think Big, Start Small


With digital twin, it’s wise to start small with achievable processes that can make a more immediate impact and then tackle more complex processes later on.

Digital twin has become a buzzword and major investment opportunity of late. In fact, ABI Research predicts that spending on industrial digital twins will grow from $4.6 billion in 2022 to $33.9 billion in 2030. Major players, including Amazon, Microsoft, and Google, have all recently launched their own digital twin solutions.

Despite these developments, adopting a digital twin solution isn’t always a straightforward process for businesses. While the “holy grail” would be to have a full end-to-end digital picture of the entire business, this isn’t a practical place to start for many companies, especially those starting from scratch.

What do businesses need to consider when adopting a digital twin strategy?

Digital twins can model everything from a single machine to an entire business. As defined by IBM, “A digital twin is a virtual representation of an object or system that spans its lifecycle, is updated from real-time data, and uses simulation, machine learning, and reasoning to help decision-making.”

Imagine a digital representation of an engine, and then the assembly line that makes the engine, the supply chain that supplies the parts, and even the hiring process and staffing model to ensure the right skilled workers are available when and where they are needed to keep the line humming.

In the long run, an end-to-end system promises step-function improvements in how a business operates. For example, many companies do scenario planning. Often this is done on an annual or biannual basis, with manual work by analysts to build a few selected models in spreadsheets, followed by wargaming around a table or in front of a whiteboard.

Contrast that with hundreds or thousands of scenarios being generated constantly, with many decisions automated as scenarios become more or less probable, while algorithmic criteria surface the most impactful ones for proactive human attention.
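As a purely illustrative sketch of that idea, continuous scenario planning might amount to generating many parameterized futures, scoring each one, and sorting so the costliest land on a human's desk first. The scenario model, probability distributions, and dollar figures below are hypothetical assumptions, not drawn from any particular digital twin product:

```python
import random

def simulate_scenario(demand_shift, supply_delay_days):
    """Toy impact model: score one hypothetical future in dollars.

    The cost figures are illustrative assumptions only.
    """
    lost_revenue = max(0.0, -demand_shift) * 1_000_000  # demand drop costs revenue
    expedite_cost = supply_delay_days * 20_000          # delays cost expediting fees
    return lost_revenue + expedite_cost

def generate_scenarios(n=1000, seed=42):
    """Generate n random futures and rank them by estimated impact."""
    rng = random.Random(seed)
    scenarios = []
    for _ in range(n):
        demand_shift = rng.gauss(0, 0.15)   # fractional change in demand
        supply_delay = rng.randint(0, 30)   # days of supplier delay
        scenarios.append({
            "demand_shift": demand_shift,
            "supply_delay_days": supply_delay,
            "impact": simulate_scenario(demand_shift, supply_delay),
        })
    # Raise the most impactful scenarios for proactive human attention
    return sorted(scenarios, key=lambda s: s["impact"], reverse=True)

top_five = generate_scenarios()[:5]
```

The point of the sketch is the shape of the loop, not the numbers: a real system would replace the toy impact model with the digital twin itself and rerun it as fresh data arrives.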

The good news is that the entire business does not need to be modeled from day one to realize value from a digital twin strategy. In fact, it’s wise to start small with achievable processes that can make a more immediate impact and then tackle more complex processes later on.

Digitizing workforce management is a great example. Every company has processes to recruit and hire, market to customers, and, in the case of a services business, assign talent to projects. Imagine the status quo: each business unit lead, on a quarterly basis, evaluates their book of business and upcoming marketing campaigns, then enters into a spreadsheet their priorities for the roles and skills they need HR to hire.

Contrast this with a smart system, a digital twin, that generates intelligence about which skills are likely to be needed, using data like past results from similar marketing pushes, actual time-to-hire for specific skill sets, attrition, and economic inputs that affect demand. Something as simple as automatically adjusting the employee referral bonus for different roles, or changing their priority in the recruiters' queue, without the need for human intervention could create a competitive edge.
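To make the referral-bonus example concrete, a rule like the following could sit inside such a system. Every threshold and multiplier here is an illustrative assumption, not a benchmark from any real HR platform:

```python
def adjust_referral_bonus(base_bonus, time_to_hire_days, attrition_rate,
                          target_days=45, target_attrition=0.10):
    """Toy rule: scale a role's referral bonus from hiring-difficulty signals.

    The targets, multipliers, and cap are hypothetical assumptions chosen
    for illustration only.
    """
    multiplier = 1.0
    if time_to_hire_days > target_days:
        # Harder-to-fill roles earn a proportionally larger bonus
        multiplier += 0.5 * (time_to_hire_days - target_days) / target_days
    if attrition_rate > target_attrition:
        # High churn in a role adds further urgency
        multiplier += 0.25
    return round(base_bonus * min(multiplier, 2.0), 2)  # cap at 2x base
```

For instance, a role with a $2,000 base bonus that historically takes 90 days to fill would automatically be bumped to $3,000, with no quarterly spreadsheet exercise required.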

Whatever part of the business a company lands on as its starting point, it also needs to ensure it builds a high-fidelity model. The velocity with which data needs to flow into and through digital twins of processes or products can vary, but the bottom line is that life happens in real time.

Businesses should take the initial step of checking whether their infrastructure can handle and process data in real time. If it can't, their digital twin journey is likely to be a short one. Having a solid base is crucial.


The Implications of Bad Data

What’s the worst-case scenario if a business doesn’t take the above advice and ensure its data faithfully reflects the real world?

One potential outcome is that they undershoot the model. The granularity of data needed for accuracy varies by what is being modeled. For example, the data that matters for weather is at the quarter-mile scale; data measured from sensors on farming equipment can be down to the quarter inch. These details are important, and leveraging data that isn’t measured at the right resolution for the given scenario could seriously throw off a model, making it ineffective in its purpose as a digital twin.

Businesses must also be careful not to introduce data that isn’t important into a model. Without understanding the machine or business process, it will be difficult to separate the important variables from those that may be irrelevant. Throwing in factors that don’t model the real world will hinder the output. For this reason, any digital twin project must start with a thorough audit of the process to understand what is relevant and what is not.
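One simple first-pass filter a data science team might apply during such an audit is to screen candidate inputs by how strongly they track the outcome the twin is meant to predict. This is a deliberately crude sketch with a hypothetical threshold; a real audit would also weigh lagged effects, nonlinear relationships, and domain knowledge:

```python
import statistics

def correlation(xs, ys):
    """Pearson correlation between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def screen_variables(candidates, outcome, threshold=0.3):
    """Keep only inputs whose |correlation| with the outcome clears a threshold.

    The 0.3 cutoff is an illustrative assumption, not a standard.
    """
    return {name: round(correlation(series, outcome), 3)
            for name, series in candidates.items()
            if abs(correlation(series, outcome)) >= threshold}
```

A variable that passes a screen like this still needs a domain expert to confirm it genuinely drives the process, which is exactly why the audit belongs with the line of business, not the data team alone.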

One way to ensure the digital twin is effective and only includes the most relevant and accurate data is to staff cross-functional teams. The line of business responsible for the process or machine being modeled should lead the charge and have data science staff assigned to their team. This will reduce the amount of learning and education needed when starting the digital twin process since the tech team will already be embedded and have a deeper understanding of the business process.


Using a Digital Twin to Get Ahead

The shift to a digital twin is already underway. The fragility exposed by the pandemic makes clear that moving the business continuity process into the digital realm can help prevent major supply chain or staffing issues before they spiral out of control.

And, crises aside, businesses can also reduce waste by creating an automated digital solution for processes they’re already doing manually. Employees could be freed up to act on insights from the digital twin rather than spending hours pulling together reports built on data that is outdated almost as soon as it’s gathered.

With businesses across industries starting to invest serious dollars into digital twin solutions, the technology is also becoming a key competitive differentiator. The early adopters will get a head start on reaching the “holy grail” end-to-end business model, while those who wait will be relegated to playing catch up.

The moment for digital twins is here. How businesses embrace the technology will have impacts for years to come.

Bryan Kirschner

About Bryan Kirschner

Bryan Kirschner is Vice President, Strategy at DataStax. For more than 20 years he has helped large organizations build and execute strategy when they are seeking new ways forward and a future materially different from their past. He specializes in removing fear, uncertainty, and doubt from strategic decision-making through empirical data and market sensing. Prior to joining DataStax, Bryan was named the first Director of Open Source Strategy in the history of Microsoft and was instrumental in a task force that resulted in then-CEO Steve Ballmer’s 2010 landmark speech declaring that Microsoft was “all in” on cloud computing. In 2013 he founded the Apigee Institute, bringing Apigee customers, world-class experts, and groundbreaking research together to help enterprises accelerate digital transformation. Following the company’s acquisition by Google, he led research and strategy for Google Cloud Developer Relations and DevOps Platform. Bryan holds a Bachelor of Arts in Philosophy from Yale University and lives in Seattle, Washington.
