
Sixty Years on from the System that Revolutionized Computing, Real-Time Data is Shaking up Mainframe Computing


Advances in enterprise data management are opening a real-time observability window into mainframe systems, shining a light on the impact of individual applications and use cases.

Written By
Jacky Hofbauer
Jun 24, 2024

Sixty years ago, on April 7, 1964, IBM announced its System/360 family of mainframe computers, designed for both commercial and scientific applications. As Wired recognized, the System/360 became “the benchmark for mainframe performance for many years” and introduced the 8-bit byte as the default global standard.

Today, mainframe computers continue to play a vital role in powering businesses in the US and beyond. Thanks to their reliability, security, and scalability, 70% of Fortune 500 companies still rely on these machines for mission-critical operations such as financial transactions, quarterly financial reporting, and inventory management across industries including finance, retail, logistics, and government.

Managing costs in mainframe computing

But mainframe systems are not without their challenges, with cost chief among them. Not only do mainframes cost around $75,000 per unit, compared with just two or three thousand dollars for a typical server, but there are also ongoing software costs. And because software charges can be based on a variety of factors – a combination of baseline monthly rates, peak consumption, annual rates tied to total consumption, and the number of operations performed – they can be opaque, leaving IT teams bogged down in convoluted fee calculations and struggling to optimize costs.
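To make those fee mechanics concrete, here is a minimal Python sketch of one common pattern in consumption-based pricing: a flat baseline plus a charge keyed to the peak rolling four-hour average of processor (MSU) consumption. The rates, usage figures, and function names are illustrative assumptions, not any vendor's actual formula.

```python
# Illustrative sketch only: the rates and usage figures below are made up,
# and real mainframe software pricing involves many more variables.

def rolling_four_hour_average(hourly_msus, hour):
    """Average MSU consumption over the four hours ending at `hour`."""
    window = hourly_msus[max(0, hour - 3): hour + 1]
    return sum(window) / len(window)

def monthly_software_fee(hourly_msus, base_fee, rate_per_msu):
    """Hypothetical fee: flat baseline plus a charge on peak rolling usage."""
    peak = max(rolling_four_hour_average(hourly_msus, h)
               for h in range(len(hourly_msus)))
    return base_fee + peak * rate_per_msu

# A month of hourly MSU readings (720 hours), with a short spike mid-month.
usage = [300] * 300 + [900, 950, 1000, 980] + [300] * 416
print(monthly_software_fee(usage, base_fee=20_000, rate_per_msu=75))
```

Even this toy model shows why a brief usage spike can dominate the monthly bill – exactly the kind of effect IT teams need visibility into.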

The technology industry has tried to address this with tools that manage and optimize the performance of mainframe systems in line with internal financial and human resource constraints. However, the sheer number of operations that rely on mainframe resources means enterprises have continued to struggle to track the impact of individual operational activities. Until now, that is.



Unlocking real-time data insights to drive efficiency

Recent advances in enterprise data management are opening a real-time observability window into mainframe systems, in the process shining a light on the impact of various applications and use cases. Modern approaches to enterprise data management enable organizations to pull, enrich, and analyze datasets from various sources – meaning that disparate business functions, including finance, logistics, and marketing, can gain an instant understanding of how specific operations and projects affect the use, performance, and fees of mainframe resources.
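As a rough illustration of what that enrichment step can look like, the sketch below joins per-job mainframe usage with department metadata and attributes an estimated cost to each business function. The job names, column names, and dollars-per-MSU rate are invented for the example and do not reflect any particular product's schema.

```python
# Hypothetical illustration: the data, columns, and chargeback rate are
# assumptions, not a specific product's schema or API.
import pandas as pd

# Per-job mainframe usage exported from a monitoring pipeline.
usage = pd.DataFrame({
    "job":  ["PAYROLL1", "INV0042", "MKTG_ETL"],
    "msus": [1250, 430, 610],
})

# Business metadata maintained outside the mainframe.
projects = pd.DataFrame({
    "job":        ["PAYROLL1", "INV0042", "MKTG_ETL"],
    "department": ["Finance", "Logistics", "Marketing"],
})

# Enrich usage with ownership, then attribute estimated cost per department.
enriched = usage.merge(projects, on="job")
enriched["est_cost"] = enriched["msus"] * 2.5   # assumed $ per MSU
print(enriched.groupby("department")["est_cost"].sum())
```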

These real-time insights prove transformative for businesses that use them. They reveal bottlenecks and systemic inefficiencies, enabling IT to enhance performance. They track how different systems consume mainframe resources and allow IT to manage costs through capping usage and forecasting future demand. They also support regulatory compliance by providing transparent tracking, monitoring, and reporting of critical data.
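In the same spirit, usage capping and demand forecasting can be reduced to a toy calculation: project the next period's peak from recent history and check it against a configured cap. Real capacity-planning tools are far more sophisticated; the figures and helper names here are assumptions for illustration only.

```python
# Toy forecast and cap check, not a production capacity-planning model.

def forecast_next(month_peaks, window=3):
    """Naive forecast: average of the last `window` monthly peak MSU values."""
    return sum(month_peaks[-window:]) / window

def within_cap(forecast, cap_msus):
    """Flag whether forecast demand stays under the configured usage cap."""
    return forecast <= cap_msus

history = [880, 910, 940, 1005, 1030]          # assumed monthly peak MSUs
predicted = forecast_next(history)
print(predicted, within_cap(predicted, cap_msus=1000))
```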

With real-time insights into mainframe operations, IT teams are better able to optimize their systems for cost and performance efficiencies. It’s an approach that will help ensure that mainframe systems will continue to play a central role in enterprise computing for another 60 years, if not more.


The future of mainframe computing

What does this future look like? Businesses operate in a world where advanced technologies like artificial intelligence (AI) are already making waves and where quantum computing promises further revolutionary transformation to the way enterprises operate. Sustainability is, of course, another major consideration and one with direct relevance to power-hungry technologies. Where does mainframe computing fit in this fast-changing world?

New processors, such as IBM’s Telum, are already coming to market to ensure that mainframe computers can handle the massive volumes of data required for AI applications without compromising response times. As the cloud revolution gathers pace, and when, eventually, quantum machines are operationalized, hybrid infrastructure will become the de facto standard for most enterprises, and these infrastructures will have the latest generation of mainframe computers at their heart.

This hybrid integration will be made possible by tools that enhance the efficiency and accessibility of mainframe data by translating mainframe data formats – such as the EBCDIC encoding that dates from the 1960s – into modern formats like JavaScript Object Notation (JSON). Data from right across the hybrid infrastructure can then be combined in data lakes and exposed for analysis in real time, providing comprehensive insights from across the entire enterprise infrastructure to solve a broad variety of business and economic challenges.
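At the smallest scale, that translation layer amounts to something like the sketch below: decoding a fixed-width EBCDIC record into JSON. The record layout, field positions, and code page (cp037, one common EBCDIC variant) are assumptions for illustration, not the format of any specific system.

```python
# Minimal sketch: decoding a fixed-width EBCDIC record into JSON.
# Record layout and codec choice are illustrative assumptions.
import json

RAW_RECORD = "ACME CORP 0001250".encode("cp037")  # simulated mainframe bytes

def record_to_json(raw: bytes) -> str:
    text = raw.decode("cp037")              # EBCDIC -> Unicode text
    return json.dumps({
        "customer": text[:10].strip(),      # columns 1-10: customer name
        "balance_cents": int(text[10:17]),  # columns 11-17: amount stored as text
    })

print(record_to_json(RAW_RECORD))
```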

From a sustainability perspective, the same technological advances in system observability that are enabling organizations to control costs and improve performance will also help future-proof the technology. The insights gleaned by analyzing the impact of applications on mainframe resources enable firms to avoid waste by allocating only the bare minimum of resources to meet the needs of each use case. Observability data is just as vital to energy efficiency as it is to cost efficiency.


The workhorse of the enterprise

Mainframes have long served as the workhorse of enterprise IT, and they will continue to play a crucial role in the hybrid systems of the future. The continued relevance and value of the mainframe owes much to new real-time observability data that helps IT teams manage cost and performance, increase efficiency, and unlock promising new insights. It is a virtuous circle where real-time data optimizes mainframe systems so that they can continue to serve as a vital source of business intelligence.

Jacky Hofbauer

Jacky Hofbauer is President & CSO of Zetaly. With over 30 years of work experience, Jacky is a passionate and driven leader in the field of mainframe software development and IT service management. In his current role, he helps customers to observe, manage and optimize the performance of mainframe and other IT ecosystems in line with financial and human resource strategies.
