HANA adds database-as-a-service for public clouds and embedded data virtualization.
SAP at the SAP Sapphire Now 2019 conference this week doubled down on its bid to infuse real-time analytics within transaction processing applications.
At the core of the strategy is a multi-model (also known as multi-modal) in-memory database that SAP has been developing for over ten years. At Sapphire, the company announced it is making HANA available as a database-as-a-service offering on public clouds from Amazon Web Services (AWS), Microsoft and Google.
Because HANA runs in memory, it’s possible to process multiple jobs involving different types of data simultaneously using multicore processors running on either a multitenant cloud service or in an on-premises IT environment.
“We think of HANA as a system of systems,” says Gerrit Kazmaier, senior vice president for SAP HANA & Analytics. “It’s a data fabric.”
Virtual Data Marts
SAP has also embedded a data virtualization capability that makes it possible to connect HANA to data residing outside the system, thereby eliminating the need to always move data into the HANA database. HANA also makes it possible to isolate data sets and their associated schemas within a virtual data mart, also known alternatively as SAP Data Space or a database container. Multiple application services can access a single database container, or each service can be allocated its own database container.
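To make the query-in-place idea behind data virtualization concrete, here is a deliberately simplified analogy. SQLite's ATTACH stands in for HANA's virtual-table mechanism: the "local" database joins against data that lives in a separate database without first copying it in. HANA's actual smart data access works quite differently, and all file, table, and column names below are hypothetical.

```python
# Analogy only: ATTACH lets one SQLite database query another in place,
# illustrating the data-virtualization pattern of connecting to external
# data rather than moving it into the local store.
import os
import sqlite3
import tempfile

# A stand-in "remote" source that owns some reference data.
remote_path = os.path.join(tempfile.mkdtemp(), "remote.db")
remote = sqlite3.connect(remote_path)
remote.execute("CREATE TABLE regions (id INTEGER, name TEXT)")
remote.execute("INSERT INTO regions VALUES (1, 'EMEA'), (2, 'APAC')")
remote.commit()
remote.close()

# The "local" database attaches the remote one and joins against it
# in place -- no rows are copied into the local database.
local = sqlite3.connect(":memory:")
local.execute("CREATE TABLE orders (id INTEGER, region_id INTEGER)")
local.execute("INSERT INTO orders VALUES (100, 1), (101, 2)")
local.execute("ATTACH DATABASE ? AS remote", (remote_path,))

rows = local.execute(
    "SELECT o.id, r.name FROM orders o "
    "JOIN remote.regions r ON o.region_id = r.id ORDER BY o.id"
).fetchall()
print(rows)  # [(100, 'EMEA'), (101, 'APAC')]
```

The point of the sketch is only the shape of the query: the join reads the remote table where it lives, which is the need data virtualization eliminates when it spares you from always loading data into HANA first.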
At the core of HANA is a columnar database that has been extended so that, in addition to processing analytics, it can also process transactions as if it were a relational database, as well as handle spatial data. The ability to infuse analytics processing within transactions makes it possible, for example, to identify potentially fraudulent transactions in real time.
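The fraud-detection example above can be sketched in a few lines. This is a conceptual illustration only: SQLite stands in for HANA, and the table, column, and function names are hypothetical. The pattern it shows is the one the article describes, namely an analytic query (is this amount a statistical outlier for this account?) running in the same database at transaction time rather than in a separate warehouse.

```python
# Conceptual sketch: run an analytic check inside the transactional path.
# SQLite is a stand-in for HANA; all names here are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (account TEXT, amount REAL)")

# Seed some historical transactions for one account.
conn.executemany(
    "INSERT INTO payments VALUES (?, ?)",
    [("acct-1", a) for a in (20.0, 25.0, 30.0, 22.0, 28.0)],
)

def record_payment(account: str, amount: float, factor: float = 10.0) -> bool:
    """Record the transaction and immediately run an analytic aggregate
    over the same data. Returns True if the payment looks fraudulent,
    i.e. exceeds `factor` times the account's historical average."""
    avg = conn.execute(
        "SELECT AVG(amount) FROM payments WHERE account = ?", (account,)
    ).fetchone()[0]
    conn.execute("INSERT INTO payments VALUES (?, ?)", (account, amount))
    return avg is not None and amount > factor * avg

print(record_payment("acct-1", 24.0))    # ordinary payment -> False
print(record_payment("acct-1", 5000.0))  # large outlier -> True
```

In a traditional split architecture the AVG query would run hours later in a warehouse; the point of processing analytics alongside transactions is that the check happens while the payment can still be stopped.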
This capability can also reduce the strain that analytics applications put on data warehouses. SAP says there is still a need to run a data warehouse on top of HANA, primarily to analyze HANA data alongside third-party data sources. But as analytics is processed alongside transactions, the need to rely on a data warehouse to process every analytics function is reduced.
SAP also notes that multi-model databases reduce costs because they eliminate the need to stand up separate stacks of IT infrastructure to support different classes of databases.
Combining transactions and analytics in real time is not exactly a new idea. IBM has been providing similar capabilities on a mainframe for years. The SAP approach, however, relies on commodity x86 processors, and SAP contends HANA achieves that goal more efficiently and at a much lower cost.
SAP chairman Hasso Plattner told Sapphire attendees that there are now more than 50,000 customer licenses for HANA, with one unidentified customer processing 100 million transactions daily. Another unidentified customer runs 48 TB on a single node, while the largest scale-out deployment holds 72 TB, says Plattner. SAP expects those numbers to increase dramatically as HANA becomes available on public clouds, especially as third-party developers discover it has become easier not only to access a HANA database, but also to rely on SAP to continually maintain and update the database service.
The real challenge, however, may be getting developers to appreciate what can be accomplished using HANA as an alternative to Oracle or open source databases. SAP has a clear need for HANA to infuse real-time analytics into its S/4 suite of ERP applications, but it also needs to engage third-party developers more aggressively.
In the meantime, if data is the new oil, SAP wants to become the company that provides all the refineries needed to turn data into actionable intelligence, which SAP is trying to brand as “the intelligent enterprise.”
Of course, intelligent is a relative term. SAP might be well advised to be a little circumspect about identifying one customer as being more intelligent than another. But whatever the path forward, the one thing that is certain is the days when enterprise IT organizations relied primarily on batch processing are coming to an end.