Snowflake is the only database that lets you share access to your data across two of the “big three” of “big cloud.”
Snowflake Computing at the Microsoft Ignite 2018 conference today announced the general availability of its relational database service on the Microsoft Azure cloud.
Previously available only on the Amazon Web Services (AWS) public cloud, the service now puts Snowflake Computing in a unique position, says CEO Bob Muglia: it is the only provider of a database through which organizations can share access to live data across both AWS and Azure.
Muglia says Snowflake Computing can deliver that capability through its previously announced Snowflake Data Sharing feature, which today allows both structured and unstructured data to be shared within a single cloud region. Snowflake will soon extend that multitenant capability to span multiple regions across multiple cloud services, he says.
In general, Muglia says the rise of artificial intelligence (AI) is accelerating the rate at which data is shifting into cloud data warehouses. Organizations are starting to realize that for AI to work they need to cost-effectively aggregate massive amounts of data. The issue they encounter now is that legacy relational databases were never designed to handle petabytes of data, while so-called NoSQL platforms such as Hadoop have proven cumbersome to deploy and manage, says Muglia. Snowflake, he notes, provides access to massive amounts of data using a familiar relational database construct.
“You need to have your data under control before you can apply advanced analytics,” says Muglia.
Muglia says that as more organizations seek to share data and analytics across networks of business partners, many will look for ways to accomplish that goal without getting locked into a specific cloud service provider. Over time, organizations will increasingly decide where to run workloads based on latency as they look to process transactions and analytics in near real time, he adds. Most new applications will process continuous streams of data rather than rely on legacy batch-oriented approaches to analytics. There will still be a need for batch-oriented analytics applications, but Muglia says there’s a definite shift toward real-time computing.
Much of that analytics going forward will be augmented by machine and deep learning algorithms. The need for humans to analyze that data is not going to go away any time soon. But increasingly, bots accessing various classes of algorithms will surface patterns that warrant further investigation by a human analyst.
Of course, not every vertical industry is moving down this path at the same pace. Muglia notes that the media, retail, and financial services sectors are significantly ahead in terms of embracing augmented analytics. But there has also been some notable progress in healthcare as of late.
It may take a while for AI, advanced analytics, data streaming, and Big Data to converge. But it’s already apparent that enterprise IT is starting to transform at an accelerated rate. Less clear is which organizations will fail to recognize the degree to which a variety of enabling technologies are about to utterly transform the competitive landscape.