Why PostgreSQL should power GPU databases and analytics platforms


A turbo-charged version of PostgreSQL uses GPUs to massively accelerate what the popular database already does so well.

Within the world of SQL, the lingua franca of data processing, PostgreSQL leads the field among open source databases, and for good reason: it is one of the most powerful and enterprise-ready open source platforms available.

Not only is PostgreSQL the fourth most popular database currently in use, according to DB-Engines, but its popularity is also growing faster than that of any other database in the top 10, including MongoDB and Elasticsearch.

To list just a few of the more widely used features and benefits of PostgreSQL:

  • It is SQL compliant and extensible, with full subquery support
  • It supports a variety of procedural and scripting languages
  • It is easy to use and easy to administer
  • It has an extensive range of data connectors, including ODBC and JDBC
  • It is highly extensible, with support for full-text search and geolocation
  • It has multiple NoSQL-like features for document (JSON) and key-value use cases
  • It has a very active user community
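The document (JSON) point above can be sketched in plain SQL. This is a minimal illustration of PostgreSQL’s built-in jsonb type and operators; the table name and sample data are invented for the example, not taken from the article:

```sql
-- Illustrative table using the jsonb document type
CREATE TABLE events (
    id      serial PRIMARY KEY,
    payload jsonb NOT NULL
);

INSERT INTO events (payload)
VALUES ('{"user": "alice", "action": "login", "meta": {"ip": "10.0.0.1"}}');

-- ->> extracts a field as text; @> tests JSON containment
SELECT payload ->> 'user' AS username
FROM events
WHERE payload @> '{"action": "login"}';

-- A GIN index keeps containment queries fast as the table grows
CREATE INDEX events_payload_idx ON events USING GIN (payload);
```

This is the sense in which PostgreSQL covers document-store use cases without abandoning SQL: the same table can be joined, indexed and queried alongside conventional relational data.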

To top all of that, PostgreSQL’s popularity is boosted again by the fact that it can directly access third-party data sources like Oracle, MySQL, MongoDB and many others through its foreign data wrapper mechanism. It is this capability of interacting easily with multiple existing data sources that makes PostgreSQL a favorite among enterprises running several systems side by side.
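As a sketch of how that cross-source access works in practice: PostgreSQL’s foreign data wrapper machinery exposes a remote database as local tables (postgres_fdw ships with PostgreSQL; wrappers for MySQL, Oracle and MongoDB are third-party extensions). The server name, host and credentials below are placeholders:

```sql
-- Expose a remote PostgreSQL database as local foreign tables.
-- 'legacy_db', the host and the credentials are placeholders.
CREATE EXTENSION postgres_fdw;

CREATE SERVER legacy_db
    FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host 'legacy.example.com', dbname 'sales');

CREATE USER MAPPING FOR CURRENT_USER
    SERVER legacy_db
    OPTIONS (user 'report_user', password 'secret');

-- Pull the remote table definitions into a local schema in one statement
CREATE SCHEMA remote_sales;
IMPORT FOREIGN SCHEMA public FROM SERVER legacy_db INTO remote_sales;

-- Remote tables now behave like local ones
SELECT count(*) FROM remote_sales.orders;
```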


It’s no wonder, then, that many applications today run on PostgreSQL – from CRM solutions like Salesforce to financial accounting applications like LedgerSMB. From Google and VMware to Apple and Skype, PostgreSQL has powerful friends.

All of that is, of course, great, but what many do not know is that there is now a turbo-charged version of PostgreSQL available that uses graphics processing units (GPUs) to massively accelerate what PostgreSQL already does so well.

Why a GPU database running under PostgreSQL is a clever idea

Companies have data warehouses and business intelligence stacks that are built either on some version of PostgreSQL or on a relational database that uses SQL at its core. When they look to get more from these investments and reap the benefits a GPU database delivers, it makes things a lot easier if the GPU database can integrate readily with what is already in place.


A lot of time and effort has already been invested in the status quo. By using a GPU-accelerated database that works with PostgreSQL, these investments can be turbocharged to deliver even more value, but it is essential that the new database offers seamless integration and does not require an overhaul of legacy systems. After all, what is the point of introducing a GPU database if a lot of time has to be spent reinventing the wheel?

A GPU database that works with PostgreSQL offers several impressive advantages.

  1. Time to Value: Working with a GPU database that already runs on PostgreSQL allows a company to get up and running quickly and realize value within hours of deployment. Because such a database integrates seamlessly with existing technology, it can tap into existing data sources directly.
  2. Visualization Gains: Many companies already use powerful visualization tools like Tableau, MicroStrategy and Microsoft Power BI. A considerable amount of blood, sweat and tears has gone into creating visualizations that are bespoke to each company and client, and with a GPU database that supports PostgreSQL, the fruits of this hard work run far faster than before, right out of the box. Results for complex queries can be obtained at lightning speed using existing tools, without impacting any analytics or visualization workflows.
  3. Cost: Most companies today have dedicated thousands of man-hours to writing code for processing data, as well as spending time and money on setting up their hardware resources. They have also generally developed custom dashboards to visualize these outputs. With a PostgreSQL-compliant GPU database, companies can continue to use their existing code out of the box, with the added benefit of GPU speed, sidestepping the integration costs typically associated with adopting new technology.
  4. Data Federation: Working with PostgreSQL, a GPU database can easily connect to a multitude of data sources with a single line of code. There are multiple benefits to bringing different data sources to a single point for processing. The user experience is obviously improved, not to mention a reduction in development effort and cost. On top of all that, you now have one single data lake that incorporates all your data, which can help identify and reduce data discrepancies.
  5. Tap Into Existing Skills: PostgreSQL and SQL databases are used almost everywhere, and most companies already have the required skill sets in house. Rather than adopting a whole new language, where you have to either bring in experts or train your team in a whole new skill set, you can simply make use of what you have. There is less of a learning curve, which often means fewer mistakes, less effort, and happier internal and external clients and shareholders.
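The federation point above can be made concrete. Once external sources are mapped in as foreign tables (for example via PostgreSQL’s foreign data wrapper extensions), a single SQL statement can join them with local data; every table name in this sketch is hypothetical:

```sql
-- Illustrative federated query. local_customers is an ordinary local
-- table; external_orders is assumed to be a foreign table mapped to an
-- outside system such as MySQL. Both names are invented for this sketch.
SELECT c.name, sum(o.total) AS revenue
FROM local_customers c
JOIN external_orders o ON o.customer_id = c.id
GROUP BY c.name
ORDER BY revenue DESC;
```

From the application’s point of view there is just one SQL endpoint, which is the "single point for processing" the federation argument describes.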

As you can see, there are many cost, time and speed benefits to having your GPU database run on PostgreSQL. As companies, especially those with legacy systems, evaluate which GPU database to choose, they should consider one that works with PostgreSQL.


About Richard Heyns

Richard Heyns is CEO of Brytlyt. He was responsible for the initial and ongoing research into bringing database operations to general-purpose computing on graphics processing units. He has 15 years of experience working on large business intelligence, big data and software development projects.
