Web Data Integration Provider Rolls Out Big Data Platform Updates

The new enhancements focus on data quality, ease and speed of data delivery, and data analytics.

A web data integration solution provider has launched new capabilities for its Web Data Integration (WDI) platform. The new features include faster data extraction, comprehensive analytics capabilities, and a data quality metrics dashboard. All are powered by the company's WDI Knowledge Engine, which draws on the experience gained from billions of website data extractions.

“Web Data Integration is a new philosophy that empowers businesses to take full advantage of web data for mission-critical business purposes. It is defined by high-quality data that is rapidly delivered and integrated directly into business processes with low resource requirements and little to no business risk for the user,” said Gary Read, the company's CEO. “In the same way that SaaS platforms for business intelligence, CRM, and hundreds of other disciplines have taken away the need for engineers to write their own solutions, [the company] is doing the same for the web data market. No more coding, no more bad data, no more worrying about whether the data will arrive on time.”

See also: Is real-time data delivery really that important?

The platform delivers high-quality web data at enterprise scale without the need to write code or monitor quality, making it more cost-effective for organizations than building and staffing their own engineering or IT teams for the task. According to a report from Opimas, total spend on web data integration is expected to reach $5 billion this year.
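To make the "no more coding" claim concrete, below is a toy sketch of the kind of hand-written extraction script that such platforms aim to replace. It is purely illustrative and not the company's product or API: it parses a hard-coded HTML snippet with Python's standard-library `html.parser`, and the `name`/`price` CSS classes are hypothetical.

```python
from html.parser import HTMLParser

class PriceExtractor(HTMLParser):
    """Pulls (name, price) pairs out of a product listing page."""

    def __init__(self):
        super().__init__()
        self.records = []   # completed {"name": ..., "price": ...} dicts
        self._field = None  # which field the next text node belongs to
        self._current = {}  # record currently being assembled

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if cls in ("name", "price"):
            self._field = cls

    def handle_data(self, data):
        if self._field and data.strip():
            self._current[self._field] = data.strip()
            self._field = None
            if "name" in self._current and "price" in self._current:
                self.records.append(self._current)
                self._current = {}

def extract_products(html_text):
    parser = PriceExtractor()
    parser.feed(html_text)
    return parser.records

# Stand-in for a fetched page; a real script would also handle fetching,
# retries, pagination, and site layout changes.
SAMPLE = """
<ul>
  <li><span class="name">Widget A</span> <span class="price">$9.99</span></li>
  <li><span class="name">Widget B</span> <span class="price">$14.50</span></li>
</ul>
"""

print(extract_products(SAMPLE))
# [{'name': 'Widget A', 'price': '$9.99'}, {'name': 'Widget B', 'price': '$14.50'}]
```

Even this small example is brittle: any change to the site's markup silently breaks it, which is the maintenance burden the quoted "no more coding" pitch is aimed at.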

According to the company, new features include:

  • WDI Knowledge Engine: The company has captured the knowledge, experience, and technical solutions developed across hundreds of thousands of users, millions of sites, and billions of pages, and infused that intelligence into the platform. The WDI Knowledge Engine lets web data be accessed faster and more extensively while offering simplicity for the user, shorter time to market, and higher data accuracy.
  • Quality Dashboards: This new capability exposes direct quality metrics about the extracted data to the customer via a built-in dashboard. IBM estimates that poor-quality data costs businesses in the U.S. more than $3 trillion annually. The new quality dashboards provide transparency so users can quickly assess the quality and value of the extracted data.
  • Data Quality Service Level Agreements: With web data becoming mission-critical, businesses must be able to depend on consistently high-quality data integration. To meet this need, the company now offers its enterprise managed-service customers service level agreements (SLAs) that specify data quality measurements; it says it is the first and only provider to offer a data quality SLA.
  • Insights Platform: The platform offers a complete analytics capability that delivers value quickly through comprehensive data visualization and notification tools. Combined with the company's data extraction and transformation/preparation tools, customers can now go from URL to dashboard in a single solution.
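The article does not detail which metrics the quality dashboards report. One simple, common measure such a dashboard might surface is per-field completeness (the fill rate of each required field across extracted records). The sketch below is a hedged illustration of that idea; the field names and the metric choice are assumptions, not the company's actual implementation.

```python
def field_fill_rates(records, required_fields):
    """Fraction of records with a non-empty value for each required field."""
    total = len(records)
    rates = {}
    for field in required_fields:
        filled = sum(1 for r in records if r.get(field) not in (None, ""))
        rates[field] = filled / total if total else 0.0
    return rates

# Hypothetical extraction output with some missing prices.
extracted = [
    {"name": "Widget A", "price": "$9.99"},
    {"name": "Widget B", "price": ""},  # price came back empty
    {"name": "Widget C"},               # price field absent entirely
]

print(field_fill_rates(extracted, ["name", "price"]))
# {'name': 1.0, 'price': 0.3333333333333333}
```

A dashboard built on metrics like these lets a user spot at a glance that, say, only a third of records carried a usable price, before that data feeds a downstream business process.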

“Corporations are increasingly turning to the web to extract and integrate data into mission-critical decision-making processes, ranging from pricing to risk management, to investment research, to supply chain management,” said Octavio Marenzi, CEO of Opimas LLC. “However, the legacy technologies frequently used are not up to the task, and firms are seeing the need to rely on far more sophisticated Web Data Integration offerings such as [the company's].”

Sue Walsh

About Sue Walsh

Sue Walsh is a news writer for RTInsights and a freelance writer and social media manager living in New York City. Her specialties include tech, security, and e-commerce. You can follow her on Twitter at @girlfridaygeek.
