
South Big Data Hub Part of New Nationwide Data Storage Network


The Open Storage Network will enable researchers to manage big data more efficiently than ever before.

Written By
Sue Walsh
Jul 31, 2018

The South Big Data Hub announced its selection as one of four regional big data hubs awarded a $1.8 million grant from the National Science Foundation. The hub will use the grant for a two-year project to develop a data storage network.

A collaborative team will use its expertise and resources to develop the Open Storage Network. The network will allow academics nationwide to work together and share their data quickly and easily.

Project Leaders & Participants

Alex Szalay of Johns Hopkins University will lead the project with the assistance of data storage partners across the United States, including the National Data Service and members representing each of the four NSF-funded Big Data Regional Innovation Hubs (BD Hubs):

  • South Big Data Hub at the Renaissance Computing Institute (RENCI)
  • Georgia Institute of Technology
  • West Big Data Hub at the San Diego Supercomputer Center (SDSC)
  • Midwest Big Data Hub at the National Center for Supercomputing Applications (NCSA)
  • Northeast Big Data Hub at the Massachusetts Green High Performance Computing Center (MGHPCC)
  • Pittsburgh Supercomputing Center (PSC)


“There are hundreds of issues to solve before we have a ‘datanet’ as efficient and cooperatively well organized as the internet,” said Christine Kirkpatrick, executive director of the National Data Service and co-chair of the Big Data Hubs’ Data Sharing and Cyberinfrastructure Working Group. “But just as there are many complexities to solve in the long term, these solutions can be facilitated by the simplest of additions – here the establishment of a connected storage network. Sharing, reproducibility, replication, and data management all fundamentally rely on a place to store data.”

The data transfer systems for the new storage network are designed to keep pace with a 100-gigabit connection while remaining low-cost and high-capacity, relying on a small number of high-bandwidth nodes.
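For a rough sense of what that 100-gigabit figure means in practice, the back-of-envelope Python sketch below estimates how long a large transfer would take at full line rate; the one-petabyte dataset size is an illustrative assumption, not a figure from the project.

LINK_GBPS = 100                     # the 100-gigabit link speed cited above
DATASET_PETABYTES = 1               # hypothetical dataset size, for illustration only

bytes_per_second = LINK_GBPS * 1e9 / 8      # 12.5 GB/s at full line rate
dataset_bytes = DATASET_PETABYTES * 1e15

hours = dataset_bytes / bytes_per_second / 3600
print(f"{DATASET_PETABYTES} PB at {LINK_GBPS} Gbps takes about {hours:.1f} hours")
# Prints: 1 PB at 100 Gbps takes about 22.2 hours, assuming the link stays saturated

In practice, protocol overhead and disk speeds would stretch this estimate, which is part of why sustained 100-gigabit transfers are a notable design target.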

Sue Walsh

Sue Walsh is a News Writer for RTInsights and a freelance writer and social media manager living in New York City. Her specialties include tech, security, and e-commerce. You can follow her on Twitter at @girlfridaygeek.
