Sponsored by Kinetica

Using Geospatial Analytics to Optimize New Telecom Services

Telecom providers like T-Mobile are turning to geospatial analytics to help plan deployments and optimize quality of service and coverage.

Telecommunications providers must take a large number of factors into account when planning service coverage and deployments. They need to accommodate rapidly changing service demands, usage patterns, and the dynamic nature of the regions they cover. The issues will only get more complex as providers support new services like 5G and the data harvesting requirements from the growing number of IoT devices businesses use.

What’s needed is the ability to analyze massive geospatial data sets to help make the best possible, highest-ROI network improvement and site buildout decisions. One problem preventing this is that the databases commonly used by telecom providers were never designed to handle the location-enriched data that modern geospatial analytics requires.

5G: The perfect example of what’s changed

5G network planning, deployment, and management is a good example of the modern challenges providers face with new services.

5G operates at higher frequencies than previous-generation services like 3G and 4G, so its signal carries over shorter distances. That greatly impacts antenna placement. A provider could put a 3G or 4G antenna on top of a hill or the tallest building and serve thousands or tens of thousands of subscribers. In contrast, a provider must install many more 5G antennas, often deployed on lampposts or the sides of buildings.
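
The physics behind that tradeoff can be shown with the standard free-space path loss formula. The short Python sketch below compares the loss over the same one-kilometer link for a 1.9 GHz mid-band carrier and a 28 GHz millimeter-wave carrier; the specific frequencies and distance are illustrative assumptions, and real planning must also account for obstructions, antenna gains, and transmit power.

    import math

    def fspl_db(distance_m, freq_hz):
        # Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)
        return (20 * math.log10(distance_m)
                + 20 * math.log10(freq_hz)
                + 20 * math.log10(4 * math.pi / 3e8))

    # Same 1 km link, two illustrative carriers: a 1.9 GHz mid-band signal
    # versus a 28 GHz 5G millimeter-wave signal.
    for label, freq in [("1.9 GHz (4G)", 1.9e9), ("28 GHz (5G mmWave)", 28e9)]:
        print(f"{label}: {fspl_db(1000, freq):.1f} dB over 1 km")

The 28 GHz carrier loses roughly 23 dB more over the same distance, which is one reason 5G cells must be placed far more densely than their 3G and 4G predecessors.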

Being at a lower height means many more things can obstruct a signal. For example, a fast-changing community might have new structures going up constantly, construction on existing buildings, or other activity that can block a signal.

Additionally, demand for 5G services can shift greatly throughout the day. Commuters coming into a city drive up demand for service during business hours. Home users and apartment dwellers gobble up bandwidth streaming video in the evenings.

These are all reasons why providers need geospatial analytics.   

See also: Booming Data Volumes and Velocities Require Vectorized Databases for Real-Time Analytics

User example: T-Mobile

That was the case with T-Mobile, which sought to optimize quality of service and coverage with advanced geospatial analytics.

Its existing solutions were not ideal for the new demands. They imposed analytical limitations on data processing, and T-Mobile could not extract the full value of its data or provide a sufficient rationale for investment prioritization.

Like others, T-Mobile used an industry-standard solution built on the open-source PostGIS spatial database extender for the PostgreSQL object-relational database. The solution required continuous optimization that was unsustainable given the variety and ad-hoc nature of the company’s work, and it experienced frequent outages.
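
To give a sense of the workload involved, the Python sketch below shows the kind of ad-hoc spatial query such a PostGIS/PostgreSQL setup would be asked to run, here counting subscriber locations with no cell site within 500 meters. The connection string, table names (subscribers, cell_sites), and geometry column (geom) are hypothetical placeholders, not T-Mobile’s actual schema.

    import psycopg2  # standard PostgreSQL driver; assumes a PostGIS-enabled database

    # Hypothetical connection and schema, for illustration only.
    conn = psycopg2.connect("dbname=coverage user=analyst")

    # Count subscriber locations with no cell site within 500 m.
    # ST_DWithin on geography values measures distance in meters.
    query = """
        SELECT COUNT(*)
        FROM subscribers s
        WHERE NOT EXISTS (
            SELECT 1
            FROM cell_sites c
            WHERE ST_DWithin(s.geom::geography, c.geom::geography, 500)
        );
    """

    with conn, conn.cursor() as cur:
        cur.execute(query)
        print("Uncovered subscriber points:", cur.fetchone()[0])

On a few thousand rows a query like this is trivial; across billions of frequently refreshed records it becomes the sort of work that demands the constant tuning described above.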

Desired/required characteristics

T-Mobile needed a new database that could handle the large amount of rapidly ingested geospatial and real-time data. Some of the key capabilities it sought in a new solution included the ability to:

  • Analyze massive geospatial data sets of billions of records, add more processing power, and apply complex geospatial and predictive modeling from a single unified database.
  • Acquire geospatial processing (above and beyond what was available with its existing solution) to support rapid network gap analysis; a simplified sketch of that kind of analysis follows this list.
  • Predict network build or coverage ROI to rationalize spend prioritization decisions.
  • Provide interactive, real-time analysis in a context the business already understands.
  • Use predictive analysis and advanced analytics more broadly to generate deeper network and subscriber insights.
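
As a rough illustration of the gap-analysis item above, the Python sketch below buffers each cell site by an assumed 300-meter footprint, merges the buffers into a single coverage area, and flags demand points that fall outside it. The file names, the 300 m radius, and the meter-based EPSG:3857 projection are illustrative assumptions, not details from T-Mobile’s deployment.

    import geopandas as gpd                 # assumes geopandas and shapely are installed
    from shapely.ops import unary_union

    # Hypothetical inputs: point layers of cell sites and demand locations,
    # reprojected to a meter-based coordinate system so buffers are in meters.
    sites = gpd.read_file("cell_sites.geojson").to_crs(epsg=3857)
    demand = gpd.read_file("demand_points.geojson").to_crs(epsg=3857)

    # Approximate each site's footprint as a 300 m buffer, merge the buffers
    # into one coverage area, and keep the demand points left uncovered.
    coverage = unary_union(list(sites.geometry.buffer(300)))
    gaps = demand[~demand.geometry.within(coverage)]

    print(f"{len(gaps)} of {len(demand)} demand points fall outside estimated coverage")

In practice, the same analysis must run against billions of records and a far richer propagation model, which is where the requirement for a single unified database comes from.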

After a thorough and stringent evaluation, T-Mobile selected Kinetica, the database for time and space.

About Salvatore Salamone

Salvatore Salamone is a physicist by training who has been writing about science and information technology for more than 30 years. During that time, he has been a senior or executive editor at many industry-leading publications including High Technology, Network World, Byte Magazine, Data Communications, LAN Times, InternetWeek, Bio-IT World, and Lightwave, The Journal of Fiber Optics. He also is the author of three business technology books.
