Reducing Data Preparation Costs & Increasing ROI with Mobile Location Overlap Analysis

By Suman Joshi

The location data industry draws on multiple sources of data of varying quality. As a result, suppliers often have a significant number of overlapping records. While this does not affect the quality or validity of the data itself, it poses a challenge for buyers who need unique values for their analyses.

Most off-the-shelf vendors leave this unaddressed as it makes their data volumes appear larger. However, when buyers prepare the data for analysis by removing overlaps, they often see a significant drop in overall volumes. 


Quadrant, however, actively identifies overlapping values at the supply level to determine the value each supplier delivers. Our data science team has developed a sophisticated overlap analysis model to address this issue. It helps us maintain a high-quality data feed while keeping costs down, and lets us qualify suppliers based on the unique data values they deliver rather than sheer volume, all of which greatly benefits our buyers.

 

What is overlap analysis and why is it needed?  

Overlap analysis measures the level of commonality within a group of supplier feeds. It helps us gauge how unique and how similar two or more supplier data feeds are, and the extent to which they overlap.
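To make this concrete, one simple way to quantify overlap between two supplier feeds is the share of device IDs they have in common (the Jaccard index). The short Python sketch below is an illustration only; the supplier names and device IDs are hypothetical, not Quadrant's actual model.

```python
# Illustrative sketch: overlap between two supplier feeds, measured as the
# Jaccard index of their device ID sets. All names and IDs are hypothetical.

def overlap_ratio(feed_a: set, feed_b: set) -> float:
    """Share of device IDs appearing in either feed that are common to both."""
    if not feed_a and not feed_b:
        return 0.0
    return len(feed_a & feed_b) / len(feed_a | feed_b)

supplier_a = {"device_001", "device_002", "device_003"}
supplier_b = {"device_002", "device_003", "device_004"}

print(f"Overlap between feeds: {overlap_ratio(supplier_a, supplier_b):.0%}")  # 50%
```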

Quadrant has a varied and intricate system of data sources, including our proprietary SDK, which gathers GPS (Global Positioning System) signals from opted-in devices. We wanted to optimise this system for our buyers, who face long data processing times because of duplicated values and tedious data preparation.

Most data is priced by volume, which means customers pay more for the same data if there is significant overlap between data feeds, and that in turn means lower ROI. This gets even more complicated when new suppliers are added, since their feeds are likely to contain overlapping data: data we might already have or be getting from another supplier.

Goals

  • Maintaining and providing buyers with a voluminous, high-quality, and fast data feed to improve the ROI of their analyses 
  • Eliminating feeds with high overlap, reducing processing effort and data preparation costs 
  • Pre-determining the nature and quality of feeds to offer optimal customisation for each buyer’s use case and maximise the potential value they get 

Benefits

 

Maintaining uniqueness of the centralised data feed 

The uniqueness of a data feed indicates how many of its values are contributed by a specific supplier and by no other feed in the data pool. By determining uniqueness, we are able to maintain a centralised data feed that correctly reflects actual volumes and helps buyers make faster purchase decisions. It also helps buyers avoid the extra costs of preparing the data for analysis. Our final feed also goes through a sophisticated deduplication and noise-filtering algorithm that weeds out invalid and duplicate values, further improving uniqueness.
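As a rough sketch of what such a uniqueness measure can look like, the snippet below computes, for each supplier, the share of its device IDs that no other feed in the pool contributes. The feed names and device IDs are hypothetical examples; Quadrant's actual model is more elaborate.

```python
# Illustrative sketch: per-supplier uniqueness, i.e. the share of a supplier's
# device IDs that no other feed in the pool contributes. Names and IDs are hypothetical.
from collections import Counter

feeds = {
    "supplier_a": {"d1", "d2", "d3", "d4"},
    "supplier_b": {"d3", "d4", "d5"},
    "supplier_c": {"d4", "d6"},
}

# Count how many feeds contribute each device ID across the whole pool.
id_counts = Counter(device_id for feed in feeds.values() for device_id in feed)

for name, feed in feeds.items():
    unique = {d for d in feed if id_counts[d] == 1}
    print(f"{name}: uniqueness = {len(unique) / len(feed):.0%}")
```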

 

Measuring and delivering high business value 

To measure business value, we need to determine the irreplaceable merit of a data feed relative to the entire data pool. We measure the differences and unique fields of the various supplier feeds to understand how much impact adding or removing them has on the overall data pool. For example, a data feed might provide coverage that is valuable for certain regions and use cases, or contain values that the other feeds do not provide, even while carrying a sizeable percentage of overlapping data. Measuring the impact of a feed helps us decide whether to eliminate or retain it based on the interests and demands of our buyers and their unique business applications. 
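One simple way to approximate this kind of impact measurement is to ask how many unique device IDs would disappear from the pooled dataset if a given feed were dropped. The sketch below does that with hypothetical feeds; it is an illustration, not Quadrant's actual valuation model.

```python
# Illustrative sketch: marginal contribution of each feed, i.e. the device IDs
# that would disappear from the pooled dataset if that feed were removed.
# Feed names and IDs are hypothetical.

feeds = {
    "supplier_a": {"d1", "d2", "d3"},
    "supplier_b": {"d2", "d3", "d4", "d5"},
    "supplier_c": {"d5", "d6"},
}

full_pool = set().union(*feeds.values())

for name in feeds:
    pool_without = set().union(*(f for n, f in feeds.items() if n != name))
    lost = full_pool - pool_without
    print(f"Dropping {name} loses {len(lost)} device IDs: {sorted(lost)}")
```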

 

Managing and normalising costs  

When delivering data feeds, the costs to process, package and send the data can skyrocket if not planned properly. We need to ensure that we provide buyers with the best combination of feeds: as many unique devices or events as possible, with minimal overlap. Our analysis helps us keep costs under control, allowing us to stabilise the price we offer to our buyers. By including or eliminating a feed based on the value it provides for a particular use case, we can reduce costs further; customers only pay for what they really need and get their desired ROI. 
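As a loose illustration of that trade-off, a greedy selection that repeatedly picks the feed adding the most new device IDs per unit of cost captures the idea of maximising unique coverage while keeping spend down. The feed contents and per-feed costs below are hypothetical, and this sketch is far simpler than the model described above.

```python
# Illustrative sketch: greedily select feeds that add the most new device IDs
# per unit of cost. Feed contents and costs are hypothetical.

feeds = {
    "supplier_a": {"d1", "d2", "d3", "d4"},
    "supplier_b": {"d3", "d4", "d5"},
    "supplier_c": {"d4", "d5", "d6", "d7"},
}
costs = {"supplier_a": 100, "supplier_b": 80, "supplier_c": 120}

covered = set()
selected = []

while len(selected) < len(feeds):
    # Pick the remaining feed with the most new devices per cost unit.
    best = max(
        (n for n in feeds if n not in selected),
        key=lambda n: len(feeds[n] - covered) / costs[n],
    )
    new_devices = feeds[best] - covered
    if not new_devices:  # remaining feeds are fully overlapping; stop
        break
    selected.append(best)
    covered |= new_devices

print("Selected feeds:", selected)
print("Unique devices covered:", len(covered))
```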


Interested in learning more about overlap analysis and how it can benefit your unique use case and application?

Contact a location data expert today
