A framework for selecting sensor data sources in a fast-moving company

In a recent discussion with one of my customers, we dug into a familiar pain point: how to select the ‘right’ set of sensor data sources while keeping up with the pace of seasonal data changes and still releasing client services just in time. In this blog post, I want to introduce a framework I have used in the past, somewhat organically, for making such decisions in fast-moving environments.

In the world of digital mapping, earth observation (EO) and autonomous sensing, there is a dangerous trap that catches even the most seasoned leaders: the belief that more data analysis and deep research leads to better decisions.

Whether you are evaluating hyperspectral satellite constellations, LiDAR point clouds, or stereo camera feeds for robotics, the instinct is often to go deeper into the technical weeds. Long research cycles are launched and complex ROI spreadsheets are created, but true leadership in a fast-paced environment requires a different approach.

Sound decision-making is not about doing more analysis. It is about controlling commitment.


1. What success looks like

First, let’s define success. In a high-growth, data-driven environment, a sound data strategy should deliver measurable results across five key pillars:

  • Faster innovation: rapidly moving data from identification to production.
  • Higher quality: achieving measurable improvements in coverage, refresh rates, signal fidelity, and hazard detection.
  • New product unlocks: making data decisions that directly enable new capabilities or open new market segments.
  • Better economics: structuring deals for sustainable, scalable access—paying for value, not volume.
  • Data as a moat: building a strategic partner network that improves the company’s defensibility and uniqueness over time.

2. The strategy of irreversibility: Type 1 vs. Type 2

To move fast without breaking the company’s future strategy, you must evaluate every data source decision through the lens of reversibility, a framework famously used at Amazon.

  • Type 1 (irreversible): decisions that are “one-way doors.” If you get these wrong, the cost to unwind is catastrophic. These include long-term exclusivity contracts, data that forces a complete architectural rewrite, or licensing that forbids derivative products.
  • Type 2 (reversible): decisions that are “two-way doors.” You can test, fail, and pivot within a short window, such as 90 days, with minimal blast radius.

The real mistake is applying Type 1 analysis to Type 2 problems (wasting time) or, more dangerously, applying Type 2 speed to Type 1 problems (creating permanent tech debt).
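To make the categorization concrete, here is a minimal sketch of how a team might encode the reversibility check. The attribute names and the one-year exclusivity threshold are illustrative assumptions, not a definitive checklist.

```python
from dataclasses import dataclass

@dataclass
class DataSourceDeal:
    """Illustrative attributes of a candidate sensor data deal (assumed fields)."""
    name: str
    exclusivity_years: int       # length of any exclusivity clause, in years
    requires_arch_rewrite: bool  # forces a rewrite of the ingestion/processing stack
    forbids_derivatives: bool    # licensing blocks derivative or learned products

def classify_reversibility(deal: DataSourceDeal) -> str:
    """Tag a candidate deal as a one-way door (Type 1) or a two-way door (Type 2)."""
    if deal.exclusivity_years > 1 or deal.requires_arch_rewrite or deal.forbids_derivatives:
        return "Type 1 (irreversible): slow down, escalate, negotiate the clause away"
    return "Type 2 (reversible): run a time-boxed pilot, e.g. 90 days"

# Example: a high-refresh optical supplier with a five-year exclusivity clause
deal = DataSourceDeal("optical-supplier-A", exclusivity_years=5,
                      requires_arch_rewrite=False, forbids_derivatives=False)
print(classify_reversibility(deal))
```

The point of writing it down, even this crudely, is that the criteria become explicit and debatable instead of living in one leader’s head.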


3. Hard Gates: using “showstoppers” to protect speed

Strong decision-making requires clear boundaries, not just exploration. You protect your team’s focus by identifying “showstoppers” early—non-negotiable criteria that kill a deal before it drains your resources.

Each showstopper category comes with a clear “walk-away” trigger:

  • Legal/Strategic: no rights to “learned representations” or derivatives. If you can’t own the intelligence you build on top of the data, you have no moat.
  • Economic: a cost structure that scales with volume rather than customer value, leading to structural margin compression.
  • Architectural: data that requires “special handling” logic that becomes permanent structural tech debt.

Imagine a high-refresh optical data supplier. If their contract includes a five-year exclusivity clause, it is a Type 1 showstopper. Negotiating this is about protecting the “strategic optionality” to integrate other sensing modalities, such as LiDAR, later.
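One way to operationalise these gates is to run every candidate through an explicit showstopper filter before any deep evaluation starts. The sketch below mirrors the three categories above; the dictionary keys and the “per_volume” pricing flag are assumptions for illustration.

```python
def showstopper_check(deal: dict) -> list[str]:
    """Return the hard gates a candidate deal fails; an empty list means it may proceed."""
    failures = []
    # Legal/strategic gate: we must retain rights to learned representations and derivatives
    if not deal.get("derivative_rights", False):
        failures.append("Legal/Strategic: no rights to learned representations or derivatives")
    # Economic gate: cost must track customer value, not raw data volume
    if deal.get("pricing_model") == "per_volume":
        failures.append("Economic: cost scales with volume, compressing margins")
    # Architectural gate: no permanent special-handling logic in the core pipeline
    if deal.get("needs_special_handling", False):
        failures.append("Architectural: permanent special-handling tech debt")
    return failures

candidate = {"derivative_rights": True, "pricing_model": "per_volume",
             "needs_special_handling": False}
for reason in showstopper_check(candidate):
    print("Walk away:", reason)
```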


4. Beyond accuracy: The Three-Layer Framework

When evaluating a new sensing modality, don’t start with the sensor’s precision. Start with the customer outcome to ensure every source has a clear “why-now” and “what-it-powers.”

  1. Customer decision unlocked: does this data actually change a physical action? For example, does a faster refresh rate actually change how a utility trims trees?
  2. Product capability enabled: what can we do now that was impossible before? This could mean moving from simple “detection” to “preventative risk scoring.”
  3. System-level cost & risk: does this data create “logic scars” in our architecture, such as complex cloud-gap handling (in an EO use case, for example) that we’ll have to maintain forever?
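As a rough illustration of the three layers above, the evaluation can be captured as a one-page record that forces the “why-now” and “what-it-powers” answers before any sensor-spec comparison. The field names and the example values are assumptions, not a prescribed template.

```python
from dataclasses import dataclass, field

@dataclass
class SourceEvaluation:
    """One-page record for a candidate sensing modality, ordered by the three layers."""
    source: str
    customer_decision_unlocked: str  # layer 1: which physical action changes?
    capability_enabled: str          # layer 2: what is possible now that wasn't before?
    system_costs: list[str] = field(default_factory=list)  # layer 3: logic scars, handling debt

    def is_justified(self) -> bool:
        # No customer decision or no new capability means no deal,
        # however impressive the sensor specs are.
        return bool(self.customer_decision_unlocked and self.capability_enabled)

evaluation = SourceEvaluation(
    source="high-refresh optical",
    customer_decision_unlocked="utility reprioritises tree-trimming routes weekly",
    capability_enabled="move from detection to preventative risk scoring",
    system_costs=["cloud-gap handling logic in the EO pipeline"],
)
print(evaluation.is_justified())
```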

5. The hidden bar: judgment under uncertainty

The hallmark of a senior data leader is the ability to make irreversible decisions with incomplete information. You will never have all the data on a new LiDAR provider or a startup’s satellite constellation.

Discipline means acknowledging that uncertainty and building in a “kill-switch”. For example: “If X outcome doesn’t happen by Y date, we stop.” That is not a failure of research; it is the height of product maturity. You are managing option value, not a backlog.
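A kill-switch can be as simple as a dated check that gets reviewed automatically rather than renegotiated in a meeting. The sketch below is illustrative only; the recall metric, the 0.80 target, and the review date are placeholders.

```python
from datetime import date

def kill_switch(pilot_metric: float, target: float, review_date: date) -> str:
    """'If X outcome doesn't happen by Y date, we stop' as an explicit, reviewable rule."""
    if date.today() < review_date:
        return "CONTINUE: review date not yet reached"
    if pilot_metric >= target:
        return "CONTINUE: pilot hit its target"
    return "STOP: kill-switch triggered, wind down the integration"

# Example: the pilot must lift hazard-detection recall to 0.80 by the agreed review date
print(kill_switch(pilot_metric=0.74, target=0.80, review_date=date(2025, 6, 30)))
```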


The bottom line

If your teams are arguing over data sources, it’s usually because the decision logic is opaque. By defining your Type 1 boundaries and setting hard showstoppers, you move from “extended research projects” to “rapid productionization.”

Stop trying to be right about everything. Start making it impossible to be catastrophically wrong.


Let’s build a defensible data moat

Are you looking for a leader who treats data contracts as product architecture rather than mere procurement? I specialize in navigating the high-cost trade-offs of sensor integration with clear, disciplined decision-making.

Let’s start a conversation. Reach out to me via link to explore how we can turn selecting the right sensor data into a competitive advantage.