PhytoMaps · Farming with Foresight
Field notes
7 min read · Satellite · Drone · NDVI · Turf health

Satellite or drone for turf health? Most superintendents get the question wrong.

Resolution, revisit time, cloud cover, and the honest lead time NDVI gives you. A practitioner's framework for when each tool earns its place — with the underlying ESA, USGA and peer-reviewed numbers.

The question comes up early in most conversations. “If a drone gives me centimetre-level imagery, why would I look at satellite at all?” The honest answer is that they are not solving the same problem.

A drone tells you, with great precision, what a specific area looks like on a specific morning. A satellite tells you, with lower precision but no scheduling on your part, how every part of the property has been changing over time. Set the two side by side instead of head to head, and the picture clears up.

Resolution and cadence — the unavoidable trade-off

Sentinel-2 — the European Space Agency constellation that supplies most of the public-source NDVI used in turf software — samples 13 spectral bands. The bands relevant for vegetation health, red at 665 nm and near-infrared at 842 nm, are delivered at 10 m spatial resolution. With Sentinel-2A and Sentinel-2B in orbit, the constellation revisits any point at the equator every 5 days; mid-latitudes get more frequent passes.1
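NDVI itself is just a normalised ratio of those two bands: near-infrared minus red over near-infrared plus red, which puts healthy, strongly NIR-reflecting turf near 1 and bare or stressed ground much lower. A minimal sketch of the computation (the reflectance values are illustrative, not from any real scene):

```python
import numpy as np

def ndvi(red, nir):
    """Normalised Difference Vegetation Index from red (665 nm)
    and near-infrared (842 nm) reflectance arrays."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    denom = nir + red
    # Guard against division by zero where both bands are dark.
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Healthy turf absorbs red and reflects NIR strongly, so it scores high;
# a thinning patch reflects more red and scores low.
print(ndvi([0.05, 0.20], [0.45, 0.25]))  # healthy ≈ 0.8, stressed ≈ 0.11
```

The same arithmetic applies whether the pixels come from a satellite scene or a drone orthomosaic; only the pixel size changes.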

Compare that to a typical multispectral drone — for example a MicaSense RedEdge-class sensor flown at 120 m altitude. Ground sampling distance lands around 5 cm per pixel.2 That is roughly two hundred times more detail per metre of canopy. The catch is obvious: the drone only flies when you fly it.
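The "two hundred times" figure is simple arithmetic on the two ground sampling distances, and it is worth noting that the gap per unit of *area* is far larger still:

```python
sat_gsd_m = 10.0     # Sentinel-2 red/NIR band resolution
drone_gsd_m = 0.05   # ~5 cm drone GSD at 120 m AGL

linear_ratio = sat_gsd_m / drone_gsd_m  # detail per metre of canopy
area_ratio = linear_ratio ** 2          # pixels covering the same patch of turf

print(linear_ratio, area_ratio)  # 200.0 40000.0
```

One Sentinel-2 pixel corresponds to roughly forty thousand drone pixels, which is exactly why the drone reads like a microscope and the satellite like a map.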

10 m · Sentinel-2 NDVI pixel · revisited every 5 days
~5 cm · typical multispectral drone GSD at 120 m AGL

Both numbers matter. Neither is “better” in isolation.

What a 10 m pixel actually shows you

A 10 m pixel is wider than most greens are at the throat. On greens, you are looking at three or four pixels per surface — enough to register a zone trending down, not enough to read the pattern of dollar spot. On fairways, surrounds and tees, 10 m is plenty: a dry shoulder, a thinning approach, a struggling collar all show up as zone-level signals.
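To make that scale concrete, here is the back-of-envelope version, assuming a green of roughly 400 m² (green sizes vary widely; the figure is illustrative):

```python
pixel_width_m = 10.0                      # Sentinel-2 red/NIR pixel
green_area_m2 = 400.0                     # assumed putting-green size
pixels_per_green = green_area_m2 / pixel_width_m ** 2

print(pixels_per_green)  # 4.0 — a handful of samples, not a picture
```

Four samples is enough to notice a surface trending away from its neighbours week over week; it is nowhere near enough to resolve a disease pattern inside the surface.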

This is why most experienced operators end up using satellite as a pointer and drone as a microscope. You spot the fairway-level drift on satellite, then you fly the area that earned a closer look.

NDVI's actual lead time

There is a tendency in this category to oversell early detection. The version that holds up to scrutiny, from peer-reviewed turf research:

A 2019 study published in Agrosystems, Geosciences & Environment found that NDVI from sUAS-mounted sensors detected drought stress in turfgrass about a week before visual symptoms emerged.3 That matches the experience of most superintendents who run the indices alongside their normal walks. A week is not nothing — it is the difference between a syringing cycle and writing a recovery plan.

The same body of literature is clear about the limits. The Alabama Cooperative Extension System notes plainly that NDVI “is not useful for acute stresses such as an irrigation pump failure or plugged irrigation lines that have an effect within days”.4 If a head fails on a Saturday afternoon, the satellite is not going to flag it on Sunday morning. You will find that one the way you always have — by walking the green.

The cloud problem, and what it means in practice

The other honest constraint with satellite is cloud cover. Sentinel-2 only delivers usable data when the sky is clear over the area you care about. Average cloud cover over Europe varies seasonally, with availability thinning through autumn and winter — exactly when many courses in northern latitudes would most like another data point.5

A platform handles this in one of two ways: it either shows you stale data and pretends it is current, or it skips the cloudy passes and shows you the most recent valid reading. The second is the only one worth paying for. In a long cloudy stretch you will see fewer updates than usual, then a clean reading the moment one arrives.
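The "most recent valid reading" behaviour amounts to a filter on per-pass cloud fraction. A hypothetical sketch (the `latest_valid` helper, the threshold, and the readings are all illustrative, not any particular platform's API):

```python
from datetime import date

# Hypothetical passes: (acquisition date, cloud fraction over the course, mean NDVI)
readings = [
    (date(2024, 10, 2), 0.05, 0.71),
    (date(2024, 10, 7), 0.80, 0.40),   # mostly cloud — NDVI unreliable
    (date(2024, 10, 12), 0.95, 0.35),  # overcast pass, skipped
    (date(2024, 10, 17), 0.10, 0.66),
]

def latest_valid(readings, max_cloud=0.2):
    """Most recent pass whose cloud fraction is below the threshold."""
    clear = [r for r in readings if r[1] <= max_cloud]
    return max(clear, key=lambda r: r[0]) if clear else None

print(latest_valid(readings))  # the clear 17 October pass, not the stale overcast ones
```

The design choice is that `None` is an acceptable answer: during a long overcast stretch the honest display is "no new valid reading", not a cloud-contaminated NDVI dressed up as fresh data.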

A pattern that holds up

The courses we work with that get the most out of remote sensing tend to run something like this:

  • Satellite is on by default. The team checks it once or twice a week, the same way they check the weather.
  • A drone goes up when satellite flags something specific, when a treatment needs verification, or when a stakeholder needs a defensible record of conditions on a date.
  • Seasonal drone scans in spring, summer and autumn anchor the year — the deepest snapshots of the property at the start, middle and end of the playing season.

The point is not that you have to choose. The point is that satellite and drone answer different questions, and the operations getting the most out of either are the ones who have stopped trying to make one tool do the other's job.


References

  1. European Space Agency, Sentinel-2 MSI · Resolution and Swath. sentinel.esa.int
  2. Blue Marble Geo, Using UAV/Drone Imagery to Map a Golf Course (real-world deployment with MicaSense RedEdge multispectral payload at ~5 cm GSD).
  3. Hong, M. et al. (2019). Thermal Imaging Detects Early Drought Stress in Turfgrass Utilizing Small Unmanned Aircraft Systems. Agrosystems, Geosciences & Environment. acsess.onlinelibrary.wiley.com
  4. Alabama Cooperative Extension System, Understanding Vegetation Indices Used in Precision Agriculture. aces.edu
  5. Assessing global Sentinel-2 coverage dynamics and data availability for operational Earth observation applications, International Journal of Digital Earth (2019). tandfonline.com