The push to make food and poultry products safer, more wholesome and more plentiful is leading to new
initiatives commonly described as “crop
science” and “precision agriculture.” Although
there are many facets to these initiatives, the
ability to “see” the desired field of view with a
high degree of spectral and spatial resolution
can lead to many scientific breakthroughs
that benefit the global community.
In some cases, the desired field of view can
be an entire crop field or vineyard as seen
from an aircraft or unmanned aerial system
(UAS). In other cases, it can simply be crops
or poultry moving at rapid speed mere feet
away along a conveyor line. Characteristic
of both is motion, which allows a sensor
(either multispectral or hyperspectral) to
capture frames of high-quality spectral image
data that can be analyzed later. A complete
representation is called a hyperspectral data
cube, which is a stack of images of the same
object or scene—essentially an image for each
spectral band.
The practical difference between
multispectral and hyperspectral primarily lies
in the number of spectral bands captured.
Multispectral imaging typically captures five
to 30 bands, with gaps between those bands.
Hyperspectral captures hundreds, with very
dense and continual spectral information for
each pixel in the image. In some cases, one
will be more desirable than the other. However,
a hyperspectral sensor gives the option of
“seeing” everything. Spectral signatures are
powerful discriminators, and it’s useful to
know when a particular chemical fingerprint is
there or not. There are many instances where
the granularity and specificity of hyperspectral
image data is absolutely necessary.
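For readers who handle the data directly, the cube idea can be sketched in plain Python (toy dimensions chosen for illustration; a real pipeline would use NumPy or a dedicated hyperspectral library): the cube is simply one 2-D image per band, and one spectrum per pixel.

```python
# A hyperspectral data cube as a 3-D structure:
# one 2-D image per spectral band, stacked along the band axis.
# Sizes here are illustrative, not tied to any particular sensor.

BANDS, ROWS, COLS = 4, 3, 5  # a toy cube: 4 bands of 3 x 5 pixels

# cube[band][row][col] -> reflectance value (zeros in this toy example)
cube = [[[0.0 for _ in range(COLS)] for _ in range(ROWS)]
        for _ in range(BANDS)]

def band_image(cube, band):
    """One 2-D image: the scene as seen in a single spectral band."""
    return cube[band]

def pixel_spectrum(cube, row, col):
    """The full spectrum recorded for one pixel, across all bands."""
    return [cube[b][row][col] for b in range(len(cube))]

print(len(band_image(cube, 0)))     # number of rows in one band image
print(len(pixel_spectrum(cube, 1, 2)))  # number of bands per pixel
```

The same two access patterns (slice by band, slice by pixel) underlie both multispectral and hyperspectral processing; only the band count changes.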
Hyperspectral imaging has the unique
ability to extract meaningful scientific
information from the scene or field of view.
It allows users to detect the presence of
a material based on its spectral fingerprint.
It also allows users to classify and separate
materials into spectrally similar groups;
detection and quantification are also
hallmarks of the technology.
Because hyperspectral sensors
exhibit a high degree of
spectral and spatial resolution,
scientists can classify
disease conditions
and also build an image
that faithfully pinpoints
where a disease is and how invasive it might be.
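One widely used way to match a pixel's spectrum against a known fingerprint is the spectral angle mapper (SAM): the smaller the angle between two spectra, the closer the match. A minimal sketch, with made-up five-band signatures purely for illustration:

```python
import math

def spectral_angle(spectrum, reference):
    """Angle (radians) between a pixel spectrum and a reference
    signature. Smaller angle = closer spectral match (the classic
    spectral angle mapper similarity measure)."""
    dot = sum(a * b for a, b in zip(spectrum, reference))
    na = math.sqrt(sum(a * a for a in spectrum))
    nb = math.sqrt(sum(b * b for b in reference))
    # Clamp to guard against floating-point drift outside [-1, 1]
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

# Hypothetical 5-band reflectance signatures (values invented)
healthy_leaf = [0.05, 0.08, 0.12, 0.45, 0.50]
soil         = [0.20, 0.25, 0.30, 0.35, 0.40]
pixel        = [0.06, 0.09, 0.13, 0.44, 0.49]

# The pixel matches the leaf signature far better than the soil one
assert spectral_angle(pixel, healthy_leaf) < spectral_angle(pixel, soil)
```

Because the angle ignores overall brightness, SAM is robust to illumination differences across a scene, which is one reason it is a common first classifier for this kind of data.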
Since the image data is GPS-coordinated
and orthorectified during post-processing,
the scientific value is significant: irrigation
and fertilizing decisions are more precise,
speciation is more accurate and crops that
might be lost are saved. Indeed, the very same
hyperspectral imaging technology that can
make existing crop harvests more bountiful
can also help survey the land in famine-affected areas so crops can be planted with a
higher degree of success.
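Orthorectified imagery typically carries an affine geotransform (the convention used by GDAL, for example) that ties each pixel to a ground coordinate. A minimal sketch, with made-up origin and pixel-size values:

```python
def pixel_to_geo(gt, row, col):
    """Map a pixel (row, col) to geographic coordinates using a
    GDAL-style affine geotransform:
    gt = (origin_x, pixel_width, row_rotation,
          origin_y, col_rotation, pixel_height)."""
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    return x, y

# Made-up example: origin at (-122.50, 38.50), 0.0001-degree pixels,
# no rotation; pixel_height is negative because rows run southward.
gt = (-122.50, 0.0001, 0.0, 38.50, 0.0, -0.0001)
lon, lat = pixel_to_geo(gt, row=100, col=200)
print(lon, lat)
```

With this mapping in hand, a spectral detection at pixel (row, col) becomes a point on the map, which is what makes irrigation and fertilizing decisions actionable in the field.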
Since motion is needed for hyperspectral
imaging to create a data cube, it meshes
perfectly with the rapid growth of the UAS
across industry, research and academia. The
UAS is more affordable, smaller and lighter
than fixed-wing manned aircraft and, thus,
is more readily deployable in unforgiving
areas of the world. Armed with precise
instrumentation, such as hyperspectral
sensors, a UAS can deliver truly life-enriching
information beneficial on a global scale.
Entire economies depend on the success of
agriculture: citrus in Florida; coffee in South
America; vineyards in northern California.
Across them all, UAS with hyperspectral
sensors are deployed at a rapid pace.
One common mistake prospective users
make is misjudging the work needed to
integrate the entire flight/sensor package.
Obviously, size and weight matter because
a UAS, especially the small hand-launched
ones, will typically have a strict payload limit.
Hyperspectral sensor manufacturers, such as
Headwall Photonics, recognize this and are
making their instruments small, light and more
integrated. For example, Nano-Hyperspec
(Figure 1) is only 3 in x 3 in x 4.7 in and
weighs less than 1.5 lbs. Yet in that small space
sits the data storage chip, and the GPS attaches
directly to the sensor rather than connecting
by cables that take up space and add weight.
In addition to smaller and more integrated
sensors, Headwall also helps to fully integrate
the flight package. This means not only the
sensor, but also helping with the UAS, the GPS
and, if desired, LiDAR instrumentation. LiDAR
is a common add-on for precision agriculture
work and other remote-sensing research.
The value of this integration work is a
quicker time to deployment and more success
with capturing precise image data. It also
turns a basic UAV or “drone” into a precision
instrument for crop inspection and grading,
helping advance agricultural studies.
Figure 1: For airborne precision agriculture applications, the spectral range most
desired is the VNIR, which spans 400 to 1,000 nm. Image: Headwall Photonics Inc.
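A VNIR sensor divides that 400 to 1,000 nm span into many narrow bands. As a rough sketch (the band count of 270 here is illustrative, not a quoted specification), the band-center wavelengths can be generated as an even spacing across the range:

```python
def band_wavelengths(start_nm=400.0, stop_nm=1000.0, n_bands=270):
    """Evenly spaced band-center wavelengths (nm) across a spectral
    range. The VNIR endpoints match the 400-1,000 nm span noted in
    Figure 1; the band count is an illustrative assumption."""
    step = (stop_nm - start_nm) / (n_bands - 1)
    return [start_nm + i * step for i in range(n_bands)]

wavelengths = band_wavelengths()
print(wavelengths[0], wavelengths[-1])  # endpoints of the VNIR range
```

Each entry corresponds to one image in the data cube, which is why hyperspectral spectra are dense and gap-free compared with multispectral bands.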