In our world, every research tool and every technology plays a unique role in helping us explore the mysteries of the Universe. Just as the Hubble Space Telescope opened our eyes to previously unseen corners of space, multispectral and hyperspectral imaging allow us to perceive what remains hidden from our senses in everyday life. To understand the difference between multispectral imaging and traditional photography, it’s worth looking at them through the lens of their unique capabilities and applications.

Traditional photography is like a journey back to the moment when light, composed of the three primary colors (red, green, and blue), is captured by a camera sensor. It serves as a window into the visible world, into the spectrum of light that our eyes can interpret and transform into images we can understand and admire. Conventional photography faithfully reflects how we perceive our surroundings, representing reality through the familiar RGB color model.

Spectral imaging, however, is a much deeper and more versatile journey. It is as if we put on a pair of glasses that let us see the world in ways our ancestors could never have imagined. Unlike traditional photography, multispectral imaging records light across many different spectral bands, from ultraviolet through visible light all the way to the red edge and infrared. Each of these bands carries unique information about the observed object, revealing details about its structure, chemical composition, and temperature.

Spectral imaging technology has its origins in research carried out since the mid-20th century, when techniques of photogrammetry and remote sensing were being developed. Its first applications focused mainly on observing the Earth’s atmosphere and surface using satellites.

In the 1960s and 1970s, NASA began intensive work on multispectral imaging, which led to the creation of the Landsat program—the longest-running series of Earth observation missions. Landsat satellites capture data across multiple spectral bands, ranging from visible light to near-infrared and thermal infrared wavelengths.

Instruments such as the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS) aboard Landsat 8 use optical filters and diffraction elements to precisely separate incoming light into individual wavelengths. Each recorded band carries unique information about the Earth’s surface—its composition, moisture, temperature, and structure. Data collected by the Landsat program have played a crucial role in monitoring environmental change, deforestation, and land use, forming the foundation of modern remote sensing.

Satellite spectrometers split incoming light into many narrow spectral bands, enabling detailed analysis of the physical and chemical properties of observed surfaces. One notable example of such an instrument is MODIS (Moderate Resolution Imaging Spectroradiometer), mounted on NASA’s Terra and Aqua satellites. MODIS records data across 36 spectral bands, covering wavelengths from visible light to thermal infrared, with a spatial resolution ranging from 250 meters to 1 kilometer. These data form the foundation for global climate studies, ecosystem dynamics monitoring, and the observation of clouds, aerosols, and changes in land and ocean surfaces.

Hyperspectral systems record data in hundreds, and in laboratory applications even thousands, of extremely narrow spectral bands. This allows scientists to obtain exceptionally detailed information about the composition and properties of observed objects. Such high spectral resolution makes it possible to accurately identify materials—both on land and in aquatic environments. Examples of advanced hyperspectral instruments include HICO (Hyperspectral Imager for the Coastal Ocean) and HISUI (Hyperspectral Imager Suite), both mounted aboard the International Space Station (ISS).

HICO enabled detailed analysis of the optical properties of coastal waters, supporting assessments of water quality, studies of phytoplankton blooms, and the monitoring of pollution. In contrast, HISUI provides high-quality spectral data used in environmental analyses and the exploration of natural resources.

Interestingly, even the earliest Landsat images made it possible to distinguish between different types of vegetation and soil by analyzing their spectral characteristics. Breakthroughs like this helped lay the groundwork for what we now call precision agriculture.

Another fascinating application of hyperspectral systems is the monitoring of coral reefs, which allows scientists to assess their health and identify areas at risk of coral bleaching.

Today, multispectral and hyperspectral satellite imaging is used in fields such as agriculture, environmental protection, geology, climate analysis, and spatial planning. By capturing observations across multiple spectral bands, these technologies provide valuable data for scientists, farmers, urban planners, and decision-makers around the world.

These technologies allow us to monitor our planet with unprecedented precision, deepening our understanding of the processes taking place on Earth and supporting informed decisions about environmental protection and natural resource management.

In recent years, drones have gained tremendous popularity, finding applications in diverse fields—from photography and filmmaking to precision agriculture and environmental monitoring. At the same time, autonomous ground vehicles are rapidly evolving and can also be equipped with multispectral cameras, providing valuable data in locations that are difficult to access by other means. Together, these technological innovations are transforming the way we collect and analyze information about our surroundings.

Multispectral imaging using drones and ground vehicles differs significantly from satellite-based imaging, both technically and operationally.

Satellites orbit hundreds of kilometers above the Earth, allowing them to observe vast areas at once. The typical spatial resolution of systems such as Landsat or Sentinel-2 ranges from 10 to 30 meters per pixel, which is sufficient for monitoring landscape-scale processes but not detailed enough to analyze smaller structures.

Drones operate at altitudes ranging from a few dozen to several hundred meters, while ground vehicles work directly at the surface. This allows them to capture images with much higher spatial resolution—sometimes reaching just a few centimeters per pixel.
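To make the relationship between flight altitude and resolution concrete, here is a minimal sketch of the standard ground sample distance (GSD) calculation. The pixel pitch and focal length below are illustrative placeholders, not the specifications of any particular camera.

```python
# Illustrative ground sample distance (GSD) calculation for a nadir-pointing camera.
# GSD = flight altitude * sensor pixel pitch / lens focal length.
# The sensor parameters used here are placeholders, not real camera specs.

def ground_sample_distance(altitude_m: float, pixel_pitch_um: float, focal_length_mm: float) -> float:
    """Return the ground footprint of one pixel, in centimeters."""
    gsd_m = altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)
    return gsd_m * 100  # meters -> centimeters

for altitude in (50, 120, 300):  # typical drone flight altitudes in meters
    print(f"{altitude:>4} m -> {ground_sample_distance(altitude, 3.3, 12.3):.1f} cm/pixel")
```

At 120 meters, this hypothetical camera resolves roughly 3 centimeters per pixel, a level of detail far beyond the 10 to 30 meters per pixel of the satellite systems described above.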

Such a high level of detail is invaluable in precision agriculture, where accurate information about plant health, soil moisture, and the presence of diseases or pests is essential. Drones equipped with multispectral or hyperspectral cameras enable real-time monitoring of crop fields, combining high spatial resolution with exceptional operational flexibility.

Operational costs and technological accessibility are key factors distinguishing satellite imaging from low-altitude imaging. Building, launching, and maintaining a satellite are extremely costly endeavors, and commercial satellite data can also be expensive to obtain. However, some programs, such as Landsat and Sentinel, provide their data free of charge under an open-access model.

In contrast, the purchase and operation of drones and ground vehicles are relatively inexpensive, and these technologies are becoming increasingly accessible to smaller companies, research institutions, and even individual users.

Spectral imaging using drones and ground-based systems offers higher spatial resolution, greater operational flexibility, and lower costs, while satellites provide extensive geographic coverage and the ability to monitor global phenomena over long time periods.

In practice, the two approaches are complementary: satellite data provide spatial context, while drone and ground-based measurements deliver detailed validation and high local resolution. The choice between them depends on the research objectives and the required scale of analysis.

Each of these technologies has its own unique advantages that can be applied across diverse fields—from precision agriculture and environmental protection to urban planning—offering valuable insights that support decision-making and the sustainable management of Earth’s resources.

Spectral cameras are equipped with specialized detector arrays that capture light across several or even dozens of distinct spectral bands. Each detector is responsible for recording light within a specific wavelength range.

In such systems, optical filters are used to transmit only selected portions of the electromagnetic spectrum. These filters may be integrated directly into the detector array (for example, in a mosaic configuration) or placed in front of the camera lens as an interchangeable module or a rotating filter wheel.
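As a rough illustration of the mosaic configuration, the sketch below pulls four interleaved band planes out of a single raw frame, much like demosaicing a Bayer pattern. The 2x2 layout and band names are hypothetical, chosen only for the example.

```python
# Extracting band planes from a hypothetical 2x2 spectral filter mosaic,
# analogous to demosaicing a Bayer pattern on an RGB sensor.
import numpy as np

def split_mosaic(frame: np.ndarray) -> dict[str, np.ndarray]:
    """Pull the four interleaved band planes out of a 2x2 mosaic frame."""
    return {
        "green":    frame[0::2, 0::2],  # even rows, even columns
        "red":      frame[0::2, 1::2],  # even rows, odd columns
        "red_edge": frame[1::2, 0::2],  # odd rows, even columns
        "nir":      frame[1::2, 1::2],  # odd rows, odd columns
    }
```

Each extracted plane has half the resolution of the raw frame in each dimension, which is the usual trade-off of mosaic designs compared with one dedicated sensor per band.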

During flight, a drone equipped with a multispectral camera captures images in separate spectral bands, typically covering visible light (red, green, and blue) as well as radiation invisible to the eye, most often the red edge and near-infrared; some systems extend further into the ultraviolet or thermal infrared.

Images from the different bands are then precisely aligned and combined (data fusion) to create a composite multispectral image. This process requires accurate spatial co-registration to ensure that each pixel corresponds to the same point on the Earth’s surface.
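As a rough illustration of that alignment step, the sketch below estimates a pure translation between each band and a reference band using phase correlation (via scikit-image and SciPy), then stacks the results into a single cube. Production photogrammetry pipelines also model rotation, scale, and lens distortion, which this deliberately omits; the band names are placeholders.

```python
# A minimal band co-registration sketch: align each band to a reference band
# by estimating a sub-pixel translation with phase correlation, then stack
# the aligned bands into one multispectral cube. Translation-only; real
# pipelines also correct rotation, scale, and lens distortion.
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def coregister_bands(bands: dict[str, np.ndarray], reference: str = "red") -> np.ndarray:
    """Align every band to bands[reference]; return an (H, W, N) cube."""
    ref = bands[reference]
    aligned = []
    for band in bands.values():
        offset, _, _ = phase_cross_correlation(ref, band, upsample_factor=10)
        aligned.append(nd_shift(band, offset))  # apply the estimated shift
    return np.stack(aligned, axis=-1)
```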

Spectral analysis also employs a variety of vegetation indices, which enable quantitative assessment of plant condition and temporal changes. These indices are based on measuring the reflectance of specific portions of the electromagnetic spectrum.

Sunlight is a form of electromagnetic radiation composed of waves of different lengths. Plants use only part of this energy—mainly blue and red light—to carry out photosynthesis, the process in which chlorophyll converts light energy into chemical energy.

Chlorophyll strongly absorbs blue light (around 430–470 nm) and red light (around 640–680 nm) because these wavelengths drive the photosynthetic process.

It absorbs green light (around 520–560 nm) much less efficiently, which is why plants appear green—their leaves reflect this portion of the visible spectrum most strongly.

Just beyond the red region lies the so-called red edge (around 700–740 nm), an area where reflectance rises sharply. This region is extremely sensitive to changes in chlorophyll content, which is why many modern vegetation indices are based on this spectral band.

In the near-infrared range (NIR, around 750–1300 nm), chlorophyll no longer absorbs light, so reflectance is determined primarily by the internal structure of the leaf. Healthy, well-hydrated leaves reflect a large amount of NIR light, while dry or stressed plants reflect much less.
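The reflectance contrasts described above, red versus NIR and red edge versus NIR, are precisely what the simplest vegetation indices exploit. Below is a minimal sketch, assuming the band arrays have already been calibrated to reflectance values between 0 and 1:

```python
# Two common vegetation indices computed from per-band reflectance arrays.
# NDVI contrasts strong red absorption with strong NIR reflection; NDRE
# swaps red for the red edge, which stays sensitive in dense canopies.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)  # guard divide-by-zero

def ndre(nir: np.ndarray, red_edge: np.ndarray) -> np.ndarray:
    """Normalized Difference Red Edge: (NIR - RE) / (NIR + RE)."""
    return (nir - red_edge) / np.clip(nir + red_edge, 1e-6, None)
```

Values close to +1 indicate dense, healthy vegetation, values near 0 suggest bare soil, and negative values typically correspond to water.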

Vegetation indices are among the most powerful tools in modern Earth observation. They combine the simplicity of mathematics with the depth of biological meaning, allowing us to “peek” into the condition of plants without ever touching a leaf.

Each spectral index serves a specific purpose—some focus on chlorophyll, others on leaf structure, water content, or protective pigments. Together, they form a language through which plants “communicate” with researchers by means of the light they reflect.

In this subtle interplay of radiation and biology lies the true beauty of spectral imaging—a science that transforms light into knowledge, where every index becomes a story told by plants about their life, health, and adaptation to the environment.

Today, the use of multispectral analysis is one of the hottest topics among plant scientists, as evidenced by the rapidly growing number of research publications. The study presented below is just one of many fascinating examples showing how this technology is on the verge of becoming a fundamental research tool.

One of the first drones to ship from the factory with an integrated multispectral camera, requiring no third-party modifications, was developed by DJI. The Phantom 4 Multispectral made this technology easily accessible to users around the world, eliminating the need to build a custom drone platform, select compatible cameras, integrate communication and vision systems, and purchase a separate multispectral sensor. This not only significantly reduced costs but also removed the requirement for highly specialized technical expertise.

Today, however, this model belongs to the past, having been succeeded by the DJI Mavic 3 Multispectral. This advanced drone is equipped with multispectral imaging technology designed for professional applications in precision agriculture, environmental protection, and scientific research. It features four multispectral cameras that capture images in the green (560 nm ± 16 nm), red (650 nm ± 16 nm), red edge (730 nm ± 16 nm), and near-infrared (860 nm ± 26 nm) bands, as well as a 20 MP RGB camera with a 4/3 CMOS sensor.

With its RTK (Real-Time Kinematic) positioning system, the drone provides centimeter-level accuracy supported by GNSS (Global Navigation Satellite System). A maximum flight time of up to 43 minutes allows for extended inspection and monitoring missions, while O3 (OcuSync 3.0) transmission technology ensures a stable connection over distances of up to 15 kilometers. The drone is fully compatible with DJI Terra and the DJI SmartFarm Platform, enabling advanced data analysis and map generation, and it supports API integration, making it easy to connect with various agricultural and environmental management systems.

One of the most noticeable differences between the Phantom 4 Multispectral and its successor, the DJI Mavic 3 Multispectral, is the absence of the blue band in the sensor set of the newer model. At first glance, this may seem surprising, since blue and red light are essential for photosynthesis—the process in which chlorophyll absorbs radiant energy and converts it into chemical energy. In practice, however, DJI’s decision was not driven by technical limitations but by a deliberate effort to tailor the drone to the needs of precision agriculture.

The Phantom 4 Multispectral was equipped with five spectral bands: Blue (450 nm), Green (560 nm), Red (650 nm), Red Edge (730 nm), and Near-Infrared (840 nm). This configuration made it suitable for advanced environmental and scientific analyses, as well as for calculating indices that rely on the blue band—such as EVI and ARVI, which account for atmospheric and aerosol effects. As a result, the Phantom 4 served as a more research-oriented platform, often used in environmental and educational projects.
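For reference, these are the standard formulations of the two blue-dependent indices mentioned above: EVI with the usual MODIS coefficients and ARVI with gamma set to 1. Without a blue band, neither can be computed directly.

```python
# Two indices that need blue reflectance to correct for atmospheric effects.
# EVI uses the standard MODIS coefficients (G=2.5, C1=6, C2=7.5, L=1);
# ARVI follows the Kaufman and Tanre formulation with gamma = 1.
import numpy as np

def evi(nir: np.ndarray, red: np.ndarray, blue: np.ndarray) -> np.ndarray:
    """Enhanced Vegetation Index."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def arvi(nir: np.ndarray, red: np.ndarray, blue: np.ndarray, gamma: float = 1.0) -> np.ndarray:
    """Atmospherically Resistant Vegetation Index."""
    rb = red - gamma * (blue - red)  # aerosol-corrected red reflectance
    return (nir - rb) / (nir + rb)
```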

From a biological perspective, blue light indeed plays an important role. It is strongly absorbed by chlorophyll a and b, and variations in its reflectance can provide insights into plant stress caused by light intensity. In practical drone operations, however, measuring this wavelength range is technically challenging: blue light waves are shorter and more strongly scattered in the atmosphere (Rayleigh scattering), the signal is weaker and more sensitive to lighting conditions, and small drone sensors have a lower signal-to-noise ratio in this region. As a result, blue band data often introduced more noise than useful information, complicating calibration and processing.
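A quick back-of-the-envelope check of that scattering argument, using the inverse fourth-power wavelength dependence of Rayleigh scattering:

```python
# Rayleigh scattering intensity scales as wavelength**-4, so shorter blue
# wavelengths scatter noticeably more strongly than red ones.
blue_nm, red_nm = 450.0, 650.0
ratio = (red_nm / blue_nm) ** 4
print(f"Blue light at {blue_nm:.0f} nm scatters ~{ratio:.1f}x more than red at {red_nm:.0f} nm")
# -> ~4.4x
```

That factor of roughly 4.4 helps explain why the blue channel tends to be the noisiest and hardest to calibrate on small drone sensors.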

For this reason, in the Mavic 3 Multispectral, DJI opted for four spectral bands that offer the highest accuracy and stability under typical field conditions.

A far more expensive option is to use a DJI Matrice drone as a base platform along with the appropriate remote controller and pair it with a high-resolution multispectral camera from a third-party manufacturer. This setup can cost four to five times more than an integrated solution, but it provides significantly higher measurement accuracy. Of course, the base platform does not have to be a DJI drone; alternatives such as the senseFly eBee X or other fixed-wing systems can also be used.

Unfortunately, even with the right equipment for multispectral imaging, specialized software is still required to analyze the data and generate vegetation indices, which adds another significant cost. Examples of such software include the widely used Pix4Dfields, as well as Aerobotics, Agisoft, and DJI Terra.

It is definitely worth keeping a close eye on the development of this technology, especially given the rapid progress in drones, autonomous ground vehicles, robotics, and multispectral imaging conducted not only from satellites but increasingly from low-altitude platforms, all supported by ever more capable software. These analytical platforms are expected to benefit greatly from advances in artificial intelligence, which will undoubtedly lead to more powerful and insightful analytical tools in the near future.