
Review Article | Open Access


Wei Guo, Matthew E. Carroll, Arti Singh, Tyson L. Swetnam, Nirav Merchant, Soumik Sarkar, Asheesh K. Singh, Baskar Ganapathysubramanian, "UAS-Based Plant Phenotyping for Research and Breeding Applications", Plant Phenomics, vol. 2021, Article ID 9840192, 21 pages, 2021.

UAS-Based Plant Phenotyping for Research and Breeding Applications

Received: 21 Feb 2021
Accepted: 29 Apr 2021
Published: 10 Jun 2021


Unmanned aircraft systems (UAS) are a particularly powerful tool for plant phenotyping, due to the reasonable cost of procurement and deployment, ease and flexibility of control and operation, the ability to reconfigure sensor payloads to diversify sensing, and the ability to fit seamlessly into a larger connected phenotyping network. These advantages have expanded the use of UAS-based plant phenotyping approaches in research and breeding applications. This paper reviews the state of the art in the deployment, collection, curation, storage, and analysis of data from UAS-based phenotyping platforms. We discuss pressing technical challenges, identify future trends in UAS-based phenotyping that the plant research community should be aware of, and pinpoint key plant science and agronomic questions that can be resolved with the next generation of UAS-based imaging modalities and associated data analysis pipelines. This review provides a broad account of the state of the art in UAS-based phenotyping to reduce the barrier to entry for plant science practitioners interested in deploying this imaging modality for phenotyping in plant breeding and research.

1. Introduction

Most air vehicles used for plant phenotyping are based on the concept of a remotely piloted aircraft system (RPAS) as defined by the International Civil Aviation Organization (ICAO). There is, however, a diversity of names for these devices depending on the country of use, with drone, unmanned air vehicle (UAV), and unmanned aircraft system (UAS) being commonly used terms. To avoid ambiguity, we refer to these systems as UAS, with the definition used by the United States Federal Aviation Administration (FAA): "an unmanned aircraft (an aircraft that is operated without the possibility of direct human intervention from within or on the aircraft) and associated elements (including communication links and the components that control the unmanned aircraft) that are required for the pilot in command to operate safely and efficiently in the national airspace system" (Public Law 112-95, Section 331 (8-9), United States) [1].

There are several technical decisions that the practitioner must make to ensure that the UAS operation and subsequent postprocessing produce actionable information from the plant science perspective. The key questions include: Which UAV and sensor package should one choose? What are the critical steps for successful deployment and for successful processing of the data? What has been done in this scientific discipline? What are the current state-of-the-art applications of UAS in plant phenotyping? Where are we headed next? What are the open questions, trends, and challenges [2]? This paper reviews the state of the art in UAS deployment, data collection, curation, storage, and analysis, discusses pressing technical challenges, and identifies future trends in this arena. The intent of this review is to provide an overview of the state of the art in aerial-based analytics for plant science and breeding practitioners who are interested in deploying this phenotyping modality to match their specific phenotyping needs. For complementary reading on UAS phenotyping topics not directly covered in our work, readers can refer to additional review articles [3, 4].

2. UAS Types and Imaging Modalities

2.1. Types and/or Classes of UAV

We first provide a taxonomy of UAV devices through the lens of plant phenotyping. While UAV can be classified based on a diverse set of features [4-7], in the context of a plant sciences/breeding practitioner it is useful to classify them simply according to their physical features or configuration: single-rotor, multirotor, fixed-wing, and hybrid VTOL (vertical takeoff and landing) fixed-wing. Table 1 provides a concise overview of these types of UAV. Exact prices are not listed, as they can vary substantially depending on the manufacturer and country of purchase; however, for comparison purposes, a qualitative price range based on US market information is included.

(i) Single-rotor UAV (also called helicopters) can be powered by either combustion (i.e., liquid fuel) or an electric motor, resulting in substantially longer flight times and higher payloads. The earliest applications of UAV to plant phenotyping used these kinds of devices [8]. While providing reliability and flexibility along with larger payload capacity, the major disadvantages of single-rotor unmanned helicopters remain their cost and the ensuing complexity of operation and maintenance.

(ii) Multirotor UAVs are currently the most popular UAV devices, primarily due to their ease of operation, low cost, and reasonable payloads. These devices have seen wide usage in a variety of applications including agricultural monitoring and industrial inspection. A major disadvantage of multirotor UAV is their limited endurance and speed, which creates difficulties for long-runtime phenotyping. These limitations are a challenge in plant research, where large tracts of field and experiments may need to be phenotyped. While this issue can be mitigated to an extent by the use of spare batteries, the problem requires consideration of battery energy density, weight, and cost. This is an active (and rapidly progressing) area of research [9], with several potential solutions including (a) moving away from lithium-ion batteries, (b) UAV swapping and wireless power transfer [10, 11], and (c) mobile charging stations [12].

(iii) Fixed-wing UAVs provide an advantage over multirotor UAV, as these units can cover larger areas due to longer flight times and higher speeds; however, they require a "runway" for takeoff and landing. Additionally, this UAV type cannot hover over one spot, precluding detailed observations in specific cases where such functionality is needed, e.g., stationary measurement and tracking. Fixed-wing UAVs can hold larger payloads, allowing multiple sensors to make simultaneous (and coregistered) measurements, thus increasing phenotyping capability. They generally fly at higher speeds than multirotor UAV, so some care has to be taken to ensure that the capture rate of the sensor matches the UAV speed.

(iv) Vertical takeoff and landing (VTOL) UAVs are hybrid multirotor and fixed-wing UAV, combining the ability to hover like a helicopter (for takeoff and landing, but not for phenotyping), high cruising speed, multifunctionality and versatility, and improved protection of sensors at takeoff and landing compared to fixed-wing UAV. Since this is a relatively new technology in civilian applications, the cost is prohibitive and the barrier to entry remains high for current practitioners in plant science phenotyping.

Type | Payload (kg) | Flight time (minutes) | Operability | Price range | Ability to hover
Single-rotor (helicopter) | 16-32 | 50-100 | Difficult | High (for sprayer drones) | Yes
VTOL fixed-wing | <0.8 | 60 | Medium | High | Yes (for takeoff)

2.2. Open (Maker) vs. Commercial Types of UAS

A UAS typically consists of hardware (i.e., the actual physical system) and control software (i.e., the programs that help run the hardware safely). With the advent of the maker movement, there are now two viable approaches to procuring a UAS: buying an off-the-shelf system, or using open-source resources to build and operate one, where both hardware (via 3D printing) and control software (via open-source repositories) are becoming available to prototype and deploy UAS tailored to a specific phenotyping application. For most beginning and intermediate users of UAS for plant phenotyping, especially for research and breeding applications, commercial UAS that provide an all-in-one package to rapidly sense, curate, and act on their field plots offer a minimal barrier to entry. However, a clear understanding of the user's needs and of what the commercial UAS can deliver is required for safe and hassle-free operation. Also, these systems are generally not customizable if such a need arises during the course of the experiments.

The primary technical difference between the two approaches is access to the command and control software. In commercial UAS, the flight control system is provided as proprietary software (usually as embedded firmware) integrated into the hardware; the end user usually cannot access or modify the source code. In contrast, open-source flight control systems provide the source code for users to modify and integrate into their UAS. Most commercial manufacturers provide (finite-time) guarantees of performance and reasonable technical support to troubleshoot issues during phenotyping, whereas open-source code is usually provided "as is", with no expectation of performance or downstream technical support. Some examples of open-source and paid software for plot extraction and trait calculation can be found in [13-15]. A GitHub repository associated with this paper for breeder-friendly UAS plant phenotyping can be found here [16].

2.3. Technical Considerations before UAS Deployment

There are multiple technical decisions that a practitioner must make to identify the most viable UAS for their specific application in plant phenotyping at research scale.

2.3.1. Navigation and Geotagging

The control software needs accurate positioning to produce well-geotagged data of the observed field. Geotagged data are essential for correlating genotype/management specifics in the field with the images/measurements made by the UAS. Most commercial UAS carry dual global navigation satellite system (GNSS) receivers, such as the global positioning system (GPS) and GLONASS. These dual satellite navigation systems have a horizontal accuracy of about 2 meters. This may not be sufficient for some phenotyping applications, where accuracy in both the horizontal and vertical directions in the 1-2 cm range is desired; this can be achieved by integrating differential correction functions, such as real-time kinematics (RTK), with the GNSS. If the intent is to create an orthomosaic (stitched images) for the entire experiment or field, relative referencing and ground control points (GCPs) can be used without RTK-level accuracy. However, if individual images are analyzed and high resolution is required, RTK may be desirable.
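As a concrete illustration, geotags stored in image EXIF metadata are expressed as degrees/minutes/seconds; a minimal sketch (function names are our own, not from any specific library) for converting them to decimal degrees and checking the distance between two geotags:

```python
import math

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    # South latitudes and west longitudes carry a negative sign.
    return -value if ref in ("S", "W") else value

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two geotags (spherical Earth)."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))
```

Such a distance check makes the accuracy figures above tangible: a 2 m GNSS error can place an image a full microplot away, while a 1-2 cm RTK error is negligible at plot scale.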

2.3.2. Weight vs. Battery

Increasing the weight of the UAS is useful for stability (reduced buffeting in wind shear) as well as for improved payload-carrying capacity. However, increased weight substantially reduces the total flight time of a UAS, as the battery drains more rapidly to keep the UAS aloft and maneuvering. Most current commercial multirotor UAS can only fly for up to 30 minutes, depending on the sensor payload, which may not be enough to cover large experiments/fields. Therefore, when flying large experimental fields, batteries are swapped between flights and/or multiple UAS are operated in tandem. For smaller programs, this is not an important constraint.
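A back-of-the-envelope sketch of how battery life constrains coverage; the coverage model (field area divided by swath width) and the fixed per-flight overhead are simplifying assumptions of ours:

```python
import math

def flights_needed(field_area_m2, swath_m, speed_mps, flight_time_min, overhead_min=2.0):
    """Rough number of battery charges to cover a field with a serpentine pattern.

    Approximates total flight-line length as field area / swath width and
    reserves `overhead_min` minutes per flight for takeoff, landing, and turns.
    """
    line_length_m = field_area_m2 / swath_m
    total_min = line_length_m / speed_mps / 60.0
    usable_min = flight_time_min - overhead_min
    return math.ceil(total_min / usable_min)
```

For example, under these assumptions a 20 ha field flown at 5 m/s with a 10 m swath and 22-minute batteries works out to roughly four flights, i.e., three battery swaps.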

2.3.3. Multiple UAS Operating in Tandem

UAS-based imaging can enable 3D reconstruction/mapping of the complete experiment/field, because images of a single location are taken from different perspectives, allowing 3D reconstruction using structure from motion (SfM) [17]. However, unless imaging from multiple perspectives is done very rapidly, the effect of wind can be significant in reshaping the canopy (effectively changing features via occlusion, bending, etc.). One way to circumvent this challenge is to take multiview images simultaneously, which can be accomplished by operating multiple UAS in tandem. We foresee several promising tools and frameworks becoming available to the plant science community that could take advantage of the higher-quality 3D point clouds generated by deploying multiple UAS in tandem [18-23].

2.3.4. Policy Challenges Using UAS

There is no standard policy for operating UAS, with variations even within individual countries. This is understandable, as the UAS ecosystem is rapidly evolving. It is important for a scientist/breeder to check and conform to both national and local regulations before deploying a UAS [24-26].

2.4. UAS-Based Imaging Modalities

Aerial imaging includes plant, field, farm, and country scales using different systems from drones to satellites (Figure 1). For this article, we primarily focus on plant and field scales.

2.4.1. RGB Digital Camera

The most commonly used imaging system on UAS is an RGB (red, green, and blue) digital camera. These cameras are particularly attractive due to their low cost, low weight, and high resolution. Additionally, because they cover a similar portion of the electromagnetic spectrum to the human eye, RGB camera-based UAS image data have been successfully used to automate phenotyping of features that have traditionally been scored manually. Examples of morphological traits include height, leaf area, shape, organ detection and counting, plant density estimation, and plant/weed discrimination, among others [27-45]. Most popular UAS are integrated with an RGB camera system, thus allowing real-time image preview, seamless camera configuration management, and simple remote trigger control by the operator. Due to the tight hardware integration with the UAS, the RGB images collected are geotagged with onboard GPS data, which minimizes subsequent downstream problems with georegistration.
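Plant/soil discrimination from RGB imagery is often bootstrapped with a simple color index; a common choice is the excess green index (ExG = 2g - r - b on chromaticity-normalized channels). A minimal sketch, with an illustrative threshold value:

```python
import numpy as np

def excess_green(rgb):
    """ExG = 2g - r - b computed on chromaticity-normalized channels."""
    rgb = np.asarray(rgb, dtype=float)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2.0 * g - r - b

def vegetation_mask(rgb, threshold=0.1):
    """Boolean mask of likely-vegetation pixels; tune the threshold per dataset."""
    return excess_green(rgb) > threshold
```

The resulting mask is a typical first step before plot extraction, counting, or canopy cover estimation.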

2.4.2. Multispectral Camera

Cameras that image at a small number (usually between 3 and 10) of wavebands of the electromagnetic spectrum are called multispectral cameras. From the plant science perspective, cameras that measure the red, green, and blue bands, along with the near-infrared and red edge bands, have been widely used. This is because the reflectance of healthy vegetation peaks in the near-infrared band (around 850 nm) and changes rapidly across the red edge (around 700 nm). Thus, by combining these bands, one can compute various vegetation indices [46, 47]. More recently, multispectral cameras with dynamically selectable bands have become available. These systems are particularly promising for capturing different phenotypes that exhibit differing signatures at different wavelengths. Recent work has shown that carefully selected multispectral bands in conjunction with sophisticated machine learning (ML) tools can result in sensitive approaches to early detection of a variety of plant traits, including stress signatures [48].
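The most widely used such index is the normalized difference vegetation index, NDVI = (NIR - Red)/(NIR + Red), computed per pixel from co-registered reflectance bands; a minimal sketch:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Per-pixel NDVI from co-registered NIR and red reflectance bands."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    # eps guards against division by zero on dark pixels
    return (nir - red) / (nir + red + eps)
```

Values toward +1 indicate dense green canopy, while bare soil sits near zero.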

2.4.3. Hyperspectral (HS) Camera

Cameras that can image across a large number of wavebands of the electromagnetic spectrum are called hyperspectral (HS) cameras. Hyperspectral cameras have traditionally been used at two scales: (a) the single-plant scale and (b) the field scale. HS cameras provide significant advantages over other imaging modalities due to their wider electromagnetic spectrum coverage, enabling more diverse trait measurements. HS cameras can provide physiologically meaningful information about the biophysical and biochemical properties of crop species, as well as detection of biotic and abiotic stresses [49, 50]. A recent development in HS cameras is the commercialization of "snapshot" HS cameras in which all bands are captured simultaneously; however, this is still a developing technology in plant science applications. The availability of HS cameras that can be reliably deployed on a UAS is expected to complement high-throughput phenotyping, as they can not only provide HS information but can potentially also be used to create 3D point cloud data for each registered spectral band. However, the current challenges to deploying HS camera payloads include (a) low spatial resolution, or rather the spatial-spectral resolution trade-off; (b) high power requirements; (c) calibration, especially for field deployment under varying illumination; and (d) downstream data analytics to extract useful traits. These are areas of very active research, with viable solutions on the horizon [51-57].

2.4.4. Thermal Camera

Thermographic imaging measures the infrared part of the electromagnetic spectrum emitted by an object. This is physiologically informative because plant canopies (specifically leaves) emit thermal infrared radiation as a function of their temperature, and various abiotic and biotic stresses can be indirectly related to the infrared emission signature of the canopy: stresses (heat, drought, and biotic) can alter rates of photosynthesis and transpiration, thus affecting the canopy temperature and hence the thermal signature. Therefore, thermal imaging can be a high-throughput approach to evaluating the physiological state of the plant. However, thermal cameras have seen limited use on UAS due to difficulties including hardware integration, camera cost, and low frame rate and resolution compared to RGB cameras. Additionally, the thermal image of the field is influenced by the surroundings (presence of roads, water bodies, and buildings) and thus requires calibration. As a consequence, the use of thermal cameras deployed on UAS has seen fewer successful applications in field-based plant phenotyping than RGB imaging [58-61].
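One standard way to turn canopy temperature into a stress phenotype (a common technique, not specific to the studies cited above) is the crop water stress index (CWSI), which normalizes canopy temperature between a freely transpiring ("wet") and a non-transpiring ("dry") reference; a minimal sketch:

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop water stress index: ~0 for a well-watered canopy, ~1 under full stress.

    t_wet and t_dry are reference temperatures (deg C) for a freely transpiring
    and a non-transpiring canopy under the same ambient conditions.
    """
    if t_dry <= t_wet:
        raise ValueError("t_dry must exceed t_wet")
    return (t_canopy - t_wet) / (t_dry - t_wet)
```

The references can come from physical wet/dry panels in the field or from empirical baselines; either way, this index also depends on the radiometric calibration discussed above.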

2.4.5. LiDAR (Light Detection and Ranging)

Although earlier LiDAR-based systems were flown on planes or mounted on ground vehicles, the reduction in size and weight of LiDAR instruments makes them usable on UAS with appropriate data analytics pipelines. Since LiDAR uses lasers to create dense 3D point clouds, it can provide more detailed information than is achievable from SfM or other methods using regular digital or multispectral cameras [62]. Furthermore, LiDAR is amenable to time series tracking of object or plant organ geometries [63]. UAS-mounted LiDAR-based phenotyping has been used for the estimation of canopy biomass and plant height, for example, canopy height in winter wheat in response to nitrogen fertilizer rates [64], sugarcane biomass estimation [65], and maize height tracking in lodged plots [66]. The current challenge with routine utilization of LiDAR on UAS is the cost vs. quality trade-off of the data [67]. Additional challenges include data processing standardization and the large size of the data. LiDAR is still an emerging technology for use on UAS, and with further research, its usefulness may extend to additional traits. An in-depth review of LiDAR for plant phenotyping is provided in [68].
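A simple percentile-based sketch of plot-level canopy height from a LiDAR point cloud; the percentile choices are illustrative, and production pipelines use proper ground-point classification instead:

```python
import numpy as np

def canopy_height(z, ground_pct=2, top_pct=98):
    """Estimate canopy height for one plot from LiDAR return elevations.

    Takes a low percentile of z as the ground level and a high percentile as
    the canopy top; unlike min/max, this is robust to a few outlier returns.
    """
    z = np.asarray(z, dtype=float)
    return np.percentile(z, top_pct) - np.percentile(z, ground_pct)
```

In practice, returns are first clipped to each microplot boundary before applying a per-plot estimator like this one.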

Table 2 lays out the main types of sensors used as UAV payload. The cost, weight, resolution, and ease of use are presented in categories rather than numbers, because there are a wide range of sensors within each category with varying parameters.

Sensor | # of bands (commonly available) | Commonly covered spectrum | Cost | Weight | Resolution (megapixel) | Ease of use
RGB | 3 | 450-750 nm | Low | Low-medium | Low-high | Easy
Multispectral | 3-10 | 450-1000 nm | Medium | Low-medium | Medium | Medium
Hyperspectral | >10 | 450-1000 nm | High | High | Low | Difficult
Thermal | 1 | 3500-7500 nm | Medium | Low | Low | Medium
LiDAR | 1 | 905 nm | Medium-high | Medium-high | Medium-high | Difficult

LiDAR resolution is expressed as point cloud density rather than in megapixels. Some multiband LiDAR systems exist, but they are not routine for UAS.
2.5. Open Technical Challenges with Payload Integration

A promising approach in recent high-throughput phenotyping experiments has been to simultaneously deploy multiple imaging modalities. The motivation is to extract complementary traits using different modalities (RGB+thermal, for instance). However, there are significant technical challenges, e.g., coregistration and combined image analysis, that have to be resolved before this becomes standard. Challenges span the range from deployment to analysis and include (i) remote and simultaneous triggering of multiple separately mounted imaging systems, (ii) geotagging multiple image data streams, (iii) image coregistration/alignment between cameras and between bands, and (iv) mismatches in image resolution and signal-to-noise ratio across cameras. Resolving these challenges is an active area of research [69-71]; for example, structure from motion (SfM) tools can be used to create a georeferenced orthomosaic image for each band, followed by overlaying the distinct band information based on the geoinformation. A maintained list of platform-agnostic SfM software is available at [72].
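One building block for the band-to-band alignment step above is phase correlation, which recovers the translation between two overlapping images from their Fourier spectra; a minimal integer-pixel sketch (real pipelines add subpixel refinement and scale/rotation handling):

```python
import numpy as np

def phase_correlation_shift(ref, moving):
    """Integer (row, col) shift that, applied to `moving` via np.roll, aligns it to `ref`."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(moving))
    cross /= np.abs(cross) + 1e-12  # keep phase only (spectral whitening)
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = []
    for p, n in zip(peak, corr.shape):
        # peaks beyond the half-size wrap around to negative shifts
        shift.append(p - n if p > n // 2 else p)
    return tuple(int(s) for s in shift)
```

Because it works on whole-image spectra, the method is robust to the per-band intensity differences that defeat naive pixel matching between, say, a red and a near-infrared band.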

3. Preprocessing and Data Preparation

3.1. Ground Control Points (GCP)

Deploying UAS usually involves flight planning to ensure that the data generated can be registered. The key steps are the preparation and placement of ground control points (GCPs) and way point selection. Ground control points are visible marked targets placed on the surface of the observation field that are used to geocalibrate the UAS-based images. These targets are placed at locations premeasured by high-precision GNSS (e.g., RTK-GNSS) and are thus associated with high-precision coordinates. The availability of GCPs greatly increases the geometric accuracy of UAS-based mapping [38, 73-80]. Specifically, GCPs provide the capability to correct the latitude and longitude of all points (i.e., all collected images) to accurate GPS coordinates. This is critical for subsequently associating extracted traits with plot-level information (for instance, locating and curating data across microplots from different observation dates).

3.1.1. GCP Types

As visual targets that must be easily captured by the onboard imaging systems, GCPs should ideally (a) be clearly visible from the heights at which the UAS is deployed and (b) have precisely measured GPS coordinates. There is no set standard for GCPs; the most common are rigid boards painted with an "X" marker, a checkerboard texture, or a circular target with a center marking. There are broadly two approaches to deploying GCPs: temporary and permanent. In the temporary approach, one places and calibrates the GCPs for each UAS flight campaign. The advantage of this approach is that there are no concerns about the material quality and robustness of the GCPs; the disadvantage is the time and effort needed to place and calibrate them for every flight survey. One can also change the location of the GCPs for every flight survey according to the development stage and imaging conditions (for example, pre- versus post-canopy closure). In contrast, in the permanent approach, the GCPs are fixed for the entire growing season. The advantage here is that GCP placement is a one-time resource investment. Care has to be taken to choose GCP locations that do not hinder crop management practices while providing visual access to the GCPs across the growing season. Additionally, the GCPs must be robust enough to withstand natural weather variability. Finally, there are emerging technological solutions that provide built-in high-precision GPS capability within each GCP [81]; such high-precision, easy-to-use smart GCPs may become more common in the near future.

3.1.2. GCP Placements

The number and spatial distribution of GCPs affect the mapping accuracy of the image data. Thus, increasing the number of GCPs and distributing them evenly over the imaging area is desirable. However, as described earlier, obtaining good GCPs for large fields can be time-consuming and laborious. Several recent studies seek to identify the optimal number and spatial distribution of GCPs [73, 78, 79]. For plant breeding applications that demand accurate extraction of microplots via high-quality 3D mapping, at least 5 GCPs may suffice, with four located at the corners and one at the center of the observation field [79]. In plant breeding applications, one GCP for every 200 m2 is generally appropriate. Practitioners trying to determine the optimum number of GCPs can refer to [77] and the GUI developed for easy use [82]. There are alternatives to GCPs, and we leave it to the practitioner to decide which method works best for them and fits within their budget: ground control points (GCPs), real-time kinematic (RTK), and postprocessed kinematic (PPK) corrections are all common techniques for generating accurate UAS data products.
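The two rules of thumb above (at least five GCPs; roughly one per 200 m2) combine into a quick planning helper; a sketch under exactly those assumptions:

```python
import math

def recommended_gcp_count(field_area_m2, area_per_gcp_m2=200.0, minimum=5):
    """GCP count from the rules of thumb in the text: one per ~200 m^2,
    never fewer than 5 (one at each corner plus one at the field center)."""
    return max(minimum, math.ceil(field_area_m2 / area_per_gcp_m2))
```

So a 600 m2 trial still needs the five-point corner-plus-center layout, while a 2000 m2 field calls for about ten GCPs.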

3.2. Way Point Selection and Placement

It is usually ideal to break up the flight path of the UAS into distinct "legs" of traversal (flight), with clearly identified start and end points in space (location and height). These points are called way points, and the strategy of the UAS following a sequence of way points is called way point routing. Among other advantages, way point routing ensures that the flight mission is repeatable, safe, and accurate. The predesigned way points record the GPS and inertial measurement unit (IMU) data, as well as camera action commands, thus ensuring that the UAS follows the predesigned flight automatically. Various software tools are available for way pointing that abstract away the complexities via easy-to-use graphical user interfaces: the software generates the way points from user-entered camera parameters, such as focal length and sensor width, and the flight altitude or desired ground sampling distance (GSD). A partial list of such software is given in Table 3.

Software name | Supported UAS | Manufacturer or 3rd party | Cost | Note | Mapping function integrated | Website
Aerobotics flight planner tower | Autopilot board | 3rd party | Free | Dev is not active now; works for Pixhawk series | No | [83]
Altizure | DJI | 3rd party | Free | Provides 3D product visualization platform | Yes | [84]
Autopilot for DJI drones | DJI | 3rd party | $ | Provides flight recorder | No | [85]
DJI GS Pro | DJI | Manufacturer | Free | Needs to pay for additional functionalities | No | [86]
Drone Harmony Mission Planner | DJI | 3rd party | $ | Provides full 3D intuitive interface | Yes | [87]
DroneDeploy | DJI | 3rd party | Free | Needs to pay for additional functions; provides live map | Yes | [88]
eMotion | senseFly | Manufacturer | $ | Needs basic knowledge of UAS to connect with UAS; works only with the manufacturer's UAS | No | [89]
Intel® Mission Control Software | Intel® Falcon™ 8+ UAS | Manufacturer | $ | Needs basic knowledge of UAS to connect with UAS; functions only with the manufacturer's UAS | No | [90]
Litchi for DJI | DJI | 3rd party | $ | Needs additional mission planner | No | [91]
Map Pilot for DJI | DJI | 3rd party | $ | Needs to pay for additional functionality | Yes | [92]
mdCockpit app | Microdrones | Manufacturer | Free | Needs basic knowledge of UAS to connect with UAS; functions only with the manufacturer's UAS | No | [93]
Mission Planner | Autopilot board | 3rd party | Free | Needs basic knowledge of the autopilot board (i.e., Pixhawk series) with Ardupilot or Px4 (or any other autopilot that communicates using the MAVLink protocol) | No | [94]
Pix4Dcapture | DJI; Parrot; Yuneec | 3rd party | Free | Supports upload to Pix4D cloud | Yes | [95]
QGroundControl | Autopilot board | 3rd party | Free | Needs basic knowledge of the autopilot board (i.e., Pixhawk series) with Ardupilot or Px4 (or any other autopilot that communicates using the MAVLink protocol) | No | [96]
UgCS | DJI; autopilot board (i.e., Pixhawk series with Ardupilot or Px4); Yuneec; MikroKopter; MicroPilot; Microdrones; Lockheed Martin | 3rd party | $ | Needs basic knowledge of UAS to connect with UAS | Yes | [97]

Some practical considerations when selecting way points include the desired spatial resolution and the quality of the 3D mapping. The spatial resolution is related to the flight altitude and camera characteristics and must be carefully considered for individual phenotyping exercises. For educational purposes, given the flight altitude H [m], the camera sensor (CMOS) width S [m], the corresponding image width N [pixels], and the focal length of the camera f [m], the spatial resolution (ground sampling distance) can be calculated as GSD = (H × S)/(f × N) [m/pixel]. The quality of the 3D mapping requires that the images captured by the UAS have high overlap [73, 77, 98-100]. However, higher overlap increases the flight duration significantly, thus limiting coverage. For dense vegetation and fields, at least 85% frontal and 70% side overlap is recommended to ensure good 3D mapping [101]. For easy-to-use calculations and estimations of flight time, we refer to the online mission planner tool [102].
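The spatial-resolution calculation and overlap recommendations above translate directly into flight-line spacing and camera trigger distance; a sketch using the standard GSD relation GSD = (altitude × sensor width)/(focal length × image width in pixels):

```python
def ground_sampling_distance(altitude_m, sensor_width_m, image_width_px, focal_length_m):
    """GSD [m/pixel] from altitude, sensor width, image width, and focal length."""
    return altitude_m * sensor_width_m / (focal_length_m * image_width_px)

def ground_footprint(altitude_m, sensor_dim_m, focal_length_m):
    """Width (or height) on the ground [m] covered by a single image."""
    return altitude_m * sensor_dim_m / focal_length_m

def line_spacing(footprint_m, overlap):
    """Spacing between adjacent flight lines (side overlap) or triggers (frontal overlap)."""
    return footprint_m * (1.0 - overlap)
```

For example, a 13.2 mm wide sensor with an 8.8 mm lens flown at 40 m yields roughly a 1.1 cm GSD and a 60 m swath, so 70% side overlap implies 18 m between flight lines.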

3.3. Calibration
3.3.1. Color Calibration: Approaches and Challenges

Most popular UAS with built-in imaging units come with an RGB color camera, although researchers also use specialized tri-band cameras that include the near infrared, particularly when estimating vegetation indices. While RGB cameras provide high-resolution images of the observation fields, variation in illumination as well as differences in camera hardware can result in the same scene being captured with slightly different colors. This calls for color calibration, the process of adjusting pixel color values in images to consistent values. Color calibration is especially important if the phenotype of interest is evaluated based on color. This is the case for most plant stress detection and quantification; for example, for iron deficiency chlorosis (IDC) in soybean, evaluation of symptoms is based on the extent of chlorosis (yellowing) and necrosis (browning) [103, 104]. Additionally, any comparative assessment between images from multiple UAS imaging times requires color calibration.

In field imaging via UAS, several factors affect pixel data, including illumination intensity, the angle of the incoming light source, the spectral reflectance of the objects, the relative position of the camera to the objects, and camera optical characteristics [105]. A common color calibration approach is to place a physical color calibration chart in the field, so that the UAS concurrently images the chart while collecting data. This allows postflight standardization of the images based on the image characteristics of the color chart captured by the UAS [106]. However, even with color calibration, care has to be taken to account for camera-specific variabilities (such as gamma correction [107-109]). Another physical constraint is that not every aerial shot can contain the calibration chart; a common assumption is that the imaging configuration remains constant for the period during which the aerial shots do not include the chart. In this situation, the RGB digital cameras deployed on UAS are best used to extract morphological traits like height, shape, area, and counts instead of color-related traits that require parsing out subtle differences between genotypes.
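The chart-based standardization described above is often implemented as a per-image affine color transform fit by least squares from the chart patches; a minimal sketch that ignores gamma and other nonlinearities mentioned above:

```python
import numpy as np

def fit_color_correction(observed, reference):
    """Fit a 4x3 affine map (3x3 channel mix + offset) taking observed chart
    patch RGB values to their known reference values, via least squares."""
    obs = np.hstack([np.asarray(observed, float), np.ones((len(observed), 1))])
    M, *_ = np.linalg.lstsq(obs, np.asarray(reference, float), rcond=None)
    return M

def apply_color_correction(pixels, M):
    """Apply a fitted correction to an (n, 3) array of RGB pixels."""
    px = np.hstack([np.asarray(pixels, float), np.ones((len(pixels), 1))])
    return px @ M
```

Fitting one such transform per flight (or per image containing the chart) brings colors from different dates onto a common footing before scoring color-based traits.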

3.3.2. Spectra Calibration: Approaches and Challenges

When using multi- or hyperspectral cameras on UAS, sensor calibration is usually carried out to ensure that each pixel faithfully captures the data across the full spectral bands, thus producing reliable reflectance datacubes. In general, for agricultural research and breeding applications, calibrated reflectance datacubes provide sufficient information for subsequent physiologically meaningful analysis and decision support. Calibration of the camera itself is a complicated procedure that is usually handled by the manufacturer; see [110]. The conversion of the calibrated camera recordings to reflectance values is usually performed using reflectance reference targets on the field. These targets have known spectral reflectance and are used to transform the camera readings into calibrated reflectance values [4, 111-115]. The standard approach to processing these data is called the empirical line method (ELM). Recent work suggests that ELM-based reflectance computation is suitable for flights under 30 minutes in stable weather conditions [116]. Care has to be taken to ensure that no significant illumination changes occur within each flight.
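Per band, the ELM reduces to fitting a line through the known reflectances of (at least) two reference panels; a minimal two-panel sketch:

```python
def empirical_line(dn_dark, dn_bright, refl_dark, refl_bright):
    """Per-band empirical line method with two reference panels.

    Returns (gain, offset) such that reflectance = gain * DN + offset, where
    DN is the camera's raw digital number for that band.
    """
    gain = (refl_bright - refl_dark) / (dn_bright - dn_dark)
    offset = refl_dark - gain * dn_dark
    return gain, offset
```

With more than two panels, the same per-band fit is done by least squares instead of the two-point line.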

3.4. Software and Cyberinfrastructure

UAS campaigns can amass large amounts of data fairly quickly. Therefore, having a well-defined data management strategy that facilitates multiple analysis workflows and subsequent integration of output data with decision support systems is essential (Figure 2).

The landscape of service providers that offer turnkey solutions is evolving rapidly (Table 4); at the same time, academic groups are producing ready-to-use open-source analysis workflows powered by deep learning methods [117]. A responsive cyberinfrastructure that can effectively leverage both commercial and academic offerings, while scaling (up and down) as the needs of the project evolve, is paramount. Publicly supported research cyberinfrastructures in the US, such as NSF CyVerse [118], XSEDE [119], and OpenScienceGrid [120], support the processing and hosting of nationally funded US-based research. Commercial cloud-based turnkey solutions for UAS data management, analysis, and team-based collaboration provide easy-to-use integrated viewers, applications, and app stores (Table 4). Many of these offerings cap allowable storage per tier and may not be ideal for large-scale, long-term archival storage. Commercial cloud providers (for example, AWS, Google, and Azure) provide services for managing data through tiered storage and lifecycle management (from highly redundant to slower long-term archival). This allows data to migrate between tiers in an automated and cost-effective manner, and these capabilities can complement local IT resources where feasible [121–123]. However, institutions may have restrictions on the use of some services and platforms, and this needs to be determined at the planning stage of experiments.

Software | Parent | Commercial vs. open | Website
3D Zephyr | 3D Flow | $ | [127]
Drone2Map | ESRI Inc. | $ | [128]
DroneDeploy | DroneDeploy Inc. | $ | [129]
Farmers Edge | Farmers Edge Inc. | $ | [130]
FlytBase | FlytBase Inc. | $ | [131]
Metashape | Agisoft LLC | $ | [132]
OneDroneCloud | Terra Imaging LLC | $ | [133]
Pix4D | Pix4D Inc. | $ | [136]
Remote Expert | DroneMapper | $ | [138]

3.4.1. Software

UAS-related scientific software can be broken down into several categories: (a) UAS flight control and sensor orchestration (see earlier section), (b) passive sensor (i.e., imagery) image processing and analysis, (c) active sensor (i.e., LiDAR) processing and analysis, (d) statistical and analytical GIS, and (e) data management and collaboration. In general, the more expensive solutions offer complete integration of the UAS, sensors, and analytical image analysis pipelines via cloud processing services. Software can be open source or commercial; open-source solutions tend to be more granular, offering components of the UAS analysis pipeline with varying levels of integration and interoperability.

(1) Open-Source Software. The OpenDroneMap (ODM) project [124] supports an open “ecosystem of solutions for collecting, processing, analyzing and displaying aerial data; and to build strong, self-sustaining communities around them.” OpenDroneMap includes a stand-alone program, a web interface, an API, and connectivity to multinode cloud processing options. ODM outputs can be uploaded to OpenAerialMap.

(2) Commercial Software. The UAS surveying industry for civil infrastructure is the most lucrative and largest sector for software development. Much of this software is packaged as part of UAS surveying ecosystems (Table 4). Example solutions include SenseFly [125] and ESRI Drone2Map, which have partnered with Pix4D (Pix4Dfields, [126]) and DroneDeploy, respectively. Other example software for image processing and structure from motion with multiview stereo (SfM-MVS) photogrammetry includes Agisoft Metashape. Most commercial software (e.g., Pix4D and Agisoft) can be run on bare metal or cloud infrastructure, in single-node or multinode configurations.

3.4.2. Database Management Strategies

UAS campaign data are typically acquired on removable flash-based memory cards and often transferred to field-based laptops that are synchronized to more permanent storage resources such as file servers and cloud storage. Maintaining a catalog that allows locating files that are offline (on cards or USB drives) or spread across multiple systems is essential. Cataloging software, such as abeMeda [140] and NeoFinder [141], can be used to keep track of data distributed across different storage media and can be coupled with cloud backup software to provide recovery if needed.

Common UAS data file types include orthomosaic rasters (e.g., TIFF, GeoTIFF, HDF5, and NetCDF) of spectral indices, as well as dense point clouds (e.g., las, laz, bil, and ply). UAS datasets are highly heterogeneous and epitomize the “long tail” of research data. Unstructured data are typically the largest and also the least informative. Unstructured data, stored on local hard disks or in cloud-based object storage (buckets), have significant input-output (IO) requirements, which make moving, reading, or writing large datasets slow and impractical at scale. Critically, UAS data are also at risk of becoming “dark data” [142], either lost or rendered unusable by the rest of the science community. To make large quantities of data more available for analyses, these data need to be given structure in the form of an index. Structured indices, e.g., PostgreSQL with the PostGIS extension [143], MongoDB [144, 145], and ElasticSearch (based on Apache Lucene) [146], allow rapid search and query of UAS data. Indexing of UAS data is critical to its findability, accessibility, and reuse; however, it requires dedicated cyberinfrastructure hardware for hosting the indices, along with technical expertise. Recent work has focused on extracting traits from images while reducing data size and storage needs [147].
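As a toy illustration of giving UAS data structure through an index, the sketch below uses Python's standard-library SQLite with a plain bounding-box table, rather than the PostGIS/MongoDB/ElasticSearch options named above; the schema and field names are hypothetical.

```python
import sqlite3

def build_index(records):
    """Index flight rasters by date and geographic bounding box so files can
    be found by query rather than by scanning storage media."""
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE flights (
        path TEXT, flight_date TEXT,
        min_x REAL, min_y REAL, max_x REAL, max_y REAL)""")
    con.executemany("INSERT INTO flights VALUES (?, ?, ?, ?, ?, ?)", records)
    return con

def query_bbox(con, min_x, min_y, max_x, max_y):
    """Return paths of rasters whose extent overlaps the query rectangle."""
    cur = con.execute(
        "SELECT path FROM flights WHERE max_x >= ? AND min_x <= ? "
        "AND max_y >= ? AND min_y <= ?",
        (min_x, max_x, min_y, max_y))
    return [row[0] for row in cur]
```

A production deployment would use a true spatial index (e.g., PostGIS) for the overlap test, but the principle — query the index, not the files — is the same.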

Enterprise processing software (e.g., ESRI, Pix4D, and Agisoft) offers cloud storage at additional cost. OpenAerialMap provides hosting for limited extents. Cloud-based providers, e.g., DroneDeploy and FarmersEdge, offer enterprise solutions for raw image and orthomosaic data management. These solutions are likely the easiest for novice UAS operators to use, but are more expensive than hosting one's own services at scale for a mid- to large-scale research effort, e.g., a regional research laboratory or national research branch. Research needs differ from commercial offerings in several distinct ways, including the need to maintain and curate data (often in perpetuity) and to provide provenance and sharing so that findable, accessible, interoperable, reusable (FAIR) data principles are met [148, 149].

3.4.3. Data Sharing and FAIR Principles

While collecting UAS-based data is important, extracting actionable scientific insight calls for good data curation, storage, sharing, and reuse [150]. This is especially true if substantial resources are expended in collecting large quantities of UAS-based imaging data, which can be used by multiple groups to answer complementary research questions. This requires adhering to metadata standards that are consistent with community-established needs. We encourage practitioners to consider reviewing best practices from the Open Geospatial Consortium (OGC) unmanned systems working group [151], as well as others, e.g., Sensor, Observation, Sample, and Actuator (SOSA) ontology [152] and dronetology [153].

3.4.4. Integration with External Systems and Extensibility

Analysis pipelines and workflows for UAS data range from “intricate” to “bespoke” by virtue of their specific use cases, the number of processing steps required, and preferred software. It is fairly common to exceed the computational resources available on a single server or workstation as the amount of data increases. Solutions require workflow management systems (WMS) that can distribute tasks among distributed (external) computational resources (clouds, HPC, etc.) and manage execution and recovery from failures while processing large volumes of data. WMS also afford the necessary reproducibility [154] by keeping track of the input parameters and processed outputs for every step, with the ability to perform advanced analyses that require parameter sweeps, e.g., building models for ML applications. Example mechanisms for reproducibility include SDKs and APIs such as the Pix4DEngine, the Agisoft Metashape Python or Java pipeline, and the OpenDroneMap ecosystem. Examples of WMS include ArcGIS Workflow Manager, Dask [155], Makeflow, and WorkQueue [155, 156].
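A minimal sketch of the reproducibility bookkeeping a WMS provides: hash each step's parameters into a result key, so reruns with identical inputs are skipped and every output is traceable to the exact parameters that produced it. This toy class only stands in for the far richer systems (Dask, Makeflow, etc.) named above.

```python
import hashlib
import json

class Workflow:
    """Toy workflow manager: caches step outputs keyed by a hash of the
    step name and parameters, and records provenance for each result."""

    def __init__(self):
        self.results = {}      # key -> step output
        self.provenance = {}   # key -> (step name, params)

    def run(self, name, func, params):
        key = hashlib.sha256(
            json.dumps({"step": name, "params": params},
                       sort_keys=True).encode()).hexdigest()
        if key not in self.results:   # skip recomputation on rerun
            self.results[key] = func(**params)
            self.provenance[key] = (name, params)
        return key

    def sweep(self, name, func, grid):
        """Parameter sweep: one cached, provenance-tracked run per setting."""
        return [self.run(name, func, p) for p in grid]
```

Real WMS add distributed execution and failure recovery on top of this pattern, but the parameter-to-output bookkeeping is the part that makes an analysis reproducible.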

Data derived from UAS analysis are often shared with stakeholders and users not conversant with UAS data products. The ability to rapidly review, iterate, and share data products, as well as gather and track user feedback, is important to improve data management. Use of online web services for data visualization can help to increase the speed at which teams can share and use data with tools like GoogleMaps API, QGIS Web Client Survey, and ArcGIS Online. Use of productivity applications for task management (e.g., Trello), source code repositories (e.g., GitHub), documentation (e.g., Atlassian Wiki, Read the Docs), and concurrent document editor (e.g., Overleaf and Google Docs) is central to ensuring the required productivity in groups with varied levels of expertise and familiarity. While many commercial turnkey solutions provide these capabilities as part of their integrated platform, utilizing a good data and analysis management strategy will allow the inclusion of more applications in any analysis pipeline through use of URI, webhooks, and API calls provided by each of these applications.

4. UAS-Based Imaging of Plant Traits

The combination of spectral wavebands and other predictor traits with ML-based analytics has shown utility in crop yield and physiological trait measurement and prediction [157, 158]. Similarly, integration of crop, genetic, and weather parameters has proven useful for crop yield prediction using deep learning [159]. Ground robot-based organ-level phenotyping in soybean has also shown success in field conditions [160]. These are just a few examples of the value of UAS-based phenotyping for increasing the scale of phenotyping to improve crop yield. Broadly speaking, UAS-based remote sensing can be used to phenotype numerous traits, including (i) performance traits such as yield and its components, canopy biomass, growth and development, and physiological and morphological traits; (ii) plant health traits such as abiotic and biotic stresses; and (iii) chemistry traits such as sugars, proteins, metabolites, and high-value chemicals. Figure 3 provides a schematic outline of the entire UAS-based pipeline that enables generation of plant trait information for breeding and research, as well as crop production applications. In Table 5, we cover recent literature with a focus on performance and plant stress traits; chemistry traits are also amenable to UAS-based phenotyping, although the literature on using UAS for metabolite and chemical phenotyping is sparse (see, for example, [161]). More information specific to plant stress digital phenotyping can be found in [162–164].
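Many of the index-based analyses in Table 5 reduce to simple band arithmetic followed by plot-level aggregation. As a sketch (the small epsilon guarding against division by zero is an implementation choice, and `plot_mean_index` assumes a plot mask has already been obtained, e.g., by plot segmentation):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red band arrays."""
    return (nir - red) / (nir + red + 1e-9)

def gndvi(nir, green):
    """Green NDVI, computed the same way from the NIR and green bands."""
    return (nir - green) / (nir + green + 1e-9)

def plot_mean_index(index_map, plot_mask):
    """Aggregate a per-pixel index to a single plot-level trait value."""
    return float(index_map[plot_mask].mean())
```

The plot-level values can then feed correlation, heritability, or ML analyses of the kinds listed in the table.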

ICQP | Type of plant trait | UAV type | Flight altitude (m) | Image resolution | Plant species | Plant trait analysis/model | Sensor on UAV | Plant phenotype | Ref.
Classification | Morphological and physiological | Multirotor | 30 | - | Vineyard | ANN | Multispectral sensor | Stem water potential, water stress | [172]
Quantification | Physiological | Multirotor | 50 | ~2.2 cm and 1.11 | Winter wheat | ANN, SVM, RF, BBRT, DT, MLR, PLSR, and PCR | Hyperspectral and RGB | Aboveground biomass (AGB) | [173]
Quantification | Physiological | Multirotor & fixed-wing | 40 | - | Forest, soybean, sorghum | ANOVA, correlation and heritability | Thermal imaging | Water stress | [58]
Quantification | Physiological | Multirotor | 80 | 1.51 cm per pixel | Maize | Broad-sense heritability and genetic correlations | RGB | Crop cover and senescence | [174]
Quantification | Physiological | Multirotor | 30 | 0.5 cm | Potato | Correlation, RF | RGB | Crop emergence | [175]
Identification | Morphological trait | Multirotor | 75 | 5 cm/pixel | Citrus trees | DCNN | Multispectral | Counting trees | [176]
Quantification | Morphological | Multirotor | 40 and 50 | 13 and 10 mm/pixel | Sorghum | Genomic prediction | RGB or near-infrared green and blue (NIR-GB) | Plant height | [27]
Quantification | Physiological, abiotic stress | Multirotor | 50, 120 | 7.2, 3 cm/pixel | Dry beans | GNDVI, correlation | Multispectral | Seed yield, biomass, flowering, drought | [177]
Classification and quantification | Physiological | Multirotor | 25 | 1.5–3.5 cm per pixel | Wheat | Heritability, correlation and GWAS | RGB and multispectral | Lodging | [178]
Quantification | Morphological and physiological trait | Multirotor | 50 (snapshot), (digital) | - | Wheat | Linear regression, RF, PLSR | RGB, spectroradiometer, and snapshot hyperspectral sensor | Crop height, LAI, biomass | [179]
Quantification | Physiological | Multirotor | 30, 40 | 2.5, 2.8 cm | Bread wheat | Linear regressions, correlation matrix, and broad-sense heritability | Multispectral | Senescence | [180]
Quantification | Physiological | Multirotor | 75 | 5 cm/pixel | Cotton | Mixed linear model | Multispectral | Crop WUE | [181]
Quantification | Physiological | Multirotor | 50 | - | Maize | Multitemporal modelling | 3D imaging and RGB | AGB | [182]
Quantification | Biotic stress | Multirotor | - | 0.8 cm | Potato | Multilayer perceptron and CNN | RGB and multispectral | Late blight severity | [183]
Quantification | Morphological | Multirotor | 3-8 | - | Blueberry bush | Multivariate analysis | RGB | Height, extents, canopy area and volume, canopy width, and diameter | [184]
Quantification | Biotic stress | Multirotor | 5.5, 27 | - | Rice | NDVI and correlation | RGB and multispectral | Sheath blight | [185]
Quantification | Abiotic stress | Multirotor | 13 | 0.5 and 1.12 cm | Tomato | OBIA | RGB and multispectral | Salinity stress, plant area | [186]
Quantification | Biotic stress | Multirotor | 15 | 0.6 cm | Cotton | OBIA | RGB | Cotton boll | [187]
Identification | Biotic stress | Multirotor | 30, 60 | 0.01-0.03 m/pixel | Sunflower | OBIA | RGB, multispectral | Weed | [188]
Quantification | Physiological and morphological | Multirotor | 20 | 6-8 mm | Eggplant, tomato, cabbage | RF and support vector regression | RGB images | Crop height, biomass | [189]
Classification | Biotic stress | Fixed-wing | 150 | 0.08 m/pixel | Vineyard | Receiver operator characteristic analysis | Multispectral | Flavescence dorée, grapevine trunk diseases | [190]
Quantification | Morphological | Fixed-wing | >100 | 2.5, 5, 10, 20 cm | Maize | Regression | RGB | Height | [80]
Quantification | Morphological | Multirotor | 50, 29, 13 | 0.01 m | Cotton | Regression | RGB | Height | [191]
Quantification | Morphological | Multirotor | 52.5 | 1.13 cm/pixel | Maize | Regression | RGB | Plant height | [192]
Quantification | Physiological | Multirotor | 35, 70, 100 | 0.54, 1.09, and 1.57 cm | Barley | Regression analysis | RGB | Lodging severity, canopy height | [193]
Quantification | Physiological | Multirotor | 7 | 6 mm | Wheat | Regression analysis | RGB | Seed emergence | [194]
Quantification | Morphological and physiological | Multirotor | - | - | Wheat | Regression analysis | RGB images | Canopy traits | [195]
Quantification | Morphological | Multirotor | 30 | 2.5 cm/pixel | Bread wheat | Regression, QTL mapping, and genomic prediction | RGB camera and 4 monochrome sensors (NIR, red, green, and red-edge) | Plant height | [196]
Quantification | Morphological | Multirotor | 25 | - | Oilseed rape | RF, regression analysis | RGB and multispectral | Flower number | [197]
Identification | Biotic stress | Multirotor | 1, 2, 4, 8, 16 | - | Soybean | SVM, KNN | RGB | Foliar diseases | [198]
Quantification | Morphological | Multirotor | 30, 50, 70 | - | Lychee crop | Tree height, crown width, crown perimeter, and plant projective cover | Multispectral | Crop structural properties | [199]
Quantification | Physiological | Multirotor | 40, 60 | - | Maize | Univariate and multivariate logistic regression models | RGB and multispectral | Lodging | [200]
Quantification | Biotic stress | Multirotor | 80 | - | Beet | Univariate decision trees | Hyperspectral | Beet cyst nematode | [201]
Quantification | Biotic stress | Multirotor | - | - | Peanut | Vegetation index | Multispectral | Spot wilt | [202]
Quantification | Morphological and physiological traits | Multirotor | 20 | - | Cotton | Vegetation index, SVM | Multispectral | Plant height, canopy cover, vegetation index, and flower | [203]
Quantification | Physiological | Multirotor | 150 | 8.2 cm | Wheat | Vegetative index | Multispectral | LAI | [204]
Identification | Biotic stress | Multirotor | ~10 | - | Radish | VGG-A, CNN | RGB | Fusarium wilt | [205]

While the majority of these studies used higher flight altitudes (>25 m), the UAS types used are predominantly multirotor, and most utilize non-ML approaches for analysis. The prevalence of multirotor platforms in recent literature could reflect a study bias, as these papers describe research experiments. Due to the constraints of payload weight and battery drain, it is likely that in precision and digital agriculture applications, fixed-wing and high-altitude UAS will be desirable to cover large tracts of land with trait-dependent pixel resolution, complemented by significant advancements in sensor hardware. Because this review focuses on research and breeding applications, we do not delve deeper into precision and digital agriculture; however, the principles remain broadly consistent. With the continual push toward image-based phenotyping in research, breeding, and digital agriculture, pixels will only become more important, as future work attempts to extract more information per pixel for additional trait estimation at finer granularity.

ML methods have been successfully utilized at multiple scales, for example, at the microscopic level for nematode egg counting [165], for organ or object detection in the canopy [160, 163, 166] or in roots [167–170], for yield prediction [157–159], for disease identification and quantification [48, 49], and for abiotic stress identification and quantification [103, 104]. Tools are also being developed to reduce plant scientists' barrier to entry for using ML in phenotyping tasks [171]. Given the robust set of examples where ML has been successfully used in crop trait phenotyping with ground-based systems, transferring these methods to UAS-based phenotyping and trait extraction should be less cumbersome.

UAS-based phenotyping systems provide many attractive features to advance crop breeding and research. These include simultaneous phenotyping of multiple traits, assessment of larger genetic panels, mapping of more complex traits including canopy shape, rapid phenotyping that saves time and resources, time series data collection, and improved measurement accuracy. With the current software and platforms, the barrier to entry has been significantly reduced. In this review article, we covered deployment, data collection, curation, and storage, while not focusing on data analytics, since the latter has been covered elsewhere. Advanced data analytics, particularly machine learning and especially deep learning approaches, have transformed UAS-based applications in multiple domains including the plant sciences, as they allow extracting complex, nonlinear, and hierarchical features from multiple sensors, including but not limited to digital, multi-, and hyperspectral cameras. Machine learning for plant phenotyping has been covered previously in review articles [117, 162, 206].

We conclude this review by identifying three broad classes of challenges that currently bottleneck increased and diverse use of UAS for plant phenotyping:

5.1. Challenges Associated with Information Constraints

The amount of useful information that can be extracted from the UAS payload determines the utility of the phenotyping exercise. Some of the pressing challenges associated with extracting viable information from UAS payloads include:

(a) Low resolution: UAS have lower resolution when compared to ground-based digital phenotyping campaigns. Especially with multispectral and hyperspectral imaging, the lower (spatial and spectral) resolution of UAS limits extraction of fine-scale features at the individual plant scale. Promising approaches rely on concepts of spatial and spectral superresolution, as well as PAN sharpening. Ongoing research seeks to obtain more information per pixel using these strategies [207–209], which will enable more traits to be estimated with better granularity. We envision that superresolution and PAN sharpening will become more prominent, as they attempt to infer subpixel information from data and map between low- and high-resolution images collected from different UAS. These developments will also advance remote sensing capabilities toward proximal-level sensing, including with smartphones [104].

(b) Coregistering multiple sensors: complex traits can be extracted if multiple sensors (thermal, RGB, multispectral) measure the same object. However, with sensors exhibiting different measurement frequencies as well as spatial resolutions, accurately coregistering the sensor streams is an important prerequisite for viable trait extraction. Physical in-field controls and ML-based semantic segmentation and registration tools will be needed to perform seamless coregistration of data coming from different sensors. This also creates further complexity in data fusion, both for real-time in situ processing and for offline, deferred analytics. While not necessarily a constraint of UAS per se, this is an important factor for downstream image analysis and trait extraction.

(c) Standardizing methods for complex trait extraction: a persistent challenge remains our (lack of) ability to evaluate protocols for trait extraction without very resource-intensive ground truthing. This is especially true for the conversion of 2D images into 3D point clouds. For instance, a presumably simple trait like canopy height remains a challenge: there is (not yet) a standard approach to height calculation based on SfM [28, 30, 36, 40, 210, 211], owing to issues of wind, quality of 3D point reconstruction, and the lack of consistent approaches to evaluating developed techniques. This issue is exacerbated for more complex canopy traits (especially time series data) due to wind effects and occlusion, as well as complex plant organs. Recent approaches to overcome this challenge use LiDAR in conjunction with SfM. Coupling ground robotic systems [212] with UAS may also be desirable to phenotype traits obscured from the UAS.

(d) Automated plot segmentation and labeling: another active area of research is plot segmentation, with minimal previous work on automatic microplot segmentation using UAS data. Generally, a polygon for each plot is drawn manually or semiautomatically using GIS-based software such as QGIS or ArcGIS [30, 174, 210]; a fully automated solution is therefore desirable, especially in a breeding program that involves thousands to hundreds of thousands of plots [14].

(e) ML and DL problems: ML and DL methods for plant phenotyping are an active area of research, and we suggest readers interested in this analysis refer to [162, 164, 206] as a starting point. While ML and DL are useful tools for UAS phenotyping, care must be taken to ensure that the data and the problems being solved are compatible with these methods (this includes large data size and variability). An appropriate choice between supervised and unsupervised ML methods is also crucial; supervised learning needs large labeled sets, and in such cases, active learning may be useful [213].
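For challenge (c), one common (though, as noted, unstandardized) recipe summarizes a canopy height model (the surface model minus the terrain model) with a high percentile per plot, to reduce sensitivity to soil pixels and reconstruction noise. The percentile choice below is illustrative, not a community standard.

```python
import numpy as np

def plot_height(dsm, dtm, plot_mask, percentile=95):
    """Plot-level canopy height: canopy height model (CHM) = DSM - DTM,
    summarized by a high percentile over the pixels inside the plot."""
    chm = dsm - dtm                 # per-pixel height above ground
    heights = chm[plot_mask]        # restrict to the plot polygon's pixels
    return float(np.percentile(heights, percentile))
```

Precisely because choices like the percentile, the terrain model source, and outlier handling differ between studies, reported SfM heights are hard to compare, which is the standardization gap the text describes.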

5.2. Challenges Associated with Power Constraints

Due to current battery power limitations of UAS, large fields cannot be phenotyped efficiently. The current solution for covering a large field is to change the battery frequently, but this requires increased investment in batteries and, additionally, opens up issues of consistency caused by reboots of onboard sensors. Several potential approaches are being explored to circumvent this constraint:

(a) These include (i) onboard energy harvesting to extend flight capacity [10, 11], (ii) in situ processing to reduce storage requirements [214], and (iii) environment-aware flight planning to maximize the time the UAS can stay aloft [77]. Additionally, mobile charging stations built on solar and other renewable energy sources have the potential to overcome power constraints and increase operational flexibility.

(b) Development of new sensors that integrate multiple capabilities, along with improved GPS systems, is also needed. As battery efficiency continually improves, sensors and onboard processing units with reduced energy demand are needed to overcome the hardware constraint.

(c) Another promising option is swarm UAS systems [215], in which multiple UAS autonomously traverse the field, collect data, perform data fusion (from multiple sensors), and provide improved overlap and, hence, increased area coverage [216]. However, regulation currently prevents autonomous UAS swarm flights in many countries, including the USA. In this context, we note that Oklahoma State University recently received approval for one pilot to operate multiple UAS in the national airspace.
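To make the battery constraint concrete, a rough estimate of the number of battery swaps for a lawnmower-pattern survey can be computed from swath width, side overlap, speed, and endurance. All parameter names, the reserve margin, and the simplified geometry (turns ignored) are illustrative assumptions.

```python
import math

def batteries_needed(field_w_m, field_h_m, swath_m, overlap, speed_ms,
                     endurance_min, reserve_min=2.0):
    """Estimate battery swaps for a back-and-forth survey: side overlap
    shrinks the effective swath, and total track length sets flight time."""
    eff_swath = swath_m * (1.0 - overlap)            # ground covered per pass
    n_passes = math.ceil(field_w_m / eff_swath)
    track_m = n_passes * field_h_m                   # turns neglected
    flight_min = (track_m / speed_ms) / 60.0
    usable_min = endurance_min - reserve_min         # keep a landing reserve
    return math.ceil(flight_min / usable_min)
```

Even modest fields can demand multiple batteries once overlap requirements for SfM are factored in, which motivates the mitigation strategies listed above.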

5.3. Challenges Associated with Policy Constraints: UAS Operation Certification and Policy Advances

As the application of UAS rapidly gains prominence in multiple disciplines, there is a need for a cohesive voice from practitioners to help shape policies around certification and utilization. For example, flights near restricted airspace can be a challenge for production or research fields in the vicinity of such spaces. Additionally, there are limitations on UAS usage, such as the delivery of crop protection products in commercial fields. With continual advancements in payload and sensor capabilities, we expect policies to be modified to further the use of UAS for agricultural applications; most research and breeding programs, however, do not currently face this constraint. We advocate for greater involvement of practitioners to enable appropriate framing of policy.

We conclude by emphasizing that UAS are a versatile and powerful platform for high-throughput phenotyping. While challenges remain, current developments suggest a very promising future for deploying these systems across a diverse array of plant phenotyping tasks.

Abbreviations

AGB: Aboveground biomass
ANN: Artificial neural network
BBRT: Boosted binary regression tree
CNN: Convolutional neural network
DT: Decision tree
DCNN: Deep convolutional neural network
GWAS: Genome-wide association study
GNDVI: Green normalized difference vegetation index
KNN: k-nearest neighbor
LAI: Leaf area index
MLR: Multivariable linear regression
NDVI: Normalized difference vegetation index
OBIA: Object-based image analysis
PLSR: Partial least squares regression
PCR: Principal component regression
RF: Random forest
SVM: Support vector machine
VGG: Visual geometry group
WUE: Water use efficiency.

Conflicts of Interest

The authors declare no competing interests.

Authors’ Contributions

W.G., A.K.S., and B.G. conceptualized the paper. M.E.C., W.G., A.K.S., and B.G. wrote the first draft. All authors contributed to writing and reviewing the paper. Wei Guo and Matthew E. Carroll are joint first authors.

Acknowledgments

We thank all members of the ISU’s Soynomics team for their feedback on this work. We also thank all technical specialists of the Institute for Sustainable Agro-ecosystem Services, University of Tokyo. This work was partially supported by the Iowa Soybean Association (AS and AKS); the Plant Sciences Institute (BG, AKS, and SS); the Bayer Chair in Soybean Breeding (AKS); the R.F. Baker Center for Plant Breeding (AKS); the USDA National Institute of Food and Agriculture (NIFA) Food and Agriculture Cyberinformatics Tools (FACT) (award 2019-67021-29938) (AS, BG, SS, AKS, and NM); the NSF (S&CC-1952045) (AKS and SS); the USDA-CRIS (IOW04714) project (AKS and AS); the NSF (DBI-1265383) and (DBI-1743442) CyVerse (TS, NM); and the USDA NIFA (awards 2020-67021-31528 and 2020-68013-30934) (BG). This work was also supported by the CREST Program (JPMJCR1512) and the SICORP Program (JPMJSC16H2) (WG) of the Japan Science and Technology Agency, Japan.


  1. Public Law 112–95, 2012,
  2. D. P. Singh, A. K. Singh, and A. Singh, Plant Breeding and Cultivar Development, Academic Press, 2021.
  3. C. Xie and C. Yang, “A review on plant high-throughput phenotyping traits using UAV-based sensors,” Computers and Electronics in Agriculture, vol. 178, p. 105731, 2020. View at: Publisher Site | Google Scholar
  4. G. Yang, J. Liu, C. Zhao et al., “Unmanned aerial vehicle remote sensing for field-based crop phenotyping: current status and perspectives,” Frontiers in Plant Science, vol. 8, p. 1111, 2017. View at: Publisher Site | Google Scholar
  5. A. G. Korchenko and O. S. Illyash, “The generalized classification of unmanned air vehicles,” in 2013 IEEE 2nd International Conference Actual Problems of Unmanned Air Vehicles Developments Proceedings (APUAVD), pp. 28–34, Kiev, Ukraine, 2013. View at: Publisher Site | Google Scholar
  6. S. Sankaran, L. R. Khot, C. Z. Espinoza et al., “Low-altitude, high-resolution aerial imaging systems for row and field crop phenotyping: a review,” European Journal of Agronomy, vol. 70, pp. 112–123, 2015. View at: Publisher Site | Google Scholar
  7. Y. Shi, J. A. Thomasson, S. C. Murray et al., “Unmanned aerial vehicles for high-throughput phenotyping and agronomic research,” PLoS One, vol. 11, no. 7, article e0159781, 2016. View at: Publisher Site | Google Scholar
  8. Yamaha Motor Co, LTD, Yamaha Motor History,
  9. B. Galkin, J. Kibilda, and L. A. DaSilva, “UAVs as mobile infrastructure: addressing battery lifetime,” IEEE Communications Magazine, vol. 57, no. 6, pp. 132–137, 2019. View at: Publisher Site | Google Scholar
  10. D. Lee, J. Zhou, and W. T. Lin, “Autonomous battery swapping system for quadcopter,” in 2015 International Conference on Unmanned Aircraft Systems (ICUAS), pp. 118–124, Denver, CO, USA, 2015. View at: Publisher Site | Google Scholar
  11. T. Campi, S. Cruciani, and M. Feliziani, “Wireless power transfer technology applied to an autonomous electric UAV with a small secondary coil,” Energies, vol. 11, no. 2, p. 352, 2018. View at: Publisher Site | Google Scholar
  12. Skycharge - high power drone charging pad and infrastructure,
  13. Agronomic field trials,
  14. L. Tresch, Y. Mu, A. Itoh et al., “Easy MPE: extraction of quality microplot images for UAV-based high-throughput field phenotyping,” Plant Phenomics, vol. 2019, article 2591849, pp. 1–9, 2019. View at: Publisher Site | Google Scholar
  15. F. I. Matias, M. V. Caraza-Harter, and J. B. Endelman, “FIELDimageR: an R package to analyze orthomosaic images from agricultural field trials,” The Plant Phenome Journal, vol. 3, no. 1, 2020. View at: Publisher Site | Google Scholar
  16. UAVPP, 2020,
  17. O. Özyeşil, V. Voroninski, R. Basri, and A. Singer, “A survey of structure from motion,” Acta Numerica, vol. 26, pp. 305–364, 2017. View at: Google Scholar
  18. S. Wu, W. Wen, B. Xiao et al., “An accurate skeleton extraction approach from 3D point clouds of maize plants,” Frontiers in Plant Science, vol. 10, p. 248, 2019. View at: Publisher Site | Google Scholar
  19. R. Retkute, A. J. Townsend, E. H. Murchie, O. E. Jensen, and S. P. Preston, “Three-dimensional plant architecture and sunlit–shaded patterns: a stochastic model of light dynamics in canopies,” Annals of Botany, vol. 122, no. 2, pp. 291–302, 2018. View at: Publisher Site | Google Scholar
  20. P. Wilkes, A. Lau, M. Disney et al., “Data acquisition considerations for terrestrial laser scanning of forest plots,” Remote Sensing of Environment, vol. 196, pp. 140–153, 2017. View at: Publisher Site | Google Scholar
  21. T. T. Santos and G. C. Rodrigues, “Flexible three-dimensional modeling of plants using low- resolution cameras and visual odometry,” Machine Vision and Applications, vol. 27, no. 5, pp. 695–707, 2016. View at: Publisher Site | Google Scholar
  22. J. Hackenberg, H. Spiecker, K. Calders, M. Disney, and P. Raumonen, “SimpleTree—an efficient open source tool to build tree models from TLS clouds,” For Trees Livelihoods, vol. 6, pp. 4245–4294, 2015. View at: Google Scholar
  23. S. P. Bemis, S. Micklethwaite, D. Turner et al., “Ground-based and UAV-based photogrammetry: a multi-scale, high-resolution mapping tool for structural geology and paleoseismology,” Journal of Structural Geology, vol. 69, pp. 163–178, 2014. View at: Publisher Site | Google Scholar
24. Unmanned Aircraft Systems (UAS), 2020.
25. Civil Aviation Bureau: Japan’s safety rules on unmanned aircraft (UA)/drones - MLIT Ministry of Land, Infrastructure, Transport and Tourism, 2020.
26. A. Polat, This map shows you the drone laws for every country in the world (updated regularly), 2017.
27. K. Watanabe, W. Guo, K. Arai et al., “High-throughput phenotyping of Sorghum plant height using an unmanned aerial vehicle and its application to genomic prediction modeling,” Frontiers in Plant Science, vol. 8, p. 421, 2017.
28. F. Iqbal, A. Lucieer, K. Barry, and R. Wells, “Poppy crop height and capsule volume estimation from a single UAS flight,” Remote Sensing, vol. 9, no. 7, p. 647, 2017.
29. J. Torres-Sánchez, F. López-Granados, N. Serrano, O. Arquero, and J. M. Peña, “High-throughput 3-D monitoring of agricultural-tree plantations with unmanned aerial vehicle (UAV) technology,” PLoS One, vol. 10, article e0130479, 2015.
30. X. Wang, D. Singh, S. Marla, G. Morris, and J. Poland, “Field-based high-throughput phenotyping of plant height in sorghum using different sensing technologies,” Plant Methods, vol. 14, p. 53, 2018.
31. F. Gnädinger and U. Schmidhalter, “Digital counts of maize plants by unmanned aerial vehicles (UAVs),” Remote Sensing, vol. 9, p. 544, 2017.
32. M. Schirrmann, A. Hamdorf, A. Garz, A. Ustyuzhanin, and K.-H. Dammer, “Estimating wheat biomass by combining image clustering with crop height,” Computers and Electronics in Agriculture, vol. 121, pp. 374–384, 2016.
33. X. Jin, S. Liu, F. Baret, M. Hemerlé, and A. Comar, “Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery,” Remote Sensing of Environment, vol. 198, pp. 105–114, 2017.
34. S. Madec, F. Baret, B. de Solan et al., “High-throughput phenotyping of plant height: comparing unmanned aerial vehicles and ground LiDAR estimates,” Frontiers in Plant Science, vol. 8, p. 2002, 2017.
35. J. Senthilnath, A. Dokania, M. Kandukuri, K. N. Ramesh, G. Anand, and S. N. Omkar, “Detection of tomatoes using spectral-spatial methods in remotely sensed RGB images captured by UAV,” Biosystems Engineering, vol. 146, pp. 16–32, 2016.
36. A. Chang, J. Jung, M. M. Maeda, and J. Landivar, “Crop height monitoring with digital imagery from unmanned aerial system (UAS),” Computers and Electronics in Agriculture, vol. 141, pp. 232–237, 2017.
37. U. Lussem, J. Hollberg, J. Menne, J. Schellberg, and G. Bareth, “Using calibrated RGB imagery from low-cost UAVs for grassland monitoring: case study at the Rengen Grassland Experiment (RGE), Germany,” The International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 42, p. 229, 2017.
38. X. Han, J. A. Thomasson, G. C. Bagnall et al., “Measurement and calibration of plant-height from fixed-wing UAV images,” Sensors, vol. 18, no. 12, p. 4092, 2018.
39. P. Hu, S. C. Chapman, X. Wang et al., “Estimation of plant height using a high throughput phenotyping platform based on unmanned aerial vehicle and self-calibration: example for sorghum breeding,” European Journal of Agronomy, vol. 95, pp. 24–32, 2018.
40. F. H. Holman, A. B. Riche, A. Michalski, M. Castle, M. J. Wooster, and M. J. Hawkesford, “High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing,” Remote Sensing, vol. 8, p. 1031, 2016.
41. R. A. Díaz-Varela, R. De la Rosa, L. León, and P. J. Zarco-Tejada, “High-resolution airborne UAV imagery to assess olive tree crown parameters using 3D photo reconstruction: application in breeding trials,” Remote Sensing, vol. 7, pp. 4213–4232, 2015.
42. P. Lottes, R. Khanna, J. Pfeifer, R. Siegwart, and C. Stachniss, “UAV-based crop and weed classification for smart farming,” in 2017 IEEE International Conference on Robotics and Automation (ICRA), pp. 3024–3031, Singapore, 2017.
43. J. Ribera, F. He, Y. Chen, A. F. Habib, and E. J. Delp, “Estimating phenotypic traits from UAV based RGB imagery,” 2018.
44. B. E. McNeil, J. Pisek, H. Lepisk, and E. A. Flamenco, “Measuring leaf angle distribution in broadleaf canopies using UAVs,” Agricultural and Forest Meteorology, vol. 218-219, pp. 204–208, 2016.
45. W. van Iersel, M. Straatsma, E. Addink, and H. Middelkoop, “Monitoring height and greenness of non-woody floodplain vegetation with UAV time series,” ISPRS Journal of Photogrammetry and Remote Sensing, vol. 141, pp. 112–123, 2018.
46. H. G. Jones and R. A. Vaughan, Remote sensing of vegetation: principles, techniques, and applications, Oxford University Press, 2010.
47. S. Seager, E. L. Turner, J. Schafer, and E. B. Ford, “Vegetation’s red edge: a possible spectroscopic biosignature of extraterrestrial plants,” Astrobiology, vol. 5, no. 3, pp. 372–390, 2005.
48. K. Nagasubramanian, S. Jones, S. Sarkar, A. K. Singh, A. Singh, and B. Ganapathysubramanian, “Hyperspectral band selection using genetic algorithm and support vector machines for early identification of charcoal rot disease in soybean stems,” Plant Methods, vol. 14, p. 86, 2018.
49. K. Nagasubramanian, S. Jones, A. K. Singh, S. Sarkar, A. Singh, and B. Ganapathysubramanian, “Plant disease identification using explainable 3D deep learning on hyperspectral images,” Plant Methods, vol. 15, p. 98, 2019.
50. M. R. Krause, L. González-Pérez, J. Crossa et al., “Hyperspectral reflectance-derived relationship matrices for genomic prediction of grain yield in wheat,” G3: Genes, Genomes, Genetics, vol. 9, no. 4, pp. 1231–1247, 2019.
51. D. Constantin, M. Rehak, Y. Akhtman, and F. Liebisch, “Hyperspectral remote sensing of crop properties with unmanned aerial vehicles,” in 9th EARSeL SIG Imaging Spectroscopy Workshop, 2015.
52. J. Gao, D. Nuyttens, P. Lootens, Y. He, and J. G. Pieters, “Recognising weeds in a maize crop using a random forest machine-learning algorithm and near-infrared snapshot mosaic hyperspectral imagery,” Biosystems Engineering, vol. 170, pp. 39–50, 2018.
53. K. R. Thorp, G. Wang, K. F. Bronson, M. Badaruddin, and J. Mon, “Hyperspectral data mining to identify relevant canopy spectral features for estimating durum wheat growth, nitrogen status, and grain yield,” Computers and Electronics in Agriculture, vol. 136, pp. 1–12, 2017.
54. T. J. Nigon, D. J. Mulla, C. J. Rosen et al., “Hyperspectral aerial imagery for detecting nitrogen stress in two potato cultivars,” Computers and Electronics in Agriculture, vol. 112, pp. 36–46, 2015.
55. A. Burkart, H. Aasen, L. Alonso, G. Menz, G. Bareth, and U. Rascher, “Angular dependency of hyperspectral measurements over wheat characterized by a novel UAV based goniometer,” Remote Sensing, vol. 7, pp. 725–746, 2015.
56. A. Capolupo, L. Kooistra, C. Berendonk, L. Boccia, and J. Suomalainen, “Estimating plant traits of grasslands from UAV-acquired hyperspectral images: a comparison of statistical approaches,” ISPRS International Journal of Geo-Information, vol. 4, pp. 2792–2820, 2015.
57. T. Sankey, J. Donager, J. McVay, and J. B. Sankey, “UAV lidar and hyperspectral fusion for forest monitoring in the southwestern USA,” Remote Sensing of Environment, vol. 195, pp. 30–43, 2017.
58. V. Sagan, M. Maimaitijiang, P. Sidike et al., “UAV-based high resolution thermal imaging for vegetation monitoring, and plant phenotyping using ICI 8640 P, FLIR Vue Pro R 640, and thermomap cameras,” Remote Sensing, vol. 11, p. 330, 2019.
59. H. Sheng, H. Chao, C. Coopmans, J. Han, M. McKee, and Y. Chen, “Low-cost UAV-based thermal infrared remote sensing: platform, calibration and applications,” in Proceedings of 2010 IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications, QingDao, China, 2010.
60. P. L. Raeva, J. Šedina, and A. Dlesk, “Monitoring of crop fields using multispectral and thermal imagery from UAV,” European Journal of Remote Sensing, vol. 52, pp. 192–201, 2019.
61. R. Ludovisi, F. Tauro, R. Salvati, S. Khoury, G. M. Scarascia, and A. Harfouche, “UAV-based thermal imaging for high-throughput field phenotyping of black poplar response to drought,” Frontiers in Plant Science, vol. 8, 2017.
62. J. Sofonia, Y. Shendryk, S. Phinn, C. Roelfsema, F. Kendoul, and D. Skocaj, “Monitoring sugarcane growth response to varying nitrogen application rates: a comparison of UAV SLAM LiDAR and photogrammetry,” International Journal of Applied Earth Observation and Geoinformation, vol. 82, p. 101878, 2019.
63. Y.-C. Lin and A. Habib, “Quality control and crop characterization framework for multi-temporal UAV LiDAR data over mechanized agricultural fields,” Remote Sensing of Environment, vol. 256, p. 112299, 2021.
64. M. P. Christiansen, M. S. Laursen, R. N. Jørgensen, S. Skovsen, and R. Gislum, “Designing and testing a UAV mapping system for agricultural field surveying,” Sensors, vol. 17, no. 12, p. 2703, 2017.
65. Y. Shendryk, J. Sofonia, R. Garrard, Y. Rist, D. Skocaj, and P. Thorburn, “Fine-scale prediction of biomass and leaf nitrogen content in sugarcane using UAV LiDAR and multispectral imaging,” International Journal of Applied Earth Observation and Geoinformation, vol. 92, p. 102177, 2020.
66. L. Zhou, X. Gu, S. Cheng, G. Yang, M. Shu, and Q. Sun, “Analysis of plant height changes of lodged maize using UAV-LiDAR data,” Agriculture, vol. 10, p. 146, 2020.
67. T. Hu, X. Sun, Y. Su et al., “Development and performance evaluation of a very low-cost UAV-Lidar system for forestry applications,” Remote Sensing, vol. 13, p. 77, 2020.
68. S. Jin, X. Sun, F. Wu et al., “Lidar sheds new light on plant phenomics for plant breeding and management: recent advances and future prospects,” ISPRS Journal of Photogrammetry and Remote Sensing, vol. 171, pp. 202–223, 2021.
69. J. Torres-Sánchez, F. López-Granados, A. I. De Castro, and J. M. Peña-Barragán, “Configuration and specifications of an unmanned aerial vehicle (UAV) for early site specific weed management,” PLoS One, vol. 8, article e58210, 2013.
70. I. Sa, M. Popović, R. Khanna et al., “WeedMap: a large-scale semantic weed mapping framework using aerial multispectral imaging and deep neural network for precision farming,” Remote Sensing, vol. 10, p. 1423, 2018.
71. C. Yang, “A high-resolution airborne four-camera imaging system for agricultural remote sensing,” Computers and Electronics in Agriculture, vol. 88, pp. 13–24, 2012.
72. SfM, 2020.
73. G. Forlani, E. Dall’Asta, F. Diotri, U. M. di Cella, R. Roncella, and M. Santise, “Quality assessment of DSMs produced from UAV flights georeferenced with on-board RTK positioning,” Remote Sensing, vol. 10, no. 2, p. 311, 2018.
74. T. Tonkin and N. Midgley, “Ground-control networks for image based surface reconstruction: an investigation of optimum survey designs using UAV derived imagery and structure-from-motion photogrammetry,” Remote Sensing, vol. 8, p. 786, 2016.
75. J. Wang, Y. Ge, G. B. M. Heuvelink, C. Zhou, and D. Brus, “Effect of the sampling design of ground control points on the geometric correction of remotely sensed imagery,” International Journal of Applied Earth Observation and Geoinformation, vol. 18, pp. 91–100, 2012.
76. A. A. Hearst and K. A. Cherkauer, “Research article: extraction of small spatial plots from geo-registered UAS imagery of crop fields,” Environmental Practice, vol. 17, pp. 178–187, 2015.
77. L. Roth, A. Hund, and H. Aasen, “PhenoFly Planning Tool: flight planning for high-resolution optical remote sensing with unmanned areal systems,” Plant Methods, vol. 14, no. 1, 2018.
78. V.-E. Oniga, A.-I. Breaban, and F. Statescu, “Determining the optimum number of ground control points for obtaining high precision results based on UAS images,” Proceedings, vol. 2, no. 7, p. 352, 2018.
79. F.-J. Mesas-Carrascosa, J. Torres-Sánchez, I. Clavero-Rumbao et al., “Assessing optimal flight parameters for generating accurate multispectral orthomosaicks by UAV to support site-specific crop management,” Remote Sensing, vol. 7, pp. 12793–12814, 2015.
80. M. G. Ziliani, S. D. Parkes, I. Hoteit, and M. F. McCabe, “Intra-season crop height variability at commercial farm scales using a fixed-wing UAV,” Remote Sensing, vol. 10, p. 2007, 2018.
81. Ground control points for drone surveys & mapping, 2020.
82. L. Roth, PhenoFly Planning Tool, 2020.
83. Tower, GitHub.
84. Altizure, GitHub.
85. Autopilot for DJI drones, 2020.
86. DJI GS Pro, 2020.
87. Data capture platform for drones & UAVs, 2020.
88. Drone mapping app.
89. eMotion-senseFly, 2017.
90. Intel® Mission Control Software, 2020.
91. Litchi for DJI Mavic / Phantom / Inspire / Spark, 2020.
92. Maps made easy, Aerial map processing & hosting, 2020.
93. mdCockpit app, 2020.
94. Mission planner home—mission planner documentation, 2020.
95. Pix4Dcapture: free drone flight planning mobile app, 2020.
96. QGC-QGroundControl-drone control, 2020.
97. SPH engineering / UgCS, Leading drone control software, 2020.
98. J. Torres-Sánchez, F. López-Granados, I. Borra-Serrano, and J. M. Peña, “Assessing UAV-collected image overlap influence on computation time and digital surface model accuracy in olive orchards,” Precision Agriculture, vol. 19, pp. 115–133, 2018.
99. J. Besada, L. Bergesio, I. Campaña et al., “Drone mission definition and implementation for automated infrastructure inspection using airborne sensors,” Sensors, vol. 18, p. 1170, 2018.
100. F. J. Mesas-Carrascosa, I. Clavero Rumbao, J. Torres-Sánchez, A. García-Ferrer, J. M. Peña, and G. F. López, “Accurate ortho-mosaicked six-band multispectral UAV images as affected by mission planning for precision agriculture proposes,” International Journal of Remote Sensing, vol. 38, pp. 2161–2176, 2017.
101. How to verify that there is enough overlap between the images, 2020.
102. A. Itoh and W. Guo, Mission planner leaflet.
103. J. Zhang, H. S. Naik, T. Assefa et al., “Computer vision and machine learning for robust phenotyping in genome-wide studies,” Scientific Reports, vol. 7, p. 44048, 2017.
104. H. S. Naik, J. Zhang, A. Lofquist et al., “A real-time phenotyping framework using machine learning for plant stress severity rating in soybean,” Plant Methods, vol. 13, p. 23, 2017.
105. M. S. El-Faki, N. Zhang, and D. E. Peterson, “Factors affecting color-based weed detection,” Transactions of the ASAE, vol. 43, pp. 1001–1009, 2000.
106. Y.-C. Chang and J. F. Reid, “RGB calibration for color image analysis in machine vision,” IEEE Transactions on Image Processing, vol. 5, pp. 1414–1422, 1996.
107. J. Orava, T. Jaaskelainen, and J. Parkkinen, “Color errors of digital cameras,” Color Research and Application, vol. 29, pp. 217–221, 2004.
108. S. Anaokar and M. Moeck, “Validation of high dynamic range imaging to luminance measurement,” Leukos, vol. 2, pp. 133–144, 2005.
109. M. N. Inanici, “Evaluation of high dynamic range photography as a luminance data acquisition system,” Lighting Research and Technology, vol. 38, pp. 123–134, 2006.
110. H. Aasen, A. Burkart, A. Bolten, and G. Bareth, “Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: from camera calibration to quality assurance,” ISPRS Journal of Photogrammetry and Remote Sensing, vol. 108, pp. 245–259, 2015.
111. J. Pritsolas, R. Pearson, J. Connor, and P. Kyveryga, “Challenges and successes when generating in-season multi-temporal calibrated aerial imagery,” in 13th International Conference on Precision Agriculture, pp. 1–15, St. Louis, MO, USA, 2016.
112. M. Zaman-Allah, O. Vergara, J. L. Araus et al., “Unmanned aerial platform-based multi-spectral imaging for field phenotyping of maize,” Plant Methods, vol. 11, p. 35, 2015.
113. T. Hakala, L. Markelin, E. Honkavaara et al., “Direct reflectance measurements from drones: sensor absolute radiometric calibration and system tests for forest reflectance characterization,” Sensors, vol. 18, 2018.
114. H. Aasen, E. Honkavaara, A. Lucieer, and P. Zarco-Tejada, “Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: a review of sensor technology, measurement procedures, and data correction workflows,” Remote Sensing, vol. 10, p. 1091, 2018.
115. F. Iqbal, A. Lucieer, and K. Barry, “Simplified radiometric calibration for UAS-mounted multispectral sensor,” European Journal of Remote Sensing, vol. 51, pp. 301–313, 2018.
116. T. Miura and A. R. Huete, “Performance of three reflectance calibration methods for airborne hyperspectral spectrometer data,” Sensors, vol. 9, pp. 794–813, 2009.
117. A. Bauer, A. G. Bostrom, J. Ball et al., “Combining computer vision and deep learning to enable ultra-scale aerial phenotyping and precision agriculture: a case study of lettuce production,” Horticulture Research, vol. 6, p. 70, 2019.
118. N. Merchant, E. Lyons, S. Goff et al., “The iPlant collaborative: cyberinfrastructure for enabling data to discovery for the life sciences,” PLoS Biology, vol. 14, article e1002342, 2016.
119. J. Towns, T. Cockerill, M. Dahan et al., “XSEDE: accelerating scientific discovery,” Computing in Science & Engineering, vol. 16, pp. 62–74, 2014.
120. M. Altunay, P. Avery, K. Blackburn et al., “A science driven production cyberinfrastructure—the open science grid,” International Journal of Grid and Utility Computing, vol. 9, pp. 201–218, 2011.
121. mhopkins-msft, Optimize costs by automating Azure Blob Storage access tiers, 2020.
122. Object lifecycle management, 2020.
123. Object lifecycle management, 2020.
124. Drone mapping software - OpenDroneMap, 2020.
125. senseFly - senseFly – the professional’s mapping drone, 2020.
126. Pix4Dfields: drone software for agriculture mapping, 2020.
127. 3Dflow - computer vision specialists - home of 3DF Zephyr, 2020.
128. Drone2Map, 2020.
129. Drone mapping software, 2020.
130. Digital farming solutions, 2020.
131. FlytBase: enterprise drone automation platform, 2020.
132. Agisoft Metashape, 2020.
133. One drone cloud, 2020.
134. Open Aerial Map, 2020.
135. OpenSfM, 2020.
136. Professional photogrammetry and drone mapping software, 2020.
137. Geospatial data analytics for the enterprise, 2020.
138. DroneMapper, 2020.
139. Welcome to Skycatch, 2020.
140. abeMeda, 2020.
141. NeoFinder, 2020.
142. P. B. Heidorn, “Shedding light on the dark data in the long tail of science,” Library Trends, vol. 57, pp. 280–299, 2008.
143. R. O. Obe and L. S. Hsu, PostGIS in action, Manning, Greenwich, CT, USA, 2011.
144. K. Banker, MongoDB in action, Manning Publications Co., Greenwich, CT, USA, 2011.
145. K. Chodorow, MongoDB: the definitive guide: powerful and scalable data storage, O’Reilly Media, Inc., 2013.
146. C. Gormley and Z. Tong, Elasticsearch: the definitive guide: a distributed real-time search and analytics engine, O’Reilly Media, Inc., 2015.
147. T. Z. Jubery, J. Shook, K. Parmley et al., “Deploying Fourier coefficients to unravel soybean canopy diversity,” Frontiers in Plant Science, vol. 7, p. 2066, 2017.
148. M. D. Wilkinson, M. Dumontier, I. J. Aalbersberg et al., “The FAIR guiding principles for scientific data management and stewardship,” Scientific Data, vol. 3, no. 1, article 160018, 2016.
149. P. Neveu, A. Tireau, N. Hilgert et al., “Dealing with multi-source and multi-scale information in plant phenomics: the ontology-driven phenotyping hybrid information system,” The New Phytologist, vol. 221, pp. 588–601, 2019.
150. J. Wyngaard, L. Barbieri, A. Thomer et al., “Emergent challenges for science sUAS data management: fairness through community engagement and best practices development,” Remote Sensing, vol. 11, p. 1797, 2019.
151. UXS DWG, 2020.
152. K. Janowicz, A. Haller, S. J. D. Cox, D. Le Phuoc, and M. Lefrancois, “SOSA: a lightweight ontology for sensors, observations, samples, and actuators,” 2018.
153. Dronetology, the UAV Ontology, 2020.
154. G. K. Sandve, A. Nekrutenko, J. Taylor, and E. Hovig, “Ten simple rules for reproducible computational research,” PLoS Computational Biology, vol. 9, article e1003285, 2013.
155. J. C. Daniel, Data Science at Scale with Python and Dask, Manning Publications, 2019.
156. M. Albrecht, P. Donnelly, P. Bui, and D. Thain, “Makeflow: a portable abstraction for data intensive computing on clusters, clouds, and grids,” in Proceedings of the 1st ACM SIGMOD Workshop on Scalable Workflow Execution Engines and Technologies, pp. 1–13, New York, NY, USA, 2012.
157. K. Parmley, K. Nagasubramanian, and S. Sarkar, “Development of optimized phenomic predictors for efficient plant breeding decisions using phenomic-assisted selection in soybean,” Plant Phenomics, vol. 2019, pp. 1–15, 2019.
158. K. A. Parmley, R. H. Higgins, B. Ganapathysubramanian, S. Sarkar, and A. K. Singh, “Machine learning approach for prescriptive plant breeding,” Scientific Reports, vol. 9, p. 17132, 2019.
159. J. Shook, T. Gangopadhyay, and L. Wu, “Crop yield prediction integrating genotype and weather variables using deep learning,” 2020.
160. L. G. Riera, M. E. Carroll, Z. Zhang et al., “Deep multi-view image fusion for soybean yield estimation in breeding applications,” 2020.
161. J. Gago, A. R. Fernie, Z. Nikoloski et al., “Integrative field scale phenotyping for investigating metabolic components of water stress within a vineyard,” Plant Methods, vol. 13, p. 90, 2017.
162. A. Singh, B. Ganapathysubramanian, A. K. Singh, and S. Sarkar, “Machine learning for high-throughput stress phenotyping in plants,” Trends in Plant Science, vol. 21, pp. 110–124, 2016.
163. S. Ghosal, D. Blystone, A. K. Singh, B. Ganapathysubramanian, A. Singh, and S. Sarkar, “An explainable deep machine vision framework for plant stress phenotyping,” Proceedings of the National Academy of Sciences of the United States of America, vol. 115, pp. 4613–4618, 2018.
164. A. Singh, S. Jones, B. Ganapathysubramanian et al., “Challenges and opportunities in machine-augmented plant stress phenotyping,” Trends in Plant Science, vol. 26, pp. 53–69, 2021.
165. A. Akintayo, G. L. Tylka, A. K. Singh, B. Ganapathysubramanian, A. Singh, and S. Sarkar, “A deep learning framework to discern and count microscopic nematode eggs,” Scientific Reports, vol. 8, p. 9145, 2018.
166. S. Ghosal, B. Zheng, S. C. Chapman et al., “A weakly supervised deep learning framework for sorghum head detection and counting,” Plant Phenomics, vol. 2019, pp. 1–14, 2019.
167. K. G. Falk, T. Z. Jubery, J. A. O’Rourke, and A. Singh, “Soybean root system architecture trait study through genotypic, phenotypic, and shape-based clusters,” Plant Phenomics, vol. 2020, pp. 1–23, 2020.
168. K. G. Falk, T. Z. Jubery, S. V. Mirnezami et al., “Computer vision and machine learning enabled soybean root phenotyping pipeline,” Plant Methods, vol. 16, p. 5, 2020.
169. T. Z. Jubery, C. N. Carley, A. Singh, S. Sarkar, B. Ganapathysubramanian, and A. K. Singh, “Using machine learning to develop a fully automated soybean nodule acquisition pipeline (SNAP),” bioRxiv, 2020.
170. M. P. Pound, J. A. Atkinson, A. J. Townsend et al., “Deep machine learning provides state-of-the-art performance in image-based plant phenotyping,” Gigascience, vol. 6, pp. 1–10, 2017.
171. J. R. Ubbens and I. Stavness, “Deep plant phenomics: a deep learning platform for complex plant phenotyping tasks,” Frontiers in Plant Science, vol. 8, 2017.
172. M. Romero, Y. Luo, B. Su, and S. Fuentes, “Vineyard water status estimation using multispectral imagery from an UAV platform and machine learning algorithms for irrigation scheduling management,” Computers and Electronics in Agriculture, vol. 147, pp. 109–117, 2018.
173. J. Yue, H. Feng, G. Yang, and Z. Li, “A comparison of regression techniques for estimation of above-ground winter wheat biomass using near-surface spectroscopy,” Remote Sensing, vol. 10, p. 66, 2018.
174. R. Makanza, M. Zaman-Allah, J. Cairns et al., “High-throughput phenotyping of canopy cover and senescence in maize field trials using aerial digital canopy imaging,” Remote Sensing, vol. 10, p. 330, 2018.
175. B. Li, X. Xu, J. Han et al., “The estimation of crop emergence in potatoes by UAV RGB imagery,” Plant Methods, vol. 15, p. 15, 2019.
176. Y. Ampatzidis and V. Partel, “UAV-based high throughput phenotyping in citrus utilizing multispectral imaging and artificial intelligence,” Remote Sensing, vol. 11, p. 410, 2019.
177. S. Sankaran, J. Zhou, L. R. Khot, J. J. Trapp, E. Mndolwa, and P. N. Miklas, “High-throughput field phenotyping in dry bean using small unmanned aerial vehicle based multispectral imagery,” Computers and Electronics in Agriculture, vol. 151, pp. 84–92, 2018.
178. D. Singh, X. Wang, U. Kumar et al., “High-throughput phenotyping enabled genetic dissection of crop lodging in wheat,” Frontiers in Plant Science, vol. 10, p. 394, 2019.
179. J. Yue, H. Feng, X. Jin et al., “A comparison of crop parameters estimation using images from UAV-mounted snapshot hyperspectral sensor and high-definition digital camera,” Remote Sensing, vol. 10, p. 1138, 2018.
180. M. A. Hassan, M. Yang, A. Rasheed et al., “Time-series multispectral indices from unmanned aerial vehicle imagery reveal senescence rate in bread wheat,” Remote Sensing, vol. 10, p. 809, 2018.
181. K. R. Thorp, A. L. Thompson, S. J. Harders, A. N. French, and R. W. Ward, “High-throughput phenotyping of crop water use efficiency via multispectral drone imagery and a daily soil water balance model,” Remote Sensing, vol. 10, p. 1682, 2018.
182. A. Michez, S. Bauwens, Y. Brostaux et al., “How far can consumer-grade UAV RGB imagery describe crop production? A 3D and multitemporal modeling approach applied to Zea mays,” Remote Sensing, vol. 10, p. 1798, 2018.
183. J. M. Duarte-Carvajalino, D. F. Alzate, A. A. Ramirez, J. D. Santa-Sepulveda, A. E. Fajardo-Rojas, and M. Soto-Suárez, “Evaluating late blight severity in potato crops using unmanned aerial vehicles and machine learning algorithms,” Remote Sensing, vol. 10, p. 1513, 2018.
184. A. Patrick and C. Li, “High throughput phenotyping of blueberry bush morphological traits using unmanned aerial systems,” Remote Sensing, vol. 9, p. 1250, 2017.
185. D. Zhang, X. Zhou, J. Zhang, Y. Lan, C. Xu, and D. Liang, “Detection of rice sheath blight using an unmanned aerial system with high-resolution color and multispectral imaging,” PLoS One, vol. 13, article e0187470, 2018.
186. K. Johansen, M. J. L. Morton, Y. M. Malbeteau et al., “Unmanned aerial vehicle-based phenotyping using morphometric and spectral analysis can quantify responses of wild tomato plants to salinity stress,” Frontiers in Plant Science, vol. 10, p. 370, 2019.
187. J. Yeom, J. Jung, A. Chang, M. Maeda, and J. Landivar, “Automated open cotton boll detection for yield estimation using unmanned aircraft vehicle (UAV) data,” Remote Sensing, vol. 10, p. 1895, 2018.
188. A. I. de Castro, F. J. Mesas-Carrascosa, and J. M. Pena, “Early season weed mapping in sunflower using UAV technology: variability of herbicide treatment maps against weed thresholds,” Precision Agriculture, vol. 17, no. 2, pp. 183–199, 2016.
189. T. Moeckel, S. Dayananda, R. R. Nidamanuri et al., “Estimation of vegetable crop parameter by multi-temporal UAV-borne images,” Remote Sensing, vol. 10, p. 805, 2018.
190. J. Albetis, A. Jacquin, M. Goulard et al., “On the potentiality of UAV multispectral imagery to detect flavescence dorée and grapevine trunk diseases,” Remote Sensing, vol. 11, p. 23, 2018.
191. A. L. Thompson, K. R. Thorp, M. M. Conley et al., “Comparing nadir and multi-angle view sensor technologies for measuring in-field plant height of upland cotton,” Remote Sensing, vol. 11, p. 700, 2019.
192. X. Wang, R. Zhang, W. Song et al., “Dynamic plant height QTL revealed in maize through remote sensing phenotyping using a high-throughput unmanned aerial vehicle (UAV),” Scientific Reports, vol. 9, p. 3458, 2019.
193. N. Wilke, B. Siegmann, L. Klingbeil et al., “Quantifying lodging percentage and lodging severity using a UAV-based canopy height model combined with an objective threshold approach,” Remote Sensing, vol. 11, p. 515, 2019.
194. T. Liu, R. Li, X. Jin et al., “Evaluation of seed emergence uniformity of mechanically sown wheat with UAV RGB imagery,” Remote Sensing, vol. 9, p. 1241, 2017.
195. Z. Khan, J. Chopin, J. Cai, V.-R. Eichi, S. Haefele, and S. J. Miklavcic, “Quantitative estimation of wheat phenotyping traits using ground and aerial imagery,” Remote Sensing, vol. 10, p. 950, 2018.
196. M. A. Hassan, M. Yang, L. Fu et al., “Accuracy assessment of plant height using an unmanned aerial vehicle for quantitative genomic analysis in bread wheat,” Plant Methods, vol. 15, p. 37, 2019.
197. L. Wan, Y. Li, H. Cen et al., “Combining UAV-based vegetation indices and image classification to estimate flower number in oilseed rape,” Remote Sensing, vol. 10, p. 1484, 2018.
198. E. C. Tetila, B. B. Machado, N. A. de Souza Belete, D. A. Guimaraes, and H. Pistori, “Identification of soybean foliar diseases using unmanned aerial vehicle images,” IEEE Geoscience and Remote Sensing Letters, vol. 14, pp. 2190–2194, 2017.
199. K. Johansen, T. Raharjo, and M. F. McCabe, “Using multi-spectral UAV imagery to extract tree crop structural properties and assess pruning effects,” Remote Sensing, vol. 10, p. 854, 2018.
200. L. Han, G. Yang, H. Feng et al., “Quantitative identification of maize lodging-causing feature factors using unmanned aerial vehicle images and a nomogram computation,” Remote Sensing, vol. 10, p. 1528, 2018.
  201. S. Joalland, C. Screpanti, H. V. Varella et al., “Aerial and ground based sensing of tolerance to beet cyst nematode in sugar beet,” Remote Sensing, vol. 10, p. 787, 2018. View at: Publisher Site | Google Scholar
  202. A. Patrick, S. Pelham, A. Culbreath, C. Corley Holbrook, I. J. de Godoy, and C. Li, “High throughput phenotyping of tomato spot wilt disease in peanuts using unmanned aerial systems and multispectral imaging,” IEEE Instrumentation and Measurement Magazine, vol. 20, pp. 4–12, 2017. View at: Publisher Site | Google Scholar
  203. R. Xu, C. Li, and A. H. Paterson, “Multispectral imaging and unmanned aerial systems for cotton plant phenotyping,” PLoS One, vol. 14, article e0205083, 2019. View at: Publisher Site | Google Scholar
  204. X. Yao, N. Wang, Y. Liu et al., “Estimation of wheat LAI at middle to high levels using unmanned aerial vehicle narrowband multispectral imagery,” Remote Sensing, vol. 9, p. 1304, 2017. View at: Publisher Site | Google Scholar
  205. J. G. Ha, H. Moon, J. T. Kwak et al., “Deep convolutional neural network for classifying Fusarium wilt of radish from unmanned aerial vehicles,” Journal of Applied Remote Sensing, vol. 11, p. 1, 2017. View at: Publisher Site | Google Scholar
  206. A. K. Singh, B. Ganapathysubramanian, S. Sarkar, and A. Singh, “Deep learning for plant stress phenotyping: trends and future perspectives,” Trends in Plant Science, vol. 23, no. 10, pp. 883–898, 2018. View at: Publisher Site | Google Scholar
  207. B. Arad, R. Timofte, O. Ben-Shahar, Y.-T. Lin, and G. D. Finlayson, “NTIRE 2020 challenge on spectral reconstruction from an RGB image,” 2020. View at: Google Scholar
  208. M. Shoeiby, A. Robles-Kelly, R. Timofte et al., “PIRM2018 challenge on spectral image super-resolution: methods and results,” in Proceedings of the European Conference on Computer Vision (ECCV) Workshops, Munich, Germany, 2018. View at: Google Scholar
  209. M. Zhang, S. Li, F. Yu, and X. Tian, “Image fusion employing adaptive spectral-spatial gradient sparse regularization in UAV remote sensing,” Signal Processing, vol. 170, p. 107434, 2020. View at: Publisher Site | Google Scholar
  210. A. Haghighattalab, L. G. Pérez, S. Mondal et al., “Application of unmanned aerial systems for high throughput phenotyping of large wheat breeding nurseries,” Plant Methods, vol. 12, 2016. View at: Publisher Site | Google Scholar
  211. S. Brocks, J. Bendig, and G. Bareth, “Toward an automated low-cost three-dimensional crop surface monitoring system using oblique stereo imagery from consumer-grade smart cameras,” Journal of Applied Remote Sensing, vol. 10, article 046021, 2016. View at: Publisher Site | Google Scholar
  212. T. Gao, H. Emadi, H. Saha et al., “A novel multirobot system for plant phenotyping,” Robotics, vol. 7, p. 61, 2018. View at: Publisher Site | Google Scholar
  213. K. Nagasubramanian, T. Z. Jubery, F. F. Ardakani et al., “How useful is active learning for image-based plant phenotyping?” 2020. View at: Google Scholar
  214. G. Chmaj and H. Selvaraj, “Distributed processing applications for UAV/drones: a survey,” in Progress in Systems Engineering, pp. 449–454, Springer International Publishing, 2015. View at: Publisher Site | Google Scholar
  215. M. Campion, P. Ranganathan, and S. Faruque, “UAV swarm communication and control architectures: a review,” Journal of Unmanned Vehicle Systems, vol. 7, pp. 93–106, 2019. View at: Publisher Site | Google Scholar
  216. OSU’s USRI receives first FAA authorization to fly unmanned aircraft in swarms, December 2020.

Copyright © 2021 Wei Guo et al. Exclusive Licensee Nanjing Agricultural University. Distributed under a Creative Commons Attribution License (CC BY 4.0).
