An Automated Machine Learning Framework in Unmanned Aircraft Systems: New Insights into Agricultural Management Practices Recognition Approaches

Kai-Yun Li, Niall Burnside, Raul Sampaio de Lima, Miguel Villoslada, Karli Sepp, Victor Henrique Cabral Pinheiro, Ming-Der Yang, Ants Vain, Kalev Sepp

Research output: Contribution to journal › Article › peer-review


The recent trend of automated machine learning (AutoML) has driven significant technological innovation in applied artificial intelligence through automated algorithm selection and hyperparameter optimization of deployable pipeline models for solving substantive problems. However, a knowledge gap remains in integrating AutoML technology with unmanned aircraft systems (UAS) for image-based data classification tasks. We therefore employed a state-of-the-art (SOTA), fully open-source AutoML framework, Auto-sklearn, built on one of the most widely used ML libraries, Scikit-learn, and combined it with two novel AutoML visualization tools to focus on the recognition of agricultural management practices (AMP) from UAS-derived multispectral vegetation index (VI) data. The practices examined include soil tillage methods (STM), cultivation methods (CM), and manure application (MA) across four crop-combination fields (i.e., red clover-grass mixture, spring wheat, pea-oat mixture, and spring barley); these practices have not yet been examined efficiently, and accessible parameters for them are absent in UAS applications. We compared AutoML performance against three common machine learning classifiers: random forest (RF), support vector machine (SVM), and artificial neural network (ANN). AutoML achieved the highest overall classification accuracy after 1200 s of computation. RF yielded the second-best classification accuracy, while SVM and ANN proved less capable on some of the datasets. Regarding the classification of AMPs, the best period for data capture was the crop vegetative growth stage (in May). CM yielded the best classification performance, followed by MA and STM.
Our framework provides new insights into plant–environment interactions together with strong classification capabilities. It further illustrates that such an automated system could become an important tool for advancing sustainable smart farming and field-based crop phenotyping research across a diverse range of agricultural environmental assessment and management applications.
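As a rough illustration of the baseline comparison described in the abstract, the sketch below trains the three conventional classifiers (RF, SVM, ANN) on a synthetic stand-in for UAS-derived vegetation-index features using Scikit-learn, the library Auto-sklearn itself builds on. The data, feature counts, and hyperparameters here are illustrative assumptions, not the study's actual configuration; in Auto-sklearn, the equivalent automated search would be launched with `AutoSklearnClassifier(time_left_for_this_task=1200)` to match the paper's 1200 s budget.

```python
# Minimal sketch (illustrative only): compare RF, SVM, and ANN baselines
# on synthetic data standing in for multispectral vegetation-index features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic "VI" feature matrix: 8 index-like features, 3 practice classes.
X, y = make_classification(n_samples=300, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

classifiers = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(kernel="rbf", C=1.0, random_state=0),
    "ANN": MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000,
                         random_state=0),
}

# Fit each classifier and record its held-out overall accuracy.
scores = {name: clf.fit(X_train, y_train).score(X_test, y_test)
          for name, clf in classifiers.items()}
print(scores)
```

The same train/test split would be passed unchanged to the AutoML framework, so that the automated pipeline search and the hand-configured baselines are scored on identical held-out data.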
Original language: English
Article number: 3190
Number of pages: 24
Journal: Remote Sensing
Issue number: 16
Publication status: Published - 12 Aug 2021

Bibliographical note

This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://). Funding Information: This research was funded by the European Regional Development Fund within the Estonian National Programme for Addressing Socio-Economic Challenges through R&D (RITA): L180283PKKK, and by the Doctoral School of Earth Sciences and Ecology, financed by the European Union, European Regional Development Fund (Estonian University of Life Sciences ASTRA project "Value-chain based bio-economy").


  • unmanned aircraft system
  • automated machine learning
  • agricultural management practices
  • image classification
  • precision agriculture
  • variety performance trials
  • crop breeding
  • crop phenotyping
  • agriculture decision-making


