(Reference obtained automatically from the Web of Science, based on the FAPESP funding acknowledgement and the corresponding grant number included in the publication by the authors.)

Under canopy light detection and ranging-based autonomous navigation

Author(s):
Higuti, Vitor A. H. [1] ; Velasquez, Andres E. B. [1] ; Magalhaes, Daniel Varela [1] ; Becker, Marcelo [1] ; Chowdhary, Girish [2]
Total authors: 5
Author affiliation(s):
[1] Univ Sao Paulo, Mech Engn Dept, BR-13566590 Sao Carlos, SP - Brazil
[2] Univ Illinois, CSL, Agr & Biol Engn, Urbana, IL - USA
Total affiliations: 2
Document type: Scientific article
Source: Journal of Field Robotics; v. 36, n. 3, p. 547-567, May 2019.
Web of Science citations: 3
Abstract

This paper describes a light detection and ranging (LiDAR)-based autonomous navigation system for an ultralightweight ground robot in agricultural fields. The system is designed for reliable navigation under cluttered canopies using only a 2D Hokuyo UTM-30LX LiDAR sensor as the single source for perception. Its purpose is to ensure that the robot can navigate through rows of crops without damaging the plants in narrow row-based and high-leaf-cover semistructured crop plantations, such as corn (Zea mays) and sorghum (Sorghum bicolor). The key contribution of our work is a LiDAR-based navigation algorithm capable of rejecting outlying measurements in the point cloud caused by plants in adjacent rows, low-hanging leaf cover, or weeds. The algorithm addresses this challenge using a set of heuristics designed to filter out outlying measurements in a computationally efficient manner, and linear least squares are applied to the filtered data to estimate the within-row distance. Moreover, a crucial step is estimate validation, which is achieved through a heuristic that grades and validates the fitted row lines based on current and previous information. The proposed LiDAR-based perception subsystem has been extensively tested in production/breeding corn and sorghum fields. In such a variety of highly cluttered real field environments, the robot logged more than 6 km of autonomous runs in straight rows. These results demonstrate highly promising advances in LiDAR-based navigation in realistic field environments for small under-canopy robots. (AU)
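To make the abstract's pipeline concrete, the sketch below illustrates the general pattern it describes: heuristic rejection of outlying LiDAR returns, a linear least-squares fit of the surviving points to a row line, and a validation step that grades the fit against the previous estimate. This is a minimal illustration only; the function name, thresholds, frame convention, and validation criteria are assumptions for the example, not the authors' implementation.

```python
import numpy as np

def estimate_row_distance(points, corridor_half_width=0.5,
                          prev_distance=None, max_jump=0.15):
    """Estimate the lateral distance to a crop row from 2D LiDAR returns.

    points: (N, 2) array of (x, y) returns in the robot frame
    (x forward, y toward the row of interest -- an assumed convention).
    Returns (distance_estimate, is_valid).
    """
    # Heuristic outlier rejection: keep only returns inside a lateral
    # corridor around the expected row, discarding points from adjacent
    # rows, low-hanging leaves, or weeds far from the expected line.
    expected = prev_distance if prev_distance is not None else corridor_half_width
    mask = np.abs(points[:, 1] - expected) < corridor_half_width
    inliers = points[mask]
    if len(inliers) < 10:          # too few points to trust a fit
        return prev_distance, False

    # Linear least squares: fit y = a*x + b to the surviving points;
    # the intercept b approximates the within-row lateral distance.
    a, b = np.polyfit(inliers[:, 0], inliers[:, 1], deg=1)

    # Validation heuristic: grade the fit by residual spread and by
    # consistency with the previous estimate; reject implausible jumps.
    residuals = inliers[:, 1] - (a * inliers[:, 0] + b)
    good_fit = residuals.std() < 0.1
    consistent = prev_distance is None or abs(b - prev_distance) < max_jump
    if good_fit and consistent:
        return b, True
    return prev_distance, False
```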

FAPESP Grant: 13/07276-1 - CEPOF - Optics and Photonics Research Center
Grantee: Vanderlei Salvador Bagnato
Support type: Research Grants - Research, Innovation and Dissemination Centers (CEPIDs)
FAPESP Grant: 17/00033-7 - LiDAR-based navigation system for an autonomous mobile ground robot in a corn crop
Grantee: Vitor Akihiro Hisano Higuti
Support type: Scholarships abroad - Research Internship - Master's degree