Gasparino, Mateus V.
Higuti, Vitor A. H.
Velasquez, Andres E. B.
Total Authors: 4
 Univ Illinois, Dept Agr & Biol Engn, Champaign, IL 61820 - USA
 Univ Sao Paulo, Dept Mech Engn, Sao Carlos, SP - Brazil
Total Affiliations: 2
Journal of the Brazilian Society of Mechanical Sciences and Engineering;
OCT 21 2020.
Small robotic vehicles have been navigating agricultural fields in the pursuit of new possibilities to increase agricultural production and to meet the growing demand for food and energy. However, a perception system with reliable awareness of the surroundings remains a challenge for autonomous navigation. Cameras and single-layer laser scanners have been the primary sources of information, yet the former suffers from sensitivity to outdoor lighting and both suffer from occlusion by leaves. This paper describes a three-dimensional acquisition system for corn crops. The sensing core is a single-layer UTM-30LX laser scanner rotating around its axis, while an inertial sensor provides angular measurements. Through the rotation, multiple layers are composed into a 3D point cloud, which is represented by a two-dimensional occupancy grid. Each cell is filled according to the number of readings, and the readings' weights derive from two procedures: first, a mask enhances vertical entities (stalks); second, two Gaussian functions centered on the expected positions of the immediate neighboring rows weaken readings in the middle of the lane and in farther rows. The resulting occupancy grid represents the corn rows as virtual walls, which serve as references for a wall-follower algorithm. According to experimental results, the virtual walls are segmented with reduced influence from straying leaves and sparse weeds compared to segmentation done with single-layer laser scanner data. Indeed, 64.02% of the 3D outputs fall within a 0.05 m error limit from the expected lane width, while only 11.63% of the single-layer laser data fall within the same limit. (AU)
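The row-weighting step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the row spacing (0.75 m, typical for corn) and the Gaussian standard deviation `sigma` are assumed values, and the function considers only the lateral offset of a cell from the robot's centerline.

```python
import math

def row_weight(y, row_spacing=0.75, sigma=0.1):
    """Weight a grid cell by its lateral offset y (m) from the lane center.

    Two Gaussians centered on the expected immediate neighboring rows
    (at +/- row_spacing / 2) keep readings near those rows strong, while
    readings in the middle of the lane and in farther rows are weakened.
    Both row_spacing and sigma are illustrative assumptions.
    """
    left = math.exp(-((y + row_spacing / 2) ** 2) / (2 * sigma ** 2))
    right = math.exp(-((y - row_spacing / 2) ** 2) / (2 * sigma ** 2))
    return left + right
```

A cell sitting on a neighboring row (y = ±0.375 m here) receives a weight near 1, while cells at the lane center or on farther rows receive weights close to 0, which is what lets the virtual walls emerge from the occupancy grid with less interference from stray leaves.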