Feasibility analysis of a system with Computer Vision and Artificial Intelligence in the cloud for monitoring the pest Diatraea saccharalis in sugarcane plantations

Grant number: 24/12238-6
Support Opportunities: Research Grants - Innovative Research in Small Business - PIPE
Start date: June 01, 2025
End date: February 28, 2026
Field of knowledge: Physical Sciences and Mathematics - Computer Science - Computing Methodologies and Techniques
Principal Investigator: Diego Rafael Moraes
Grantee: Diego Rafael Moraes
Company: MORAES E DE GROOTE CENTRO DE IA E VISAO COMPUTACIONAL LTDA
CNAE: Sugarcane cultivation
Support activities for agriculture
City: Sertãozinho
Principal researchers:
Jean-Jacques Georges Soares De Groote
Associated scholarship(s): 25/10196-7 - Feasibility analysis of a system with Computer Vision and Artificial Intelligence in the cloud for monitoring the pest Diatraea saccharalis in sugarcane plantations, BP.PIPE

Abstract

A widely employed strategy to assess infestation involves installing traps, built with adhesive paper and a compartment that holds females to attract males, since artificial pheromones for this species do not yet exist. The recording of moths captured in the traps is done manually, with agents collecting each trap and noting the number of insects found. This method poses difficulties for producers because personnel must be trained to record insect incidence accurately. The process is time-consuming and demands careful attention, since a single trap can hold up to 100 moths and identification is further complicated by the presence of other insects, including moths of other species. Ant attacks are also frequent, leaving moths partially consumed; in these cases the wings that remain attached to the trap must also be counted. These factors make manual recording, the most commonly used method, an obstacle to standardizing moth estimates, and they lengthen inspector training and report preparation, delaying pest control actions. This project proposes the development of a system that optimizes each step of the moth recording and counting process without substantially increasing producers' costs. The process begins with agents capturing trap images in the field using mobile devices, even where Wi-Fi or mobile network coverage is intermittent or absent. The trap images and associated data are sent to cloud servers, where moth identification is performed using state-of-the-art Convolutional Neural Networks, and the results are automatically transformed into reports and spreadsheets. The process aims not only to standardize and streamline the counting system but also to store images for audit purposes, increasing accuracy and significantly reducing the time between trap inspection and report generation, allowing quicker and more accurate pest control actions. To validate the proposed process, especially given the need to train artificial neural networks with real images, IA Sense sought a partnership for field tests: the company BioSolution will provide access to its grid of traps already deployed in sugarcane fields, currently covering 70,000 hectares distributed across GO, SP, MA, and BA. The proposed system will substantially increase the quality and reliability of moth detection, allowing trap providers to enter the precision agriculture segment and expand their market. For producers, the solution has the potential to streamline pest management, save resources, and reduce environmental damage by efficiently providing the information needed to estimate more accurately the quantity of pesticide applied to crops. The entire system is designed to be easily scalable, and its methods can be updated as computer vision techniques evolve. (AU)
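
The cloud-side counting and reporting step described in the abstract can be pictured with a short sketch. The example below is purely illustrative and is not part of the project's actual implementation: it assumes a hypothetical Faster R-CNN detector fine-tuned on trap images (weights file "moth_detector.pt", single "moth" class) and an arbitrary confidence threshold of 0.5, runs it over a folder of trap photos, and writes per-trap moth counts to a CSV spreadsheet, mirroring the automatic report-generation stage of the proposed pipeline.

    # Illustrative sketch only: count moths in trap images with a detection CNN
    # and write the per-trap counts to a CSV report. The weights file, class set,
    # and threshold are assumptions, not the project's actual model.
    import csv
    from pathlib import Path

    import torch
    from PIL import Image
    from torchvision.models.detection import fasterrcnn_resnet50_fpn
    from torchvision.transforms.functional import to_tensor

    CONFIDENCE_THRESHOLD = 0.5  # detections scoring below this are ignored

    def load_model(weights_path: str) -> torch.nn.Module:
        """Load a Faster R-CNN assumed to be fine-tuned on trap images."""
        # Two classes: background + moth (hypothetical fine-tuned checkpoint).
        model = fasterrcnn_resnet50_fpn(weights=None, num_classes=2)
        model.load_state_dict(torch.load(weights_path, map_location="cpu"))
        model.eval()
        return model

    def count_moths(model: torch.nn.Module, image_path: Path) -> int:
        """Run the detector on one trap photo and count confident detections."""
        image = Image.open(image_path).convert("RGB")
        with torch.no_grad():
            prediction = model([to_tensor(image)])[0]
        return int((prediction["scores"] >= CONFIDENCE_THRESHOLD).sum().item())

    def write_report(counts: dict, report_path: str) -> None:
        """Write per-trap counts to a CSV spreadsheet for the field report."""
        with open(report_path, "w", newline="") as handle:
            writer = csv.writer(handle)
            writer.writerow(["trap_image", "moth_count"])
            for name, count in sorted(counts.items()):
                writer.writerow([name, count])

    if __name__ == "__main__":
        model = load_model("moth_detector.pt")  # hypothetical weights file
        counts = {p.name: count_moths(model, p)
                  for p in Path("trap_images").glob("*.jpg")}
        write_report(counts, "moth_report.csv")

In the system proposed by the project, a step like this would run on the cloud servers after images captured offline in the field are synchronized; the network architecture, threshold, and file names above are placeholders standing in for whatever trained models and formats the project ultimately adopts.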
