Open Access
Lightweight Real-Time Segmentation for Agricultural Environments
Author(s) -
Na Yeon Bae,
Dong Seog Han
Publication year - 2025
Publication title -
IEEE Access
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 0.587
H-Index - 127
eISSN - 2169-3536
DOI - 10.1109/access.2025.3614053
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
The application of autonomous driving in agriculture poses unique challenges compared with urban autonomous vehicles. Unlike structured roads with clear signals, agricultural environments feature unstructured, irregular terrain that requires precise perception and high-level decision-making. Current autonomous agricultural machines rely primarily on LiDAR, which detects object distance and shape with high accuracy, but LiDAR systems are computationally expensive due to their large data volumes and complex processing. This paper proposes the dilated group convolution network (DG-Net), a lightweight deep learning model for real-time farmland segmentation. DG-Net incorporates DG-blocks, which combine dilated and group convolutions, and an upsampling module based on pixel shuffle. The DG-blocks expand the receptive field to extract contextual features, which are passed to the decoder via skip connections. The decoder restores resolution with pixel shuffle, rearranging low-resolution feature maps into high-resolution outputs. DG-Net achieves a pixel accuracy of 75.1% and a mean intersection over union (mIoU) of 77.7%, surpassing the state-of-the-art Segment Anything Model with a Vision Transformer Base backbone (SAM-ViT-B) by 2.5% mIoU while using fewer parameters and maintaining a lightweight computational cost. These results demonstrate DG-Net's applicability to efficient and accurate real-time farmland perception.
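For a concrete picture of the two building blocks described in the abstract, the sketch below shows, in PyTorch, one plausible form of a DG-block (a dilated, grouped 3x3 convolution followed by a 1x1 channel-mixing convolution) and a pixel-shuffle upsampling stage with an optional skip connection. The channel counts, group count, dilation rate, and layer ordering are illustrative assumptions, not the exact configuration reported in the paper.

```python
import torch
import torch.nn as nn

class DGBlock(nn.Module):
    """Sketch of a DG-block: a grouped 3x3 convolution with dilation
    enlarges the receptive field, and a 1x1 convolution mixes channels
    across groups. Hyperparameters here are assumptions for illustration."""
    def __init__(self, channels, groups=4, dilation=2):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3,
                      padding=dilation, dilation=dilation,
                      groups=groups, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # Residual connection keeps the lightweight encoder easy to train.
        return self.act(x + self.conv(x))


class PixelShuffleUp(nn.Module):
    """Decoder stage: a 1x1 convolution expands channels by scale^2, then
    nn.PixelShuffle rearranges them into a scale-times larger feature map."""
    def __init__(self, in_channels, out_channels, scale=2):
        super().__init__()
        self.proj = nn.Conv2d(in_channels, out_channels * scale ** 2,
                              kernel_size=1)
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, x, skip=None):
        x = self.shuffle(self.proj(x))
        if skip is not None:
            # Encoder feature passed through a skip connection, as in the abstract.
            x = x + skip
        return x


if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)
    feat = DGBlock(64)(x)               # same spatial size, larger receptive field
    up = PixelShuffleUp(64, 32)(feat)   # -> (1, 32, 64, 64)
    print(feat.shape, up.shape)
```

Pixel shuffle is attractive for a lightweight decoder because it trades learned transposed convolutions for a cheap channel-to-space rearrangement, which matches the paper's emphasis on low parameter count and real-time inference.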
