The accurate acquisition of the spatial distribution of grape planting regions from remote sensing imagery is of great significance for optimizing the layout of grape planting regions and promoting the structural adjustment of the grape industry. However, large variations in object size, inconsistent spectral characteristics, and complex background environments pose many challenges to accurate crop recognition from remote sensing imagery. To improve recognition accuracy, a pixel-level recognition method for grape planting regions was proposed based on GF-2 satellite remote sensing imagery, with the U-Net model as the backbone. Two main improvements were made to U-Net: first, the feature maps were adaptively recalibrated along the channel and spatial dimensions separately, to boost meaningful features, suppress weak ones, and improve the accuracy of edge segmentation; second, the number of downsampling operations was reduced and hybrid dilated convolution was used in place of conventional convolution, to limit the loss of image resolution and improve the recognition of objects of different shapes and sizes. Experiments showed that the pixel accuracy, mean intersection over union (MIoU), and frequency weighted intersection over union (FWIoU) of the model on the test set were 96.56%, 93.11%, and 93.35%, respectively, which were 5.17, 9.57, and 9.17 percentage points higher than those of the FCN-8s model, and 2.39, 4.59, and 4.39 percentage points higher than those of the original U-Net model. In addition, the contributions of the attention modules and the hybrid dilated convolution were analyzed through ablation experiments. The proposed model is simple, has few parameters, identifies grape planting regions of different sizes with fine edge segmentation, and provides an effective way to improve the accuracy of crop remote sensing recognition. © 2022, Chinese Society of Agricultural Machinery. All rights reserved.
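
The abstract does not give implementation details, so the following is a minimal PyTorch sketch of the two described improvements, assuming an scSE-style channel/spatial recalibration module and hybrid dilated convolution (HDC) with dilation rates (1, 2, 5). The class names, reduction ratio, dilation rates, and the 4-band GF-2 multispectral input are illustrative assumptions, not the authors' exact design.

```python
# Minimal sketch of the two U-Net improvements described in the abstract.
# Assumptions (not specified in the abstract): scSE-style recalibration,
# HDC dilation rates (1, 2, 5), 4-band GF-2 multispectral input.
import torch
import torch.nn as nn


class ChannelSpatialRecalibration(nn.Module):
    """Adaptively recalibrate feature maps along channel and spatial dims."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel recalibration: squeeze spatial dims, excite channels.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial recalibration: 1x1 conv collapses channels to a spatial map.
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Combine the two recalibrated maps (element-wise maximum here;
        # element-wise addition is a common alternative in scSE variants).
        return torch.max(x * self.channel_gate(x), x * self.spatial_gate(x))


class HybridDilatedConvBlock(nn.Module):
    """Stacked 3x3 convolutions with dilation rates (1, 2, 5): the receptive
    field grows without downsampling, and the co-prime rates avoid the
    gridding artifacts of repeating a single dilation rate."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        layers = []
        for i, rate in enumerate((1, 2, 5)):
            layers += [
                nn.Conv2d(
                    in_channels if i == 0 else out_channels,
                    out_channels,
                    kernel_size=3,
                    padding=rate,  # padding == dilation keeps spatial size
                    dilation=rate,
                    bias=False,
                ),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(inplace=True),
            ]
        self.block = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.block(x)


# Example: run a 4-band GF-2 patch through an HDC block, then recalibrate.
feats = HybridDilatedConvBlock(4, 64)(torch.randn(1, 4, 256, 256))
out = ChannelSpatialRecalibration(64)(feats)
print(out.shape)  # torch.Size([1, 64, 256, 256]) -- resolution is preserved
```

In a U-Net-style encoder, blocks like these would replace some of the plain convolution + pooling stages, which is consistent with the abstract's goal of reducing downsampling while keeping a large receptive field.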