1.
Heliyon ; 10(11): e31730, 2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38841473

ABSTRACT

Identifying plantation lines in aerial images of agricultural landscapes is required for many automatic farming processes. Deep learning-based networks are among the most prominent methods to learn such patterns and extract this type of information under diverse imagery conditions. However, even state-of-the-art methods may stumble over complex plantation patterns. Here, we propose a graph-based deep learning approach to detect plantation lines in UAV-based RGB imagery, presenting a challenging scenario containing spaced plants. The first module of our method extracts a feature map through the backbone, which consists of the initial layers of VGG16. This feature map is used as input to the Knowledge Estimation Module (KEM), organized into three concatenated branches that detect 1) the plant positions, 2) the plantation lines, and 3) the displacement vectors between plants. A graph model is then applied in which each plant position in the image is a vertex and edges are formed between pairs of vertices (i.e., plants). Finally, an edge is assigned to a plantation line when three probabilities all exceed 0.5: i) a probability derived from the visual features obtained from the backbone; ii) the chance that the edge pixels belong to a line, from the KEM step; and iii) the alignment of the displacement vectors with the edge, also from the KEM step. Experiments were first conducted on corn plantations with different growth stages and patterns in aerial RGB imagery to demonstrate the advantage of each module. We then assessed the generalization capability on datasets from two other crops (orange and eucalyptus). The proposed method was compared against state-of-the-art deep learning methods and achieved superior performance by a significant margin on all three datasets. This approach is useful for extracting lines with spaced plantation patterns and could be applied in scenarios where plantation gaps occur, generating lines with few to no interruptions.
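
The abstract's edge-classification rule lends itself to a short sketch. The snippet below is a minimal illustration, not the authors' code: the function names (classify_edge, alignment_probability), the cosine-based interpretation of branch iii), and the example probabilities are all assumptions introduced here for clarity.

import numpy as np

def classify_edge(p_visual: float, p_line: float, p_alignment: float,
                  threshold: float = 0.5) -> bool:
    """Assign an edge (a pair of detected plants) to a plantation line only if
    all three probabilities exceed the threshold, as the abstract describes.
    The probability names are placeholders, not the authors' API."""
    return p_visual > threshold and p_line > threshold and p_alignment > threshold

def alignment_probability(displacement: np.ndarray, edge_vector: np.ndarray) -> float:
    """Illustrative proxy for branch iii): map the cosine similarity between a
    predicted displacement vector and the candidate edge direction to [0, 1].
    The actual KEM output may differ; this is only one plausible reading."""
    cos = np.dot(displacement, edge_vector) / (
        np.linalg.norm(displacement) * np.linalg.norm(edge_vector) + 1e-8)
    return 0.5 * (cos + 1.0)

# Hypothetical usage: probabilities i) and ii) would come from the backbone
# features and the KEM line map, respectively.
keep = classify_edge(p_visual=0.82,
                     p_line=0.67,
                     p_alignment=alignment_probability(np.array([1.0, 0.1]),
                                                       np.array([0.9, 0.05])))
print(keep)  # True: all three probabilities exceed 0.5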

2.
IEEE Trans Cybern ; 50(2): 777-786, 2020 Feb.
Article in English | MEDLINE | ID: mdl-30334778

ABSTRACT

Texture analysis has attracted increasing attention in computer vision due to its power in describing images and the physical properties of objects. Among the methods for texture analysis, complex network (CN)-based ones have emerged for modeling images because of their flexibility. In this image modeling, each pixel is mapped to a vertex of the CN, and two vertices are connected if they are spatially close in the image. Measurements are then extracted from the CN to characterize its topology and, therefore, the image content. Despite promising results, the accuracy of these methods depends on the suitability of the measurement for the application. In texture analysis, only simple measurements have been used, such as those based on vertex degree and shortest paths. Motivated by these issues, this paper proposes a new method for texture analysis based on the CN and a new measurement that calculates the importance of each vertex within its neighborhood. To calculate the importance of vertices, we extend PageRank to the CN in order to correlate vertex importance with vertex degree, and we show that this new measurement captures texture properties. Experimental results on well-known datasets and in the recognition of soybean diseases from leaf texture show the effectiveness of the proposed method for texture recognition.
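
The CN modeling described in the abstract can be sketched briefly. The snippet below is an illustrative reconstruction, not the authors' implementation: the radius threshold, the intensity-based edge weight, and the use of networkx's standard pagerank (rather than the paper's extended PageRank measurement) are assumptions made for demonstration only.

import itertools
import networkx as nx
import numpy as np

def image_to_network(image: np.ndarray, radius: float = 2.0) -> nx.Graph:
    """Map each pixel to a vertex and connect two vertices when their pixels
    are spatially close (Euclidean distance <= radius), following the CN
    modeling described in the abstract. The weighting choice here is
    illustrative, not the exact formulation of the paper."""
    h, w = image.shape
    g = nx.Graph()
    coords = [(i, j) for i in range(h) for j in range(w)]
    g.add_nodes_from(coords)
    for (i1, j1), (i2, j2) in itertools.combinations(coords, 2):
        if (i1 - i2) ** 2 + (j1 - j2) ** 2 <= radius ** 2:
            # Example weight: normalized intensity difference between pixels.
            weight = abs(float(image[i1, j1]) - float(image[i2, j2])) / 255.0
            g.add_edge((i1, j1), (i2, j2), weight=weight)
    return g

# Toy texture patch; real texture images are larger, and an efficient
# implementation would avoid the all-pairs loop used above.
patch = np.random.default_rng(0).integers(0, 256, size=(8, 8))
cn = image_to_network(patch)

# Vertex importance via PageRank on the weighted network; the paper's extended
# PageRank, which ties importance to vertex degree, is only approximated here.
importance = nx.pagerank(cn, weight="weight")
# A texture descriptor could then be, e.g., a histogram of these importances.
hist, _ = np.histogram(list(importance.values()), bins=10)
print(hist)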
