1.
Sci Rep; 14(1): 16353, 2024 Jul 16.
Article in English | MEDLINE | ID: mdl-39013975

ABSTRACT

Lane line images exhibit large scale variation and complex scene content; adjacent lane lines are highly similar and therefore easily misclassified, and distant lane lines are hard to recognize because perspective changes narrow their apparent width. To address these issues, this paper proposes an effective lane detection framework: a hybrid feature fusion network that enhances multiple spatial features and distinguishes key features along the entire lane line segment. It enhances and fuses lane line features at multiple scales to strengthen the feature representation of lane line images, especially at the far end of the lane. First, to strengthen the correlation among multiscale lane features, multi-head self-attention is used to build a multi-space attention enhancement module that enhances features across multiple spaces. Second, a spatially separable convolution branch is designed for the skip-layer structure connecting multiscale lane line features; while preserving feature information at different scales, it emphasizes important lane regions by allocating spatial attention weights. Finally, because lane lines are elongated regions and background information is far more abundant than lane line information, conventional pooling has limited flexibility in capturing the anisotropic contexts that are widespread in real scenes; strip pooling is therefore introduced before the feature output branches to refine the lane line representation and improve model performance. Experimental results show that the framework reaches 96.84% accuracy on the TuSimple dataset and an F1 score of 75.9% on the CULane dataset.
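The abstract does not include code, so the sketches below are only illustrative PyTorch approximations of the two operations it names; the class names, layer choices, and hyperparameters (e.g. the number of heads and kernel sizes) are assumptions, not the authors' implementation. The first sketch shows one plausible way to apply multi-head self-attention across flattened multiscale feature maps, in the spirit of the multi-space attention enhancement module.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleSelfAttention(nn.Module):
    # Hypothetical sketch: flatten multiscale feature maps into one token
    # sequence, let multi-head self-attention relate features across scales,
    # then reshape the enhanced tokens back into per-scale maps.
    def __init__(self, channels, num_heads=4):
        super().__init__()
        # channels must be divisible by num_heads
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, features):
        # features: list of maps with a shared channel count, each (N, C, H_i, W_i)
        shapes = [f.shape for f in features]
        tokens = torch.cat([f.flatten(2).transpose(1, 2) for f in features], dim=1)
        enhanced, _ = self.attn(tokens, tokens, tokens)
        tokens = self.norm(tokens + enhanced)  # residual keeps the original signal
        outs, start = [], 0
        for n, c, h, w in shapes:
            outs.append(tokens[:, start:start + h * w].transpose(1, 2).reshape(n, c, h, w))
            start += h * w
        return outs
```

Strip pooling, by contrast, is a known operation for elongated structures: pooling along one spatial axis at a time summarizes long, thin regions such as lane lines without averaging in large amounts of background. A minimal sketch of that idea, again with assumed layer sizes:

```python
class StripPooling(nn.Module):
    # Pool every row to one value and every column to one value, refine each
    # strip with a thin convolution, and fuse them into a gating map.
    def __init__(self, channels):
        super().__init__()
        self.conv_h = nn.Conv2d(channels, channels, (3, 1), padding=(1, 0), bias=False)
        self.conv_w = nn.Conv2d(channels, channels, (1, 3), padding=(0, 1), bias=False)
        self.fuse = nn.Conv2d(channels, channels, 1, bias=False)

    def forward(self, x):
        n, c, h, w = x.shape
        # Row-wise pooling -> (N, C, H, 1), refined along H, broadcast back over W.
        x_h = self.conv_h(F.adaptive_avg_pool2d(x, (h, 1))).expand(-1, -1, -1, w)
        # Column-wise pooling -> (N, C, 1, W), refined along W, broadcast back over H.
        x_w = self.conv_w(F.adaptive_avg_pool2d(x, (1, w))).expand(-1, -1, h, -1)
        gate = torch.sigmoid(self.fuse(F.relu(x_h + x_w)))
        return x * gate
```

For example, `StripPooling(64)(torch.randn(1, 64, 36, 100))` returns a tensor of the same shape in which elongated, lane-like responses are reinforced relative to diffuse background context.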
