Show simple item record

dc.contributor.author: Gutiérrez Zaballa, Jon
dc.contributor.author: Basterrechea Oyarzabal, Koldobika
dc.contributor.author: Echanove Arias, Francisco Javier
dc.contributor.author: Martínez González, María Victoria
dc.contributor.author: Del Campo Hagelstrom, Inés Juliana
dc.date.accessioned: 2025-02-19T18:30:11Z
dc.date.available: 2025-02-19T18:30:11Z
dc.date.issued: 2022-07-30
dc.identifier.citation: Design and Architecture for Signal and Image Processing: 15th International Workshop, DASIP 2022, Budapest, Hungary, June 20–22, 2022, Proceedings: 136-148 (2022)
dc.identifier.isbn: 978-3-031-12748-9
dc.identifier.uri: http://hdl.handle.net/10810/72841
dc.description.abstract: Advanced Driver Assistance Systems (ADAS) are designed with the main purpose of increasing the safety and comfort of vehicle occupants. Most current computer vision-based ADAS perform detection and tracking tasks quite successfully under regular conditions, but they are not completely reliable under adverse weather and changing lighting conditions, nor in complex situations with many overlapping objects. In this work we explore the use of hyperspectral imaging (HSI) in ADAS on the assumption that the distinct near-infrared (NIR) spectral reflectances of different materials can help to better separate the objects in a driving scene. In particular, this paper describes experimental results of the application of fully convolutional networks (FCN) to the image segmentation of HSI for ADAS applications. More specifically, our aim is to investigate to what extent the spatial features encoded by convolutional filters can help to improve the performance of HSI segmentation systems. To that end, we use the HSI-Drive v1.1 dataset, which provides a set of labelled images recorded in real driving conditions with a small-size snapshot NIR-HSI camera. Finally, we analyze the implementability of such an HSI segmentation system by prototyping the developed FCN model, together with the necessary hyperspectral cube preprocessing stage, and characterizing its performance on an MPSoC.
dc.description.sponsorship: This work was partially supported by the Basque Government under grants PIBA-2018-1-0054, KK-2021/00111 and PRE 2021 1 0113, and by the Spanish Ministry of Science and Innovation under grant PID2020-115375RB-I00. We thank the University of the Basque Country for allocation of computational resources.
dc.language.iso: eng
dc.publisher: Springer
dc.relation: info:eu-repo/grantAgreement/MCIN/PID2020-115375RB-I00
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: hyperspectral imaging
dc.subject: scene understanding
dc.subject: fully convolutional networks
dc.subject: autonomous driving systems
dc.subject: system on chip
dc.title: Exploring fully convolutional networks for the segmentation of hyperspectral imaging applied to advanced driver assistance systems
dc.type: info:eu-repo/semantics/conferenceObject
dc.rights.holder: © 2022 Springer Nature Switzerland AG
dc.relation.publisherversion: https://doi.org/10.1007/978-3-031-12748-9_11
dc.identifier.doi: 10.1007/978-3-031-12748-9_11
dc.departamentoes: Tecnología electrónica
dc.departamentoeu: Teknologia elektronikoa

