Show simple item record

dc.contributor.author: Sánchez Cubillo, Javier
dc.contributor.author: Del Ser, Javier
dc.contributor.author: Martín González, José Luis
dc.date.accessioned: 2024-06-27T16:35:13Z
dc.date.available: 2024-06-27T16:35:13Z
dc.date.issued: 2024-06-07
dc.identifier.citation: Sensors 24(12) : (2024) // Article ID 3721
dc.identifier.issn: 1424-8220
dc.identifier.uri: http://hdl.handle.net/10810/68692
dc.description.abstract: Robotic inspection is advancing in performance capabilities and is now being considered for industrial applications beyond laboratory experiments. As industries increasingly rely on complex machinery, pipelines, and structures, the need for precise and reliable inspection methods becomes paramount to ensure operational integrity and mitigate risks. AI-assisted autonomous mobile robots offer the potential to automate inspection processes, reduce human error, and provide real-time insights into asset conditions. A primary concern is the necessity to validate the performance of these systems under real-world conditions. While laboratory tests and simulations can provide valuable insights, the true efficacy of AI algorithms and robotic platforms can only be determined through rigorous field testing and validation. This paper aligns with this need by evaluating the performance of one-stage models for object detection in tasks that support and enhance the perception capabilities of autonomous mobile robots. The evaluation addresses both the execution of assigned tasks and the robot's own navigation. Our benchmark of classification models for robotic inspection considers three real-world transportation and logistics use cases, as well as several generations of the well-known YOLO architecture. The performance results from field tests using real robotic devices equipped with such object detection capabilities are promising, and expose the enormous potential and actionability of autonomous robotic systems for fully automated inspection and maintenance in open-world settings.
dc.description.sponsorship: The authors of this research acknowledge indirect (cascade-funding) financial support from the European ESMERA project (grant agreement 780265) for the rail inspection robot and from the GALATEA project (grant agreement 873026) for the maritime port inspection robot. J. Del Ser received funding support from the Basque Government through ELKARTEK grants and the consolidated research group MATHMODE (IT1456-22). J. L. Martin received support from the Department of Education of the Basque Government via funds for research groups of the Basque University system (ref. IT1440-22).
dc.language.iso: eng
dc.publisher: MDPI
dc.relation: info:eu-repo/grantAgreement/EC/H2020/780265
dc.relation: info:eu-repo/grantAgreement/EC/H2020/873026
dc.rights: info:eu-repo/semantics/openAccess
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/es/
dc.subject: autonomous mobile robot
dc.subject: autonomous guided vehicles
dc.subject: artificial intelligence
dc.subject: object detection
dc.title: Toward fully automated inspection of critical assets supported by autonomous mobile robots, vision sensors, and artificial intelligence
dc.type: info:eu-repo/semantics/article
dc.date.updated: 2024-06-26T13:24:09Z
dc.rights.holder: © 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
dc.relation.publisherversion: https://www.mdpi.com/1424-8220/24/12/3721
dc.identifier.doi: 10.3390/s24123721
dc.contributor.funder: European Commission
dc.departamentoes: Tecnología electrónica
dc.departamentoes: Ingeniería de comunicaciones
dc.departamentoeu: Teknologia elektronikoa
dc.departamentoeu: Komunikazioen ingeniaritza


