Show simple item record

dc.contributor.author: Nogueira Rodríguez, Alba
dc.contributor.author: Domínguez Carbajales, Rubén
dc.contributor.author: López Fernández, Hugo
dc.contributor.author: Iglesias, Águeda
dc.contributor.author: Cubiella Fernández, Joaquín
dc.contributor.author: Fernández Riverola, Florentino
dc.contributor.author: Reboiro Jato, Miguel
dc.contributor.author: González Peña, Daniel
dc.date.accessioned: 2021-12-01T12:06:23Z
dc.date.available: 2021-12-01T12:06:23Z
dc.date.issued: 2021-01
dc.identifier.citation: Neurocomputing, 423, 721-734 (2021)
dc.identifier.issn: 0925-2312
dc.identifier.uri: http://hdl.handle.net/11093/2802
dc.description.abstract: Deep Learning (DL) has attracted a lot of attention in the field of medical image analysis because of its higher performance in image classification when compared to previous state-of-the-art techniques. In addition, a recent meta-analysis found that the diagnostic performance of DL models is equivalent to that of health-care professionals. In this scenario, a large body of research applying DL to polyp detection and classification has been published in the last five years, showing promising results. Our work aims to review the most relevant studies from a technical point of view, focusing on the low-level details of the implementation of the DL models. To do so, this review analyzes the published research covering aspects such as DL architectures, training strategies, data augmentation, transfer learning, and the features of the datasets used and their impact on the performance of the models. Additionally, comparative tables summarizing the main aspects analyzed in this review are publicly available at https://github.com/sing-group/deep-learning-colonoscopy.
dc.description.sponsorship: Xunta de Galicia | Ref. ED431C2018/55-GRC
dc.description.sponsorship: Ministerio de Economía, Industria y Competitividad | Ref. DPI2017-87494-R
dc.description.sponsorship: Xunta de Galicia | Ref. ED481A-2019/299
dc.description.sponsorship: Xunta de Galicia | Ref. ED481B 2016/068-0
dc.description.sponsorship: Instituto de Salud Carlos III | Ref. PI11/00094
dc.description.sponsorship: Instituto de Salud Carlos III | Ref. PI17/00837
dc.language.iso: eng
dc.publisher: Neurocomputing
dc.relation: info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2013-2016/DPI2017-87494-R/ES
dc.rights: Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.title: Deep neural networks approaches for detecting and classifying colorectal polyps
dc.type: article
dc.rights.accessRights: openAccess
dc.identifier.doi: 10.1016/j.neucom.2020.02.123
dc.identifier.editor: https://linkinghub.elsevier.com/retrieve/pii/S0925231220307359
dc.publisher.departamento: Computer Science
dc.publisher.grupoinvestigacion: Sistemas Informáticos de Nova Xeración
dc.subject.unesco: 32 Medical Sciences
dc.subject.unesco: 1203.04 Artificial Intelligence
dc.subject.unesco: 1203.20 Medical Control Systems
dc.date.updated: 2021-11-24T09:42:19Z
dc.computerCitation: pub_title=Neurocomputing|volume=423|journal_number=|start_pag=721|end_pag=734
dc.references: The SING group thanks CITI (Centro de Investigación, Transferencia e Innovación) of the University of Vigo for hosting its IT infrastructure. This work was partially supported by the Consellería de Educación, Universidades e Formación Profesional (Xunta de Galicia) under the scope of the strategic funding of the ED431C2018/55-GRC Competitive Reference Group, and by the Ministerio de Economía, Industria y Competitividad, Gobierno de España under the scope of the PolyDeep project (DPI2017-87494-R). The authors also acknowledge the grants of Alba Nogueira-Rodríguez (predoctoral fellowship ED481A-2019/299) and Hugo López-Fernández (postdoctoral fellowship ED481B 2016/068-0), funded by the Xunta de Galicia. Joaquín Cubiella received grants from the Instituto de Salud Carlos III (PI11/00094 and PI17/00837).


Files in this item: [PDF]