000101500 001__ 101500
000101500 005__ 20210507085643.0
000101500 0247_ $$2doi$$a10.1109/IROS.2018.8594185
000101500 0248_ $$2sideral$$a107640
000101500 037__ $$aART-2018-107640
000101500 041__ $$aeng
000101500 100__ $$0(orcid)0000-0003-4638-4655$$aAlonso Ruiz, Iñigo$$uUniversidad de Zaragoza
000101500 245__ $$aSemantic Segmentation from Sparse Labeling Using Multi-Level Superpixels
000101500 260__ $$c2018
000101500 5060_ $$aAccess copy available to the general public$$fUnrestricted
000101500 5203_ $$aSemantic segmentation is a challenging problem that can benefit numerous robotics applications, since it provides information about the content at every image pixel. Solutions to this problem have recently witnessed a boost in performance and results thanks to deep learning approaches. Unfortunately, common deep learning models for semantic segmentation present several challenges which hinder real-life applicability in many domains. A significant challenge is the need for pixel-level labeling on large amounts of training images to be able to train those models, which implies a very high cost. This work proposes and validates a simple but effective approach to train dense semantic segmentation models from sparsely labeled data. Labeling only a few pixels per image reduces the human interaction required. We find many available datasets, e.g., environment monitoring data, that provide this kind of sparse labeling. Our approach is based on augmenting the sparse annotation to a dense one with the proposed adaptive superpixel segmentation propagation. We show that this label augmentation enables effective learning of state-of-the-art segmentation models, getting similar results to those models trained with dense ground-truth.
000101500 536__ $$9info:eu-repo/grantAgreement/ES/DGA/T45-17R$$9info:eu-repo/grantAgreement/ES/MINECO-FEDER/DPI2015-69376-R$$9info:eu-repo/grantAgreement/ES/UZ/UZ2017-TEC-06
000101500 540__ $$9info:eu-repo/semantics/openAccess$$aby$$uhttp://creativecommons.org/licenses/by/3.0/es/
000101500 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/acceptedVersion
000101500 700__ $$0(orcid)0000-0002-7580-9037$$aMurillo Arnal, Ana Cristina$$uUniversidad de Zaragoza
000101500 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000101500 773__ $$g2018, 18401073 (2018), 5785-5792$$pProc. IEEE/RSJ Int. Conf. Intell. Rob. Syst.$$tProceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems$$x2153-0858
000101500 8564_ $$s3286918$$uhttps://zaguan.unizar.es/record/101500/files/texto_completo.pdf$$yPostprint
000101500 8564_ $$s3109547$$uhttps://zaguan.unizar.es/record/101500/files/texto_completo.jpg?subformat=icon$$xicon$$yPostprint
000101500 909CO $$ooai:zaguan.unizar.es:101500$$particulos$$pdriver
000101500 951__ $$a2021-05-07-08:04:19
000101500 980__ $$aARTICLE