000099257 001__ 99257
000099257 005__ 20211216131136.0
000099257 0247_ $$2doi$$a10.1109/ACCESS.2020.3043256
000099257 0248_ $$2sideral$$a122253
000099257 037__ $$aART-2020-122253
000099257 041__ $$aeng
000099257 100__ $$aSagues-Tanco, R.
000099257 245__ $$aFast Synthetic Dataset for Kitchen Object Segmentation in Deep Learning
000099257 260__ $$c2020
000099257 5060_ $$aAccess copy available to the general public$$fUnrestricted
000099257 5203_ $$aObject recognition has been widely investigated in computer vision for many years. Currently, this process is carried out with neural networks, but very few public datasets provide the object mask and class labels required to train them for common applications. In this paper, we address the fast generation of synthetic datasets for training neural models, since creating a hand-labeled dataset with object segmentation is a very tedious and time-consuming task. We propose an efficient method to generate a synthetic labeled dataset by adequately combining background images with segmented foreground objects. The synthetic images can be created automatically, with random positioning of the objects, or, alternatively, the method can produce realistic images that preserve plausible object scales and positions. We then employ the Mask R-CNN deep learning model to detect and segment classes of kitchen objects in images. In the experimental evaluation, we study both synthetic datasets, automatic and realistic, and compare the results. We analyze performance with the most widely used metrics and verify that the realistic synthetic dataset, quickly created with our method, provides competitive results and accurately classifies the different objects.
000099257 536__ $$9info:eu-repo/grantAgreement/ES/MICINN-AEI-FEDER/RTC-2017-5965-6
000099257 540__ $$9info:eu-repo/semantics/openAccess$$aby-nc-nd$$uhttp://creativecommons.org/licenses/by-nc-nd/3.0/es/
000099257 590__ $$a3.367$$b2020
000099257 591__ $$aCOMPUTER SCIENCE, INFORMATION SYSTEMS$$b65 / 162 = 0.401$$c2020$$dQ2$$eT2
000099257 591__ $$aTELECOMMUNICATIONS$$b36 / 91 = 0.396$$c2020$$dQ2$$eT2
000099257 591__ $$aENGINEERING, ELECTRICAL & ELECTRONIC$$b94 / 273 = 0.344$$c2020$$dQ2$$eT2
000099257 592__ $$a0.586$$b2020
000099257 593__ $$aComputer Science (miscellaneous)$$c2020$$dQ1
000099257 593__ $$aMaterials Science (miscellaneous)$$c2020$$dQ1
000099257 593__ $$aEngineering (miscellaneous)$$c2020$$dQ1
000099257 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/publishedVersion
000099257 700__ $$aBenages-Pardo, L.
000099257 700__ $$0(orcid)0000-0001-9347-5969$$aLopez-Nicolas, G.$$uUniversidad de Zaragoza
000099257 700__ $$0(orcid)0000-0003-4609-1254$$aLlorente, S.
000099257 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000099257 773__ $$g8 (2020), 220496-220506$$pIEEE Access$$tIEEE Access$$x2169-3536
000099257 8564_ $$s1056761$$uhttps://zaguan.unizar.es/record/99257/files/texto_completo.pdf$$yPublished version
000099257 8564_ $$s2656944$$uhttps://zaguan.unizar.es/record/99257/files/texto_completo.jpg?subformat=icon$$xicon$$yPublished version
000099257 909CO $$ooai:zaguan.unizar.es:99257$$particulos$$pdriver
000099257 951__ $$a2021-12-16-13:05:14
000099257 980__ $$aARTICLE
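
Note: the abstract (field 5203_) describes compositing segmented foreground objects onto background images to produce labeled training data. Below is a minimal Python sketch of that compositing idea in its "automatic" (uniform-random placement) mode; it is not the authors' published code, and the file names, the composite function, and the placement scheme are illustrative assumptions.

    # Minimal sketch (not the authors' code) of synthetic-dataset compositing:
    # paste a segmented RGBA object cutout onto a background at a random
    # position and record the matching binary mask label.
    import random
    from PIL import Image
    import numpy as np

    def composite(background_path: str, object_path: str, scale: float = 1.0):
        """Paste an RGBA object cutout onto a background; return image and mask."""
        bg = Image.open(background_path).convert("RGB")
        obj = Image.open(object_path).convert("RGBA")  # alpha = object segmentation

        # Optionally rescale the object. A "realistic" variant, as described in
        # the abstract, would constrain scale and position to plausible values
        # for the scene instead of sampling them uniformly at random.
        w, h = int(obj.width * scale), int(obj.height * scale)
        obj = obj.resize((w, h))

        # Random placement, kept fully inside the background (assumes the
        # rescaled object fits within the background image).
        x = random.randint(0, bg.width - w)
        y = random.randint(0, bg.height - h)
        bg.paste(obj, (x, y), mask=obj)  # the alpha channel masks the paste

        # Build the per-object binary mask at background resolution.
        mask = np.zeros((bg.height, bg.width), dtype=np.uint8)
        alpha = np.array(obj)[..., 3] > 0
        mask[y:y + h, x:x + w][alpha] = 1
        return bg, mask

    if __name__ == "__main__":
        # Hypothetical file names for illustration only.
        img, mask = composite("kitchen_bg.jpg", "pan_cutout.png", scale=0.8)
        img.save("synthetic_000.jpg")
        Image.fromarray(mask * 255).save("synthetic_000_mask.png")

Repeating this over many background/object pairs yields image-mask pairs in the form consumed by instance-segmentation models such as Mask R-CNN, which is the training setup the abstract reports.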