000100710 001__ 100710
000100710 005__ 20230519145407.0
000100710 0247_ $$2doi$$a10.1167/jov.21.2.2
000100710 0248_ $$2sideral$$a122751
000100710 037__ $$aART-2021-122751
000100710 041__ $$aeng
000100710 100__ $$aLagunas, Manuel$$uUniversidad de Zaragoza
000100710 245__ $$aThe joint role of geometry and illumination on material recognition
000100710 260__ $$c2021
000100710 5060_ $$aAccess copy available to the general public$$fUnrestricted
000100710 5203_ $$aObserving and recognizing materials is a fundamental part of our daily life. Under typical viewing conditions, we are capable of effortlessly identifying the objects that surround us and recognizing the materials they are made of. Nevertheless, understanding the underlying perceptual processes that allow us to accurately discern the visual properties of an object is a long-standing problem. In this work, we perform a comprehensive and systematic analysis of how the interplay of geometry, illumination, and their spatial frequencies affects human performance on material recognition tasks. We carry out large-scale behavioral experiments in which participants are asked to recognize different reference materials among a pool of candidate samples. Across the different experiments, we carefully sample the information in the frequency domain of the stimuli. From our analysis, we find significant first-order interactions between geometry and illumination, both for the reference and for the candidates. In addition, we observe that simple image statistics and higher-order image histograms do not correlate with human performance. Therefore, we perform a high-level comparison of highly nonlinear statistics by training a deep neural network on material recognition tasks. Our results show that such models can accurately classify materials, which suggests that they are capable of defining a meaningful representation of material appearance from labeled proximal image data. Lastly, we find preliminary evidence that these highly nonlinear models and humans may use similar high-level factors for material recognition tasks.
000100710 536__ $$9info:eu-repo/grantAgreement/EC/H2020/682080/EU/Intuitive editing of visual appearance from real-world datasets/CHAMELEON$$9This project has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No H2020 682080-CHAMELEON$$9info:eu-repo/grantAgreement/EC/H2020/765121/EU/DyViTo: Dynamics in Vision and Touch - the look and feel of stuff/DyViTo$$9This project has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No H2020 765121-DyViTo$$9info:eu-repo/grantAgreement/EC/H2020/956585/EU/Predictive Rendering In Manufacture and Engineering/PRIME$$9This project has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No H2020 956585-PRIME$$9info:eu-repo/grantAgreement/ES/MINECO/PID2019-105004GB-I00$$9info:eu-repo/grantAgreement/ES/MINECO/TIN2016-78753-P
000100710 540__ $$9info:eu-repo/semantics/openAccess$$aby-nc-nd$$uhttp://creativecommons.org/licenses/by-nc-nd/3.0/es/
000100710 590__ $$a2.004$$b2021
000100710 592__ $$a0.79$$b2021
000100710 594__ $$a3.2$$b2021
000100710 591__ $$aOPHTHALMOLOGY$$b44 / 62 = 0.71$$c2021$$dQ3$$eT3
000100710 593__ $$aSensory Systems$$c2021$$dQ2
000100710 593__ $$aOphthalmology$$c2021$$dQ2
000100710 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/publishedVersion
000100710 700__ $$0(orcid)0000-0002-7796-3177$$aSerrano, Ana$$uUniversidad de Zaragoza
000100710 700__ $$0(orcid)0000-0002-7503-7022$$aGutiérrez, Diego$$uUniversidad de Zaragoza
000100710 700__ $$0(orcid)0000-0003-0060-7278$$aMasiá, Belén$$uUniversidad de Zaragoza
000100710 7102_ $$15007$$2570$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Lenguajes y Sistemas Inf.
000100710 773__ $$g21, 2 (2021), 2[18 pp.]$$pJ. Vision$$tJournal of Vision$$x1534-7362
000100710 8564_ $$s2670449$$uhttps://zaguan.unizar.es/record/100710/files/texto_completo.pdf$$yVersión publicada
000100710 8564_ $$s3236255$$uhttps://zaguan.unizar.es/record/100710/files/texto_completo.jpg?subformat=icon$$xicon$$yVersión publicada
000100710 909CO $$ooai:zaguan.unizar.es:100710$$particulos$$pdriver
000100710 951__ $$a2023-05-18-13:48:35
000100710 980__ $$aARTICLE