<?xml version="1.0" encoding="UTF-8"?>
<collection xmlns="http://www.loc.gov/MARC21/slim">
<record>
  <controlfield tag="001">147158</controlfield>
  <controlfield tag="005">20241212141913.0</controlfield>
  <datafield tag="024" ind1="7" ind2=" ">
    <subfield code="2">doi</subfield>
    <subfield code="a">10.3389/frvir.2023.1286689</subfield>
  </datafield>
  <datafield tag="024" ind1="8" ind2=" ">
    <subfield code="2">sideral</subfield>
    <subfield code="a">140946</subfield>
  </datafield>
  <datafield tag="037" ind1=" " ind2=" ">
    <subfield code="a">ART-2023-140946</subfield>
  </datafield>
  <datafield tag="041" ind1=" " ind2=" ">
    <subfield code="a">eng</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="a">Chamizo, Victoria D.</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">Editorial: From paper and pencil tasks to virtual reality interventions: improving spatial abilities in girls and women</subfield>
  </datafield>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2023</subfield>
  </datafield>
  <datafield tag="520" ind1="3" ind2=" ">
    <subfield code="a">Understanding scientific findings in the context of sex (whether similarities, differences, or complex nuances) is critical to appropriately applying research-derived knowledge to achieve our multiple goals (Clayton, 2016). When discussing spatial abilities or spatial cognition, it is common to acknowledge that they differ between men and women, as well as in other species, with males frequently outperforming females in various spatial tasks (for reviews see Chamizo and Rodrigo, 2019; Halpern, 2012; Kimura, 1999; Mackintosh, 1998; Voyer et al., 1995). Although sometimes perceived as “politically incorrect”, this assertion is supported by substantial evidence from studies with both human and non-human participants. Geary (2021) has suggested that the difference could be attributed, at least in part, to the fact that males tend to use distant landmarks to orient themselves while navigating towards a goal, whereas females may not employ this strategy with comparable frequency. This aligns with the range size hypothesis, the best predictor of sex differences in non-human participants: a biological hypothesis that predicts sex differences based on the size of the territory covered throughout life (for humans, see Vashro et al., 2016). This hypothesis connects with our ancestors’ past as hunter-gatherers and, in other mammals, with polygyny, in which promiscuous males mate with multiple females in a breeding season and thereby have a larger home range than females (which is not the case with monogamous males). Nevertheless, it is important to note that the differences between men and women in many spatial tasks, such as mental rotation, can sometimes disappear depending on several factors (Jansen-Osmann and Heil, 2007; Hegarty, 2018; Ruthsatz et al., 2019; Álvarez-Vargas et al., 2020; Jost and Jansen, 2023). Future research will have to clarify these apparent inconsistencies.</subfield>
  </datafield>
  <datafield tag="506" ind1="0" ind2=" ">
    <subfield code="a">Access copy available to the general public</subfield>
    <subfield code="f">Unrestricted</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
    <subfield code="9">info:eu-repo/semantics/openAccess</subfield>
    <subfield code="a">by</subfield>
    <subfield code="u">http://creativecommons.org/licenses/by/3.0/es/</subfield>
  </datafield>
  <datafield tag="592" ind1=" " ind2=" ">
    <subfield code="a">0.803</subfield>
    <subfield code="b">2023</subfield>
  </datafield>
  <datafield tag="593" ind1=" " ind2=" ">
    <subfield code="a">Computer Graphics and Computer-Aided Design</subfield>
    <subfield code="c">2023</subfield>
    <subfield code="d">Q2</subfield>
  </datafield>
  <datafield tag="593" ind1=" " ind2=" ">
    <subfield code="a">Human-Computer Interaction</subfield>
    <subfield code="c">2023</subfield>
    <subfield code="d">Q2</subfield>
  </datafield>
  <datafield tag="593" ind1=" " ind2=" ">
    <subfield code="a">Computer Science Applications</subfield>
    <subfield code="c">2023</subfield>
    <subfield code="d">Q2</subfield>
  </datafield>
  <datafield tag="594" ind1=" " ind2=" ">
    <subfield code="a">5.8</subfield>
    <subfield code="b">2023</subfield>
  </datafield>
  <datafield tag="655" ind1=" " ind2="4">
    <subfield code="a">info:eu-repo/semantics/other</subfield>
    <subfield code="v">info:eu-repo/semantics/publishedVersion</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="a">Bourdin, Pierre</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="a">Mendez-Lopez, Magdalena</subfield>
    <subfield code="u">Universidad de Zaragoza</subfield>
    <subfield code="0">(orcid)0000-0002-4249-602X</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="a">Santamaria, Juan Jose</subfield>
  </datafield>
  <datafield tag="710" ind1="2" ind2=" ">
    <subfield code="1">4009</subfield>
    <subfield code="2">725</subfield>
    <subfield code="a">Universidad de Zaragoza</subfield>
    <subfield code="b">Dpto. Psicología y Sociología</subfield>
    <subfield code="c">Área Psicobiología</subfield>
  </datafield>
  <datafield tag="773" ind1=" " ind2=" ">
    <subfield code="g">4 (2023), 1286689 [3 pp.]</subfield>
    <subfield code="p">Front. virtual real.</subfield>
    <subfield code="t">Frontiers in Virtual Reality</subfield>
    <subfield code="x">2673-4192</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">554579</subfield>
    <subfield code="u">http://zaguan.unizar.es/record/147158/files/texto_completo.pdf</subfield>
    <subfield code="y">Versión publicada</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">2200701</subfield>
    <subfield code="u">http://zaguan.unizar.es/record/147158/files/texto_completo.jpg?subformat=icon</subfield>
    <subfield code="x">icon</subfield>
    <subfield code="y">Versión publicada</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="O">
    <subfield code="o">oai:zaguan.unizar.es:147158</subfield>
    <subfield code="p">articulos</subfield>
    <subfield code="p">driver</subfield>
  </datafield>
  <datafield tag="951" ind1=" " ind2=" ">
    <subfield code="a">2024-12-12-12:44:04</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">ARTICLE</subfield>
  </datafield>
</record>
</collection>