<?xml version="1.0" encoding="UTF-8"?>
<collection>
<dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:invenio="http://invenio-software.org/elements/1.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd"><dc:identifier>doi:10.3390/app122211557</dc:identifier><dc:language>eng</dc:language><dc:creator>Sainz-DeMena, Diego</dc:creator><dc:creator>García-Aznar, José Manuel</dc:creator><dc:creator>Pérez, María Ángeles</dc:creator><dc:creator>Borau, Carlos</dc:creator><dc:title>Im2mesh: A Python Library to Reconstruct 3D Meshes from Scattered Data and 2D Segmentations, Application to Patient-Specific Neuroblastoma Tumour Image Sequences</dc:title><dc:identifier>ART-2022-131087</dc:identifier><dc:description>The future of personalised medicine lies in the development of increasingly sophisticated digital twins, in which patient-specific data are fed into predictive computational models that support clinicians’ decisions on the best therapies or courses of action to treat the patient’s afflictions. The development of these personalised models from image data requires segmentation of the geometry of interest, estimation of intermediate or missing slices, reconstruction of the surface, generation of a volumetric mesh, and mapping of the relevant data onto the reconstructed three-dimensional volume. A wide range of tools, spanning both classical and artificial-intelligence methodologies, helps to overcome the difficulties at each stage, usually by combining different software packages in a multistep process. In this work, we develop an all-in-one approach wrapped in a Python library called im2mesh that automates the whole workflow, starting from reading a clinical image and ending with the generation of a 3D finite element mesh containing the interpolated patient data. We apply this workflow to an example of a patient-specific neuroblastoma tumour.
The main advantages of our tool are its straightforward use and its easy integration into broader pipelines.</dc:description><dc:date>2022</dc:date><dc:source>http://zaguan.unizar.es/record/120184</dc:source><dc:doi>10.3390/app122211557</dc:doi><dc:identifier>http://zaguan.unizar.es/record/120184</dc:identifier><dc:identifier>oai:zaguan.unizar.es:120184</dc:identifier><dc:relation>info:eu-repo/grantAgreement/EC/H2020/826494/EU/PRedictive In-silico Multiscale Analytics to support cancer personalized diaGnosis and prognosis, Empowered by imaging biomarkers/PRIMAGE</dc:relation><dc:relation>This project has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No H2020 826494-PRIMAGE</dc:relation><dc:relation>info:eu-repo/grantAgreement/ES/MCIU/FPU18/04541</dc:relation><dc:relation>info:eu-repo/grantAgreement/ES/MICINN-AEI-FEDER/PID2021-122409OB-C21</dc:relation><dc:relation>info:eu-repo/grantAgreement/ES/MICINN/PID2020-113819RB-I00/AEI/10.13039/501100011033</dc:relation><dc:relation>info:eu-repo/grantAgreement/ES/MICINN/PLEC2021-007709/AEI/10.13039/501100011033</dc:relation><dc:identifier.citation>Applied Sciences (Switzerland) 12, 22 (2022), 11557 [15 pp.]</dc:identifier.citation><dc:rights>by</dc:rights><dc:rights>http://creativecommons.org/licenses/by/3.0/es/</dc:rights><dc:rights>info:eu-repo/semantics/openAccess</dc:rights></dc:dc>

</collection>