Abstract: The capacity to integrate information is a prominent feature of biological, neural, and cognitive processes. Integrated Information Theory (IIT) provides mathematical tools for quantifying the level of integration in a system, but its computational cost generally precludes applications beyond relatively small models. Consequently, it is not yet well understood how integration scales with the size of a system or with different temporal scales of activity, nor how a system maintains integration as it interacts with its environment. After revising some assumptions of the theory, we show for the first time how modified measures of information integration scale when a neural network becomes very large. Using kinetic Ising models and mean-field approximations, we show that information integration diverges in the thermodynamic limit at certain critical points. Moreover, by comparing the divergent tendencies of the blocks that make up a system at these critical points, we can use information integration to delimit the boundary between an integrated unit and its environment. Finally, we present a model that adaptively maintains its integration despite changes in its environment by generating a critical surface on which its integrity is preserved. We argue that exploring integrated information in these limit cases helps address a variety of poorly understood questions about the organization of biological, neural, and cognitive systems.
Language: English
DOI: 10.1016/j.neunet.2019.03.001
Year: 2019
Published in: NEURAL NETWORKS 114 (2019), 136-146
ISSN: 0893-6080
JCR impact factor: 5.535 (2019)
JCR category: NEUROSCIENCES, rank: 42 / 271 = 0.155 (2019) - Q1 - T1
JCR category: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE, rank: 19 / 136 = 0.14 (2019) - Q1 - T1
SCIMAGO impact factor: 1.718 - Cognitive Neuroscience (Q1) - Artificial Intelligence (Q1)
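The abstract refers to kinetic Ising models studied near their critical points. As an illustration of what such a model looks like (a minimal sketch using standard Glauber dynamics, not the authors' implementation; network size, couplings, and the 1/N mean-field scaling are assumptions chosen for the example):

```python
import numpy as np

def glauber_step(s, J, h, beta, rng):
    """One asynchronous Glauber update of a kinetic Ising model.

    s: spin state vector (+/-1), J: coupling matrix, h: external fields,
    beta: inverse temperature, rng: numpy random Generator.
    """
    i = rng.integers(len(s))
    # Effective local field acting on the randomly chosen spin i
    H = h[i] + J[i] @ s
    # Glauber transition probability for spin i to be +1 next step
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * H))
    s[i] = 1 if rng.random() < p_up else -1
    return s

# Illustrative run: small fully connected network with mean-field couplings.
rng = np.random.default_rng(0)
N = 64
J = np.ones((N, N)) / N      # couplings scaled 1/N, as in mean-field models
np.fill_diagonal(J, 0.0)     # no self-coupling
h = np.zeros(N)
s = rng.choice([-1, 1], size=N).astype(float)
beta = 1.0                   # the mean-field critical point of this model
for _ in range(20000):
    glauber_step(s, J, h, beta, rng)
m = abs(s.mean())            # magnetization magnitude after relaxation
```

At the critical inverse temperature, fluctuations of such a network become long-ranged, which is the regime where the paper reports that information integration diverges in the thermodynamic limit.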