The Inception module is one of the most widely used building blocks in convolutional neural networks, with a large portfolio of success cases in computer vision. In recent years, diverse Inception flavours, differing in the number of branches and in the size and number of kernels, have appeared in the scientific literature. These variants are proposed based on the expertise of practitioners, without any optimization process. In this work, an implementation of population-based incremental learning is proposed for the automatic optimization of the hyperparameters of the Inception module. The hyperparameter optimization targets classification on the MNIST database of handwritten digit images. Since this problem is a widely used classification benchmark, the best configurations learned for the Inception module will be of broad use in the deep learning community. In order to reduce the carbon footprint of the optimization process, policies for reducing redundant evaluations have been adopted. As a consequence of this work, an evaluation of configurations of the Inception module and a mechanism for optimizing hyperparameters in deep learning architectures are provided.
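As a rough illustration of the optimizer involved, the population-based incremental learning (PBIL) loop can be sketched as follows. This is a generic sketch, not the paper's implementation: the fitness function here is a toy one-max stand-in for "train an Inception variant encoded by these bits and return its MNIST accuracy", and all names and parameter values are hypothetical.

```python
import random

def pbil(fitness, n_bits, pop_size=20, generations=50,
         learning_rate=0.1, seed=0):
    """Population-based incremental learning over bit strings."""
    rng = random.Random(seed)
    # Probability vector: P(bit i == 1), initially uninformative.
    prob = [0.5] * n_bits
    best, best_fit = None, float("-inf")
    for _ in range(generations):
        # Sample a population from the current probability vector.
        pop = [[1 if rng.random() < p else 0 for p in prob]
               for _ in range(pop_size)]
        leader = max(pop, key=fitness)
        if fitness(leader) > best_fit:
            best, best_fit = leader, fitness(leader)
        # Shift the probability vector toward the best individual.
        prob = [(1 - learning_rate) * p + learning_rate * b
                for p, b in zip(prob, leader)]
    return best, best_fit

# Toy stand-in for the real fitness (hyperparameter-encoded network
# accuracy); here the encoding is maximised when all bits are 1.
def toy_fitness(bits):
    return sum(bits)

best, score = pbil(toy_fitness, n_bits=12)
```

A cache keyed on the bit string would implement the redundant-evaluation policy the abstract mentions: identical configurations are then trained only once.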
Nowadays, decision making is strongly supported by the high-confidence point estimates produced by deep learning algorithms. In many activities, these are sufficient for the decision-making process. In other cases, however, confidence intervals are also required for appropriate decision making. In this work, a first attempt to generate point estimates with confidence intervals for the $^{222}$Rn radiation-level time series at the Canfranc Underground Laboratory is presented. Predicting the low-radiation periods allows the unshielded periods for maintenance operations in the experiments hosted in this facility to be scheduled correctly. This should minimize the deposition of radioactive dust on the exposed surfaces during these unshielded periods. An approach based on deep learning with stochastic regularization is evaluated for forecasting point estimates and confidence intervals of the $^{222}$Rn time series and is compared with a second approach based on Gaussian processes. As a consequence of this work, an evaluation of the capacity of Gaussian processes and deep learning with stochastic regularization to generate point estimates and confidence intervals for this time series is provided.
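The Gaussian-process baseline for producing point estimates together with confidence intervals can be sketched as follows. This is a generic, pure-Python illustration of GP regression with an RBF kernel, not the paper's implementation; the training data below are synthetic stand-ins for the $^{222}$Rn series, and all names and hyperparameter values are assumptions.

```python
import math

def rbf(x1, x2, length=1.0, var=1.0):
    """Squared-exponential (RBF) covariance between two scalar inputs."""
    return var * math.exp(-0.5 * ((x1 - x2) / length) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """GP regression: posterior mean and 95% interval per test point."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(x_train)] for i, a in enumerate(x_train)]
    alpha = solve(K, list(y_train))          # K^{-1} y
    out = []
    for xs in x_test:
        ks = [rbf(a, xs) for a in x_train]
        mean = sum(k * w for k, w in zip(ks, alpha))
        # Predictive variance: k(x*, x*) - k*^T K^{-1} k*
        v = solve(K, ks)
        var = rbf(xs, xs) - sum(k * w for k, w in zip(ks, v))
        std = math.sqrt(max(var, 0.0))
        out.append((mean, mean - 1.96 * std, mean + 1.96 * std))
    return out
```

Far from the training data the kernel vanishes, so the interval widens toward the prior variance, which is exactly the behaviour that makes the interval usable for scheduling decisions.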
Inverse percolation is the problem of finding the minimum set of nodes whose link removal causes the rupture of the network. It has been widely used in studies of single-layer networks; however, its use on, and generalization to, multiplex networks has received little attention. In this work, we propose a methodology based on inverse percolation to quantify the robustness of multiplex networks. Specifically, we present a modified version of the mathematical model for the multiplex vertex separator problem (m-VSP). By solving the m-VSP, we can find the nodes that cause the rupture of the mutually connected giant component and the largest viable cluster when their links are removed from the network. The methodology was tested on a set of benchmark networks, and as a case study, we present an analysis of a set of multiplex social networks modeled with information about the main characteristics of the best universities in the world and the universities in Mexico. The results show that the methodology can handle different models and types of 2- and 3-layer multiplex networks without dividing the multiplex network into single layers, as some techniques described in the literature do. Furthermore, since the technique does not require the calculation of any structural measure or centrality metric, it scales easily to networks of different sizes.
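The flavour of separator-style robustness analysis can be illustrated on a single-layer graph: pick the nodes whose removal most shrinks the giant component. This is a simplified greedy stand-in, not the m-VSP model of the paper (which works on the multiplex structure directly), and the graph and function names are hypothetical.

```python
from collections import deque

def giant_component_size(adj, removed):
    """Size of the largest connected component, ignoring removed nodes."""
    seen = set(removed)
    best = 0
    for start in adj:
        if start in seen:
            continue
        size, queue = 0, deque([start])
        seen.add(start)
        while queue:                      # BFS over one component
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

def greedy_separator(adj, k):
    """Greedily pick k nodes whose removal most shrinks the giant component."""
    removed = set()
    for _ in range(k):
        candidates = [n for n in adj if n not in removed]
        node = min(candidates,
                   key=lambda n: giant_component_size(adj, removed | {n}))
        removed.add(node)
    return removed
```

On a star graph the greedy choice is the hub, whose removal fragments the network into isolated leaves, which is the rupture behaviour the abstract describes.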
The travelling salesman problem is one of the most popular problems in combinatorial optimization, and it has frequently been used as a benchmark for the performance of evolutionary algorithms. For this reason, practitioners nowadays request new and more difficult instances of this problem. This motivates investigating how to evaluate the intrinsic difficulty of instances and how to separate easy from difficult ones. By developing methodologies for separating easy- from difficult-to-solve instances, researchers can fairly test the performance of their combinatorial optimizers. In this work, a methodology for evaluating the difficulty of instances of the travelling salesman problem near the optimal solution is proposed. The question is whether the fitness landscape near the optimal solution encodes enough information to separate instances according to their intrinsic difficulty. The methodology is based on a random walk that explores the neighbourhood of the optimal solution: at each step, the optimal solution is modified by altering one connection between two cities, and the fitness of the altered solution is evaluated. This permits estimating the slope of the fitness landscape. Using this information, the difficulty of the instance is then assessed with random forests and artificial neural networks. The methodology is evaluated on a wide set of instances. As a consequence, a methodology to separate instances of the travelling salesman problem by their degree of difficulty is proposed and evaluated.
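The random-walk probe of the landscape near the optimum can be sketched as follows. This is an illustrative sketch, not the paper's procedure: it uses a 2-opt segment reversal as the perturbation (the paper alters one connection per step), and all names and parameters are hypothetical. The resulting fitness trace is the kind of feature vector that could feed a downstream difficulty classifier.

```python
import math
import random

def tour_length(tour, coords):
    """Total Euclidean length of a closed tour over city coordinates."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def landscape_walk(opt_tour, coords, steps=100, seed=0):
    """Random walk starting at the optimum: one 2-opt move per step,
    recording the fitness of each perturbed tour along the way."""
    rng = random.Random(seed)
    tour = list(opt_tour)
    trace = [tour_length(tour, coords)]
    for _ in range(steps):
        i, j = sorted(rng.sample(range(len(tour)), 2))
        tour[i:j + 1] = reversed(tour[i:j + 1])  # 2-opt segment reversal
        trace.append(tour_length(tour, coords))
    return trace
```

Since the walk starts at the optimum, the trace begins at the minimum length and its growth rate is a proxy for the slope of the landscape around the optimum.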
This work analyzes and characterizes the spread of the COVID-19 disease in Mexico using complex networks and optimization approaches. Specifically, we present two methodologies to quantify robustness and identify the Mexican municipalities whose population causes a fast spread of the SARS-CoV-2 virus: the first is based on the principle of the rupture of the giant component (GC) and several characteristics of the original version of the Vertex Separator Problem (VSP), and the second is based on a new mathematical model inspired by Newton's law of motion (NLM). By solving the VSP, we can find the nodes that cause the rupture of the giant component; by solving the NLM, we can find the most influential nodes for the development of the entire system. We present an analysis using a coupled social network model with information about the main characteristics of the contagions and deaths caused by COVID-19 in Mexico over 19 months. This work aims to show, through the approach of complex networks, how the spread of the disease behaves, so that researchers from other areas can delve into the characteristics that cause this behavior.
Both Carnap and Quine made significant contributions to the philosophy of mathematics despite their diverse views. Carnap endorsed the dichotomy between analytic and synthetic knowledge and classified certain mathematical questions as internal questions appealing to logic and convention. On the contrary, Quine opposed the analytic-synthetic distinction and promoted a holistic view of scientific inquiry. The purpose of this paper is to argue that, in light of the recent advancement of experimental mathematics such as Monte Carlo simulations, limiting mathematical inquiry to the domain of logic is unjustified. Robustness studies implemented in Monte Carlo studies demonstrate that mathematics is on a par with other experiment-based sciences.
We present the first comprehensive taxonomic revision and review the biology of the olingos, the endemic Neotropical procyonid genus Bassaricyon, based on most specimens available in museums, and with data derived from anatomy, morphometrics, mitochondrial and nuclear DNA, field observations, and geographic range modeling. Species of Bassaricyon are primarily forest-living, arboreal, nocturnal, frugivorous, and solitary, and have one young at a time. We demonstrate that four olingo species can be recognized, including a Central American species (Bassaricyon gabbii), lowland species with eastern, cis-Andean (Bassaricyon alleni) and western, trans-Andean (Bassaricyon medius) distributions, and a species endemic to cloud forests in the Andes. The oldest evolutionary divergence in the genus is between this last species, endemic to the Andes of Colombia and Ecuador, and all other species, which occur in lower elevation habitats. Surprisingly, this Andean endemic species, which we call the Olinguito, has never been previously described; it represents a new species in the order Carnivora and is the smallest living member of the family Procyonidae. We report on the biology of this new species based on information from museum specimens, niche modeling, and fieldwork in western Ecuador, and describe four Olinguito subspecies based on morphological distinctions across different regions of the Northern Andes.
In this note we review the attempts by Fernández Liria and Alegre to diagnose modernity and its outcome in populism as a phenomenon that condenses some of its aporias. Drawing on Robespierre, Hegel, and Schmitt, we discuss the authors' theses and propose an answer to the questions of Law, nation, and State, and we outline an approach to the problems of universality, identity, and multipolarity.
The neural correlates of software programming skills have been the target of an increasing number of studies in the past few years. Some of those studies focused on error-monitoring during software code inspection; others have studied task-related cognitive load as measured by distinct neurophysiological measures. Most studies addressed only syntax errors. However, a recent functional MRI study suggested a pivotal role of the insula during error-monitoring when a challenging deep-level analysis of code inspection was required. This raised the hypothesis that the insula is causally involved in deep error-monitoring. To test this hypothesis, we carried out a new fMRI study in which participants performed a deep source-code comprehension task that included error-monitoring to detect bugs in the code. The generality of our paradigm was enhanced by comparison with a variety of tasks related to text reading and bugless source-code understanding. Healthy adult programmers participated in this 3T fMRI experiment. The activation maps evoked by error-related events confirmed significant activations in the insula (p < 0.05). Importantly, a posterior-to-anterior causality shift was observed concerning the role of the insula: in the absence of errors, causal directions were mainly bottom-up, whereas in their presence, strong causal top-down effects from frontal regions, in particular the anterior cingulate cortex, were observed.
In the present paper we study the framework of additive utility theory, obtaining new results derived from a concurrence of algebraic and topological techniques. Such techniques lean on the concept of a connected topological totally ordered semigroup. We achieve a general result concerning the existence of continuous and additive utility functions on completely preordered sets endowed with a binary operation ``+'', not necessarily commutative or associative. In the final part of the paper we obtain some applications to expected utility theory, and a representation theorem for a class of complete preorders on a quite general family of real mixture spaces.
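The kind of representation result at stake can be stated compactly; the following is a sketch of the conditions involved, with notation assumed rather than taken from the paper:

```latex
% A completely preordered set (X, \precsim) endowed with a binary
% operation + admits a continuous additive utility representation
% u : X \to \mathbb{R} when, for all x, y \in X,
x \precsim y \iff u(x) \le u(y), \qquad u(x + y) = u(x) + u(y),
% with u continuous; + need not be commutative or associative.
```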
Industry 4.0 in the contemporary operating context carries important sources of complexity. This context generates both traditional and emerging risks that must be managed. The management of these risks includes both industrial and occupational risks, since they are heavily interlinked, and the human factor can be considered the main link between the two. Thus, understanding risks originating from human errors and organizational weaknesses as causes of accidents and other disruptions in complex systems requires elaborating sophisticated modeling approaches. Therefore, the objective of this paper is to propose an organizational and human performance approach to improve the management of emerging risks linked to complex systems, such as Human-Machine Interaction and Human-Robot Interaction. To fulfill this objective, we first introduce the concept of emerging risk linked to the human factor. Then, we introduce the concept of emerging risk management in the Industry 4.0 context and, within this complex context, examine the concept in light of current models of risk management. Finally, we discuss how enhancing human and organizational performance can be achieved through risk management in complex systems linked to Industry 4.0. We conclude that while Industry 4.0 brings numerous advantages, it must contend with emerging risks and challenges associated with organizational and human factors. These emerging risks include both industrial and occupational risks. Moreover, the human factor aspect of Industry 4.0 is directly linked to emerging industrial and occupational risks via the context of operations. To cope with these new challenges, it is necessary to develop new approaches. One such approach is Complex System Governance.
This approach is discussed along with the need for adequate organizational and human performance models drawing on, for example, experience from other domains such as nuclear, space, aviation, and petrochemical.
Deepening the analysis of the relation between the ego and affection, characteristic of Husserl's late studies on the field of passivity, implies abandoning the static scheme according to which sense results from the action of the ego upon inert matter. The genetic turn of phenomenology brings to light the processes of constitution of matter and their incidence on the act of sense-bestowal.
In this research we analyzed the relationship between threatening economic contexts and trust in authoritarian ideologies and leaders, regardless of the left-right political axis. Based on two theoretical approaches, we argue that this relationship is mediated by a dangerous worldview and low perceived sociopolitical control. We conducted two correlational studies with samples of the general population. In Study 1, we found that perceived threat from the economic crisis and low socioeconomic status were correlated with a stronger dangerous worldview, which resulted in a more authoritarian ideology and, finally, in greater trust in an authoritarian political leader. In Study 2, we replicated the findings of Study 1 and demonstrated that low perceived sociopolitical control was associated with higher authoritarianism. Moreover, low perceived sociopolitical control partially mediated the relationship between dangerous worldview and authoritarianism. Overall, our results show that economically threatening contexts promote authoritarianism and trust in authoritarian leaders through these psychological processes. These results are useful for understanding and combating the rise of authoritarianism in our societies during financially difficult times such as economic crises.
In 2016, Béziau introduced the notion of genuine paraconsistent logic as a logic that does not verify the principle of non-contradiction; as an important example, he presented the genuine paraconsist...