Results for 'Recurrent neural networks'

1000+ found
  1.
    Recurrent neural network-based models for recognizing requisite and effectuation parts in legal texts.Truong-Son Nguyen, Le-Minh Nguyen, Satoshi Tojo, Ken Satoh & Akira Shimazu - 2018 - Artificial Intelligence and Law 26 (2):169-199.
    This paper proposes several recurrent neural network-based models for recognizing requisite and effectuation (RE) parts in legal texts. First, we propose a modification of the BiLSTM-CRF model that allows the use of external features to improve the performance of deep learning models when large annotated corpora are not available. However, this model can only recognize RE parts that do not overlap. Second, we propose two approaches for recognizing overlapping RE parts, including the cascading approach which uses the sequence of (...)
    10 citations
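    For illustration only: a minimal PyTorch sketch of the BiLSTM backbone such a tagger builds on (not the authors' code; the CRF layer is omitted, and the vocabulary size, tag set, and dimensions are placeholders).

      # Minimal BiLSTM token tagger (illustrative sketch, not the paper's model).
      # A full BiLSTM-CRF would replace the per-token softmax with a CRF layer.
      import torch
      import torch.nn as nn

      class BiLSTMTagger(nn.Module):
          def __init__(self, vocab_size=5000, embed_dim=100, hidden_dim=128, num_tags=5):
              super().__init__()
              self.embed = nn.Embedding(vocab_size, embed_dim)
              self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                                  bidirectional=True)
              # Project the concatenated forward/backward states onto tag scores.
              self.out = nn.Linear(2 * hidden_dim, num_tags)

          def forward(self, token_ids):
              h, _ = self.lstm(self.embed(token_ids))  # (batch, seq, 2*hidden)
              return self.out(h)                       # per-token tag logits

      tagger = BiLSTMTagger()
      tokens = torch.randint(0, 5000, (1, 12))         # one 12-token "sentence"
      print(tagger(tokens).shape)                      # torch.Size([1, 12, 5])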
  2.
    Training Recurrent Neural Networks Using Optimization Layer-by-Layer Recursive Least Squares Algorithm for Vibration Signals System Identification and Fault Diagnostic Analysis.S. -Y. Cho, T. W. S. Chow & Y. Fang - 2001 - Journal of Intelligent Systems 11 (2):125-154.
  3.
    A Recurrent Neural Network for Attenuating Non-cognitive Components of Pupil Dynamics.Sharath Koorathota, Kaveri Thakoor, Linbi Hong, Yaoli Mao, Patrick Adelman & Paul Sajda - 2021 - Frontiers in Psychology 12.
    There is increasing interest in how the pupil dynamics of the eye reflect underlying cognitive processes and brain states. Problematic, however, is that pupil changes can be due to non-cognitive factors, for example, luminance changes in the environment, accommodation, and movement. In this paper we consider how, by modeling the response of the pupil in real-world environments, we can capture these non-cognitive changes and remove them to extract a residual signal which is a better index of cognition and performance. (...)
  4.
    Convolutional Recurrent Neural Network for Fault Diagnosis of High-Speed Train Bogie.Kaiwei Liang, Na Qin, Deqing Huang & Yuanzhe Fu - 2018 - Complexity 2018:1-13.
    Timely detection and efficient recognition of faults are challenging for the bogie of a high-speed train because different types of fault signals have similar characteristics in the same frequency range. Note that convolutional neural networks (CNNs) are powerful in extracting high-level local features and that recurrent neural networks (RNNs) are capable of learning long-term context dependencies in vibration signals. In this paper, by combining CNN and RNN, a so-called convolutional recurrent neural network (...)
    2 citations
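    For illustration only: a minimal PyTorch sketch of the CNN-plus-RNN idea the abstract describes, with Conv1d layers extracting local features from a raw 1-D vibration signal and an LSTM modeling their temporal context (layer sizes and the four fault classes are placeholders, not the paper's architecture).

      import torch
      import torch.nn as nn

      class ConvRecurrentNet(nn.Module):
          def __init__(self, n_classes=4):
              super().__init__()
              self.conv = nn.Sequential(
                  nn.Conv1d(1, 16, kernel_size=9, stride=2), nn.ReLU(),
                  nn.Conv1d(16, 32, kernel_size=9, stride=2), nn.ReLU(),
              )
              self.lstm = nn.LSTM(32, 64, batch_first=True)
              self.fc = nn.Linear(64, n_classes)

          def forward(self, signal):            # signal: (batch, 1, length)
              feats = self.conv(signal)         # (batch, 32, length')
              feats = feats.transpose(1, 2)     # (batch, length', 32)
              _, (h_n, _) = self.lstm(feats)
              return self.fc(h_n[-1])           # class logits

      net = ConvRecurrentNet()
      print(net(torch.randn(8, 1, 1024)).shape)  # torch.Size([8, 4])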
  5.
    Can Recurrent Neural Networks Validate Usage-Based Theories of Grammar Acquisition?Ludovica Pannitto & Aurelie Herbelot - 2022 - Frontiers in Psychology 13.
    It has been shown that recurrent artificial neural networks automatically acquire some grammatical knowledge in the course of performing linguistic prediction tasks. The extent to which such networks can actually learn grammar is still an object of investigation. However, being mostly data-driven, they provide a natural testbed for usage-based theories of language acquisition. This mini-review gives an overview of the state of the field, focusing on the influence of the theoretical framework in the interpretation of results.
  6.
    A Novel Recurrent Neural Network to Classify EEG Signals for Customers' Decision-Making Behavior Prediction in Brand Extension Scenario.Qingguo Ma, Manlin Wang, Linfeng Hu, Linanzi Zhang & Zhongling Hua - 2021 - Frontiers in Human Neuroscience 15.
    Predicting customers' decision-making behavior is a meaningful problem in marketing. However, due to individual differences and the complex, non-linear nature of electroencephalogram (EEG) signals, it is hard to classify EEG signals and to predict customers' decisions using traditional classification methods. To solve these problems, a recurrent t-distributed stochastic neighbor embedding neural network was proposed in the current study to classify the EEG signals in the designed brand extension paradigm and to predict the participants' (...)
    1 citation
  7.
    An efficient recurrent neural network with ensemble classifier-based weighted model for disease prediction.Ramesh Kumar Krishnamoorthy & Tamilselvi Kesavan - 2022 - Journal of Intelligent Systems 31 (1):979-991.
    Daily life has been affected globally by the coronavirus disease 2019 epidemic. With an increasing number of positive cases, India has become a highly affected country. Chronic diseases affect individuals without timely identification and impose a huge disease burden on society. In this article, an efficient recurrent neural network with an ensemble classifier is built using VGG-16 and AlexNet with a weighted model to predict disease and its level. The dataset is partitioned randomly into small subsets by utilizing mean-based splitting (...)
  8.
    A Gradient-Based Recurrent Neural Network for Visual Servoing of Robot Manipulators with Acceleration Command.Zhiguan Huang, Zhengtai Xie, Long Jin & Yuhe Li - 2020 - Complexity 2020:1-11.
    Recent decades have witnessed the rapid evolution of robotic applications and their expansion into a variety of spheres with remarkable achievements. This article studies a crucial technique for robot manipulators, referred to as visual servoing, which relies on visual feedback to respond to external information. In this regard, the visual servoing problem is transformed into a quadratic programming problem with equality and inequality constraints. Differing from traditional methods, a gradient-based recurrent neural network for solving (...)
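    For context, quadratic programs of the kind the abstract mentions generically take the form (illustrative notation, not the paper's exact formulation):

      \begin{aligned}
      \min_{\mathbf{x}} \quad & \tfrac{1}{2}\,\mathbf{x}^{\top} W \mathbf{x} + \mathbf{c}^{\top}\mathbf{x} \\
      \text{s.t.} \quad & A\mathbf{x} = \mathbf{b}, \qquad \mathbf{x}^{-} \le \mathbf{x} \le \mathbf{x}^{+},
      \end{aligned}

    where, in such schemes, the equality constraint typically encodes the visual-feedback task and the bounds encode actuator limits; a recurrent network whose state converges to the optimizer then computes the command online.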
  9.
    State estimation of memristor‐based recurrent neural networks with time‐varying delays based on passivity theory.R. Rakkiyappan, A. Chandrasekar, S. Laksmanan & Ju H. Park - 2014 - Complexity 19 (4):32-43.
    18 citations
  10.
    Perspectives and challenges for recurrent neural network training.Marco Gori, Barbara Hammer, Pascal Hitzler & Guenther Palm - 2010 - Logic Journal of the IGPL 18 (5):617-619.
  11.
    A New Recurrent Neural Network Learning Algorithm for Time Series Prediction.P. G. Madhavan - 1997 - Journal of Intelligent Systems 7 (1-2):103-116.
  12.
    Discriminatively trained continuous Hindi speech recognition using integrated acoustic features and recurrent neural network language modeling.R. K. Aggarwal & A. Kumar - 2020 - Journal of Intelligent Systems 30 (1):165-179.
    This paper implements a continuous Hindi Automatic Speech Recognition (ASR) system using the proposed integrated feature vector with Recurrent Neural Network (RNN) based Language Modeling (LM). The proposed system also implements speaker adaptation using Maximum-Likelihood Linear Regression (MLLR) and Constrained Maximum Likelihood Linear Regression (C-MLLR). This system is discriminatively trained by Maximum Mutual Information (MMI) and Minimum Phone Error (MPE) techniques with 256 Gaussian mixtures per Hidden Markov Model (HMM) state. The training of the baseline system has been (...)
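    For illustration only: a minimal PyTorch sketch of an RNN language model of the kind used to score or rescore recognition hypotheses (vocabulary and layer sizes are placeholders; a real model would first be trained on text).

      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      class RNNLM(nn.Module):
          def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128):
              super().__init__()
              self.embed = nn.Embedding(vocab_size, embed_dim)
              self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
              self.out = nn.Linear(hidden_dim, vocab_size)

          def sentence_logprob(self, ids):             # ids: (1, seq_len)
              h, _ = self.rnn(self.embed(ids[:, :-1]))
              logp = F.log_softmax(self.out(h), dim=-1)
              # Sum the log-probability assigned to each actual next token.
              return logp.gather(2, ids[:, 1:].unsqueeze(2)).sum()

      lm = RNNLM()
      hypothesis = torch.randint(0, 1000, (1, 10))     # one 10-token hypothesis
      print(lm.sentence_logprob(hypothesis).item())    # higher = judged more fluent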
  13.
    Short-term memory for serial order: A recurrent neural network model.Matthew M. Botvinick & David C. Plaut - 2006 - Psychological Review 113 (2):201-233.
  14.
    Stability analysis of memristor-based complex-valued recurrent neural networks with time delays.Rajan Rakkiyappan, Gandhi Velmurugan, Fathalla A. Rihan & Shanmugam Lakshmanan - 2016 - Complexity 21 (4):14-39.
    7 citations
  15. To transform the phenomena: Feyerabend, proliferation, and recurrent neural networks.Paul M. Churchland - 1997 - Philosophy of Science 64 (4):420.
    Paul Feyerabend recommended the methodological policy of proliferating competing theories as a means to uncovering new empirical data, and thus as a means to increase the empirical constraints that all theories must confront. Feyerabend's policy is here defended as a clear consequence of connectionist models of explanatory understanding and learning. An earlier connectionist "vindication" is criticized, and a more realistic and penetrating account is offered in terms of the computationally plastic cognitive profile displayed by neural networks with a (...)
    6 citations
  16.
    Finding event structure in time: What recurrent neural networks can tell us about event structure in mind.Forrest Davis & Gerry T. M. Altmann - 2021 - Cognition 213 (C):104651.
  17. Distribution and frequency: Modeling the effects of speaking rate on category boundaries using a recurrent neural network.Mukhlis Abu-Bakar & Nick Chater - 1994 - In Ashwin Ram & Kurt Eiselt (eds.), Proceedings of the Sixteenth Annual Conference of the Cognitive Science Society. Erlbaum.
     
  18.
    Interperforming in AI: question of ‘natural’ in machine learning and recurrent neural networks.Tolga Yalur - 2020 - AI and Society 35 (3):737-745.
    This article offers a critical inquiry into contemporary neural network models as an instance of machine learning, from an interdisciplinary perspective spanning AI studies and performativity. It shows the limits of the architecture of these network systems due to the misemployment of 'natural' performance, and, taking a performative approach, it treats 'context' as a variable rather than a constant. The article begins with a brief review of machine learning-based natural language processing systems and continues with a focus on the (...)
  19.
    Finite-Time Synchronization for Complex-Valued Recurrent Neural Networks with Time Delays.Ziye Zhang, Xiaoping Liu, Chong Lin & Bing Chen - 2018 - Complexity 2018:1-14.
    This paper focuses on the finite-time synchronization analysis for complex-valued recurrent neural networks with time delays. First, two kinds of common activation functions appearing in the existing references are combined and more general assumptions are given. To achieve our aim, a nonlinear delayed controller with two independent parameters, different from the existing ones, is provided, which leads to great difficulty. To overcome it, a newly developed inequality is used. Then, via a Lyapunov function approach, some criteria are (...)
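    For context, the delayed recurrent networks analyzed in this literature are typically of the form (a generic statement with illustrative notation, not copied from the paper):

      \dot{z}(t) = -C\,z(t) + A f(z(t)) + B f(z(t - \tau(t))) + J,

    where z(t) \in \mathbb{C}^{n} is the complex-valued state, C is a positive diagonal self-feedback matrix, A and B are the connection and delayed-connection weight matrices, f is the activation function, \tau(t) is the time-varying delay, and J is the external input. Finite-time synchronization asks for a controller that drives the error between a drive system and a response copy of this model to zero within a finite time T.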
  20.
    Improved Identification of Complex Temporal Systems with Dynamic Recurrent Neural Networks. Application to the Identification of Electromyography and Human Arm Trajectory Relationship.Jean-Philippe Draye, Guy Cheron & Marc Bourgeois - 1997 - Journal of Intelligent Systems 7 (1-2):83-102.
  21.
    Bangla hate speech detection on social media using attention-based recurrent neural network.Md Nur Hossain, Anik Paul, Abdullah Al Asif & Amit Kumar Das - 2021 - Journal of Intelligent Systems 30 (1):578-591.
    Hate speech has spread rapidly through the daily use of technology and, most notably, through people sharing negative opinions or feelings on social media. Although numerous works have been carried out on detecting hate speech in English, German, and other languages, very few have addressed the Bengali language, even though millions of people communicate on social media in Bengali. The few existing works that have been carried out need improvements (...)
  22.
    A New Type of Eye Movement Model Based on Recurrent Neural Networks for Simulating the Gaze Behavior of Human Reading.Xiaoming Wang, Xinbo Zhao & Jinchang Ren - 2019 - Complexity 2019:1-12.
  23.
    Correction to: Interperforming in AI: question of ‘natural’ in machine learning and recurrent neural networks.Tolga Yalur - 2020 - AI and Society 35 (3):775-775.
  24.
    Research on UUV Obstacle Avoiding Method Based on Recurrent Neural Networks.Changjian Lin, Hongjian Wang, Jianya Yuan, Dan Yu & Chengfeng Li - 2019 - Complexity 2019:1-16.
  25. Neurodynamic and Particle Swarm Optimization-A Recurrent Neural Network for Non-smooth Convex Programming Subject to Linear Equality and Bound Constraints.Qingshan Liu & Jun Wang - 2006 - In O. Stock & M. Schaerf (eds.), Lecture Notes in Computer Science. Springer Verlag. pp. 4233--1004.
  26. Unity, association, and dissociation of temporal consciousness in recurrent neural networks.D. Lloyd - 2000 - Consciousness and Cognition 9 (2):S17 - S18.
     
  27.
    Recurrent quantum neural network and its applications.Laxmidhar Behera, Indrani Kar & Avshalom C. Elitzur - 2006 - In J. Tuszynski (ed.), The Emerging Physics of Consciousness. Springer Verlag. pp. 327--350.
  28.
    Recurrent Convolutional Neural Networks: A Better Model of Biological Object Recognition.Courtney J. Spoerer, Patrick McClure & Nikolaus Kriegeskorte - 2017 - Frontiers in Psychology 8.
    2 citations
  29.
    Corrigendum: Recurrent Convolutional Neural Networks: A Better Model of Biological Object Recognition.Courtney J. Spoerer, Patrick McClure & Nikolaus Kriegeskorte - 2018 - Frontiers in Psychology 9.
  30.
    Learning Orthographic Structure With Sequential Generative Neural Networks.Alberto Testolin, Ivilin Stoianov, Alessandro Sperduti & Marco Zorzi - 2016 - Cognitive Science 40 (3):579-606.
    Learning the structure of event sequences is a ubiquitous problem in cognition and particularly in language. One possible solution is to learn a probabilistic generative model of sequences that allows making predictions about upcoming events. Though appealing from a neurobiological standpoint, this approach is typically not pursued in connectionist modeling. Here, we investigated a sequential version of the restricted Boltzmann machine, a stochastic recurrent neural network that extracts high-order structure from sensory data through unsupervised generative learning and can (...)
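    For illustration only: a minimal NumPy sketch of a restricted Boltzmann machine trained with one step of contrastive divergence (CD-1), the generative building block the abstract mentions; this is not the paper's sequential (temporal) variant, and all sizes are placeholders.

      import numpy as np

      rng = np.random.default_rng(0)
      n_vis, n_hid, lr = 6, 4, 0.1
      W = 0.01 * rng.standard_normal((n_vis, n_hid))
      a = np.zeros(n_vis)                        # visible biases
      b = np.zeros(n_hid)                        # hidden biases

      def sigmoid(x):
          return 1.0 / (1.0 + np.exp(-x))

      def cd1_update(v0):
          global W, a, b
          ph0 = sigmoid(v0 @ W + b)              # P(h = 1 | v0)
          h0 = (rng.random(n_hid) < ph0) * 1.0   # sample a hidden state
          pv1 = sigmoid(h0 @ W.T + a)            # reconstruction P(v = 1 | h0)
          ph1 = sigmoid(pv1 @ W + b)
          W += lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
          a += lr * (v0 - pv1)
          b += lr * (ph0 - ph1)

      data = (rng.random((200, n_vis)) < 0.3) * 1.0   # toy binary "sensory" data
      for epoch in range(20):
          for v in data:
              cd1_update(v)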
  31.
    Phenomenology, dynamical neural networks and brain function.Donald Borrett, Sean D. Kelly & Hon Kwan - 2000 - Philosophical Psychology 13 (2):213-228.
    Current cognitive science models of perception and action assume that the objects that we move toward and perceive are represented as determinate in our experience of them. A proper phenomenology of perception and action, however, shows that we experience objects indeterminately when we are perceiving them or moving toward them. This indeterminacy, as it relates to simple movement and perception, is captured in the proposed phenomenologically based recurrent network models of brain function. These models provide a possible foundation from (...)
    3 citations
  32.
    Multi-channel Convolutional Neural Network Feature Extraction for Session Based Recommendation.Zhenyan Ji, Mengdan Wu, Yumin Feng & José Enrique Armendáriz Íñigo - 2021 - Complexity 2021:1-10.
    A session-based recommendation system is designed to predict the user's next click based on an ongoing session. Existing session-based recommendation systems usually model a session as a sequence and extract sequence features through a recurrent neural network. Although performance is greatly improved, these procedures ignore the relationships between items, which contain rich information. In order to obtain rich item embeddings, we propose a novel Recommendation Model based on Multi-channel Convolutional Neural Network for session-based recommendation, RMMCNN for (...)
  33.
    Tracking Child Language Development With Neural Network Language Models.Kenji Sagae - 2021 - Frontiers in Psychology 12.
    Recent work on the application of neural networks to language modeling has shown that models based on certain neural architectures can capture syntactic information from utterances and sentences even when not given an explicitly syntactic objective. We examine whether a fully data-driven model of language development that uses a recurrent neural network encoder for utterances can track how child language utterances change over the course of language development in a way that is comparable to what (...)
    1 citation
  34.
    Estimation and application of matrix eigenvalues based on deep neural network.Zhiying Hu - 2022 - Journal of Intelligent Systems 31 (1):1246-1261.
    In today's era of rapid scientific and technological development, digital technology places increasingly high demands on data processing. Matrix signals commonly used in engineering applications likewise demand higher processing speed. The eigenvalues of a matrix represent many of its characteristics: mathematically, they describe how the corresponding eigenvectors are scaled, and physically, they represent the spectrum of a vibration. The eigenvalue of a matrix is a focus of matrix theory. The (...)
  35.
    Emotion Analysis of Ideological and Political Education Using a GRU Deep Neural Network.Shoucheng Shen & Jinling Fan - 2022 - Frontiers in Psychology 13.
    Theoretical research into the emotional attributes of ideological and political education can improve our ability to understand human emotion and solve socio-emotional problems. To that end, this study analyzed emotion in ideological and political education by integrating a gated recurrent unit (GRU) with an attention mechanism. Based on the good results achieved by BERT in the downstream network, we use a long-focusing attention mechanism assisted by a bidirectional GRU to extract relevant information and global information of ideological (...)
    1 citation
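    For illustration only: a minimal PyTorch sketch of a bidirectional GRU with soft attention pooling for text classification (dimensions, vocabulary, and the three emotion classes are placeholders, not the authors' BERT-based configuration).

      import torch
      import torch.nn as nn

      class AttentiveGRU(nn.Module):
          def __init__(self, vocab_size=8000, embed_dim=100, hidden_dim=64, n_classes=3):
              super().__init__()
              self.embed = nn.Embedding(vocab_size, embed_dim)
              self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                                bidirectional=True)
              self.attn = nn.Linear(2 * hidden_dim, 1)   # scores each time step
              self.fc = nn.Linear(2 * hidden_dim, n_classes)

          def forward(self, ids):
              h, _ = self.gru(self.embed(ids))           # (batch, seq, 2*hidden)
              w = torch.softmax(self.attn(h), dim=1)     # attention weights over time
              context = (w * h).sum(dim=1)               # weighted sum of states
              return self.fc(context)

      model = AttentiveGRU()
      print(model(torch.randint(0, 8000, (4, 20))).shape)  # torch.Size([4, 3])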
  36.
    On bifurcations and chaos in random neural networks.B. Doyon, B. Cessac, M. Quoy & M. Samuelides - 1994 - Acta Biotheoretica 42 (2-3):215-225.
    Chaos in the nervous system is a fascinating but controversial field of investigation. To approach the role of chaos in the real brain, we theoretically and numerically investigate the occurrence of chaos in artificial neural networks. Most of the time, recurrent networks (with feedback) are fully connected. Since this architecture is not biologically plausible, the occurrence of chaos is studied here for a randomly diluted architecture. By normalizing the variance of synaptic weights, we produce a bifurcation parameter, dependent on (...)
    1 citation
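    For illustration only: a NumPy sketch of the kind of experiment the abstract describes, simulating a randomly diluted recurrent network in which the normalized synaptic-weight variance (the gain g) acts as a bifurcation parameter; all values are placeholders.

      import numpy as np

      rng = np.random.default_rng(1)
      N, dilution, dt, g = 200, 0.8, 0.05, 1.5    # try g = 0.5 (quiescent) vs. g = 1.5 (chaotic)
      mask = rng.random((N, N)) > dilution         # randomly diluted connectivity
      k = mask.sum(axis=1, keepdims=True).clip(min=1)
      J = g * rng.standard_normal((N, N)) * mask / np.sqrt(k)  # variance normalized by in-degree

      x = 0.1 * rng.standard_normal(N)
      for _ in range(4000):                        # Euler steps of dx/dt = -x + J tanh(x)
          x = x + dt * (-x + J @ np.tanh(x))

      print(np.std(x))  # decays toward 0 for small g; stays O(1) and fluctuates for large g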
  37.
    Advantages of Combining Factorization Machine with Elman Neural Network for Volatility Forecasting of Stock Market.Fang Wang, Sai Tang & Menggang Li - 2021 - Complexity 2021:1-12.
    With a focus on the financial market, stock market dynamics forecasting has received much attention. Predicting stock market fluctuations is usually challenging due to the nonlinear and nonstationary time series of stock prices. The Elman recurrent network is renowned for its capability of dealing with dynamic information, which has made it successful in prediction applications. We developed a hybrid approach combining the Elman recurrent network with the factorization machine technique, i.e., the FM-Elman neural network, to predict stock (...)
  38.
    Recurrent Fuzzy-Neural MIMO Channel Modeling.Abhijit Mitra & Kandarpa Kumar Sarma - 2012 - Journal of Intelligent Systems 21 (2):121-142.
    Fuzzy systems and artificial neural networks (ANNs), as important components of soft computing, can be applied together to model uncertainty. A composite block of a fuzzy system and an ANN shares a mutually beneficial association, resulting in enhanced performance with smaller networks. This makes them suitable for time-varying multi-input multi-output channel modeling, enabling such a system to track minute variations in propagation conditions. Here we propose a fuzzy neural system using a fuzzy time-delay fully (...)
  39.
    Learning Representations of Wordforms With Recurrent Networks: Comment on Sibley, Kello, Plaut, & Elman (2008).Jeffrey S. Bowers & Colin J. Davis - 2009 - Cognitive Science 33 (7):1183-1186.
    Sibley et al. (2008) report a recurrent neural network model designed to learn wordform representations suitable for written and spoken word identification. The authors claim that their sequence encoder network overcomes a key limitation associated with models that code letters by position (e.g., CAT might be coded as C‐in‐position‐1, A‐in‐position‐2, T‐in‐position‐3). The problem with coding letters by position (slot‐coding) is that it is difficult to generalize knowledge across positions; for example, the overlap between CAT and TOMCAT is lost. (...)
    4 citations
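    For illustration, the slot-coding problem the comment describes can be stated in a few lines of Python: coding letters by absolute position leaves CAT and TOMCAT with no shared features, even though one contains the other.

      def slot_code(word):
          # One feature per (absolute position, letter) pair, e.g. C-in-position-1.
          return {(i, ch) for i, ch in enumerate(word)}

      cat, tomcat = slot_code("CAT"), slot_code("TOMCAT")
      print(cat & tomcat)   # set() -- the overlap between CAT and TOMCAT is lost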
  40.
    Deep Recurrent Model for Server Load and Performance Prediction in Data Center.Zheng Huang, Jiajun Peng, Huijuan Lian, Jie Guo & Weidong Qiu - 2017 - Complexity:1-10.
    Recurrent neural networks (RNNs) have been widely applied to sequential tagging tasks such as natural language processing and time series analysis, and they have been proved to work well in those areas. In this paper, we propose using an RNN with long short-term memory units for server load and performance prediction. Classical methods for performance prediction focus on building a relation between performance and the time domain, which relies on many unrealistic hypotheses. Our model is built based on events, (...)
    1 citation
  41. Recurrent networks: learning algorithms.Kenji Doya - 2002 - In The Handbook of Brain Theory and Neural Networks. pp. 955--960.
  42.
    Through Neural Stimulation to Behavior Manipulation: A Novel Method for Analyzing Dynamical Cognitive Models.Thomas Hope, Ivilin Stoianov & Marco Zorzi - 2010 - Cognitive Science 34 (3):406-433.
    3 citations
  43. Supervised learning in recurrent networks.Kenji Doya - 1995 - In Michael A. Arbib (ed.), Handbook of Brain Theory and Neural Networks. MIT Press.
  44.
    Internal Recurrence.Don Ross - 1998 - Dialogue 37 (1):155-162.
    Paul Churchland does not open his latest book,The Engine of Reason, the Seat of the Soul, modestly. He begins by announcing, “This book is about you. And me … More broadly still, it is about every creature that ever swam, or walked, or flew over the face of the Earth” (p. 3). A few sentences later, he says, “Fortunately, recent research into neural networks … has produced the beginnings of a real understanding of how the biological brain works—a (...)
  45.
    Neural Machine Translation System for English to Indian Language Translation Using MTIL Parallel Corpus.K. P. Soman, M. Anand Kumar & B. Premjith - 2019 - Journal of Intelligent Systems 28 (3):387-398.
    The introduction of deep neural networks to machine translation research improved conventional machine translation systems in multiple ways, specifically in terms of translation quality. The ability of deep neural networks to learn sensible representations of words is one of the major reasons for this improvement. Although machine translation using deep neural architectures shows state-of-the-art results in translating European languages, we cannot directly apply these algorithms to Indian languages, mainly for two reasons: unavailability (...)
    1 citation
  46.
    The Dynamics of Neural Populations Capture the Laws of the Mind.Gregor Schöner - 2020 - Topics in Cognitive Science 12 (4):1257-1271.
    This paper focuses on the level of neural networks. Examining the case of recurrent neural networks, the paper argues that the dynamics of neural populations form a privileged level of explanation in cognitive science. According to Schöner, this level is privileged because it enables cognitive scientists to discover the laws governing organisms' cognition and behaviour.
    2 citations
  47.
    Explaining Neural Transitions through Resource Constraints.Colin Klein - 2022 - Philosophy of Science 89 (5):1196-1202.
    One challenge in explaining neural evolution is the formal equivalence of different computational architectures. If a simple architecture suffices, why should more complex neural architectures evolve? The answer must involve the intense competition for resources under which brains operate. I show how recurrent neural networks can be favored when increased complexity allows for more efficient use of existing resources. Although resource constraints alone can drive a change, recurrence shifts the landscape of what is later evolvable. (...)
  48.
    Internal recurrence.Don Ross - 1998 - Dialogue 37 (1):155-161.
    It is crucial, first of all, to stress the importance Churchland attaches to the idea that the neural networks whose assemblages he holds to be “engines of reason” must be recurrent. Non-recurrent networks, of the sort best known among philosophers, simply discover patterns in input data presented to them as sets of features. The learning capacities of such networks, extensively discussed since the publication of Rumelhart and McClelland et al., are indeed impressive; and Churchland (...)
  49.
    Deep Echo State Network with Variable Memory Pattern for Solar Irradiance Prediction.Qian Li, Tao Li, Jiangang Ouyang, Dayong Yang & Zhijun Guo - 2022 - Complexity 2022:1-11.
    Accurate solar irradiance prediction plays an important role in ensuring the security and stability of renewable energy systems. Solar irradiance modeling usually requires a time-dependent dynamic model. As a new kind of recurrent neural network, the echo state network (ESN) shows excellent performance in time series prediction. However, the memory length of the classical ESN is fixed and finite, which makes it hard to capture features of solar irradiance with long-range dependencies. Therefore, a novel deep echo state (...)
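    For illustration only: a minimal NumPy echo state network, in which only the linear readout is trained (here by ridge regression on a toy one-step-ahead prediction task); reservoir size, spectral radius, and the input series are placeholders, not the paper's configuration.

      import numpy as np

      rng = np.random.default_rng(2)
      n_in, n_res, ridge = 1, 300, 1e-6

      W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
      W = rng.standard_normal((n_res, n_res))
      W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale to spectral radius 0.9

      u = np.sin(np.arange(2000) * 0.1)[:, None]       # toy input series
      states = np.zeros((len(u), n_res))
      x = np.zeros(n_res)
      for t in range(len(u) - 1):
          x = np.tanh(W_in @ u[t] + W @ x)             # reservoir update
          states[t] = x

      # Train only the linear readout, by ridge regression, to predict u[t+1].
      X, Y = states[100:-1], u[101:]                   # drop the warm-up transient
      W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
      print(np.mean((X @ W_out - Y) ** 2))             # small one-step training error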
  50. Uncertainty Reduction as a Measure of Cognitive Load in Sentence Comprehension.Stefan L. Frank - 2013 - Topics in Cognitive Science 5 (3):475-494.
    The entropy-reduction hypothesis claims that the cognitive processing difficulty on a word in sentence context is determined by the word's effect on the uncertainty about the sentence. Here, this hypothesis is tested more thoroughly than has been done before, using a recurrent neural network for estimating entropy and self-paced reading for obtaining measures of cognitive processing load. Results show a positive relation between reading time on a word and the reduction in entropy due to processing that word, supporting (...)
    11 citations
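    As a gloss on the hypothesis (illustrative notation, not quoted from the paper): if H_t denotes the uncertainty about the sentence after word w_t,

      H_t = -\sum_{s} P(s \mid w_1, \dots, w_t)\,\log_2 P(s \mid w_1, \dots, w_t),

    then the entropy-reduction hypothesis predicts that reading time on w_t grows with the reduction \Delta H_t = H_{t-1} - H_t, with the probabilities estimated here by a recurrent neural network.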
Showing 1-50 of 1000+ results.