Extracting and learning an unknown grammar with recurrent neural networks
Recommendations
Simultaneous perturbation learning rule for recurrent neural networks and its FPGA implementation
Unlike ordinary feedforward networks, recurrent neural networks can process dynamic (time-dependent) information. However, they are generally difficult to use because no convenient learning scheme is available. In this paper, ...
A machine learning method for extracting symbolic knowledge from recurrent neural networks
Neural networks do not readily provide an explanation of the knowledge stored in their weights. Until recently, they were considered black boxes, with the knowledge stored in their weights not ...
Natural language learning by recurrent neural networks: a comparison with probabilistic approaches
NeMLaP3/CoNLL '98: Proceedings of the Joint Conferences on New Methods in Language Processing and Computational Natural Language Learning
We present preliminary results of experiments with two types of recurrent neural networks for a natural language learning task. The neural networks, Elman networks and Recurrent Cascade Correlation (RCC), were trained on the text of a first-year primary ...
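The recommended papers, like the article itself, revolve around training recurrent networks on symbol sequences and recovering symbolic structure from the trained networks. As a purely illustrative aid (not taken from any of these papers), the sketch below shows one common flavor of such extraction: run a recurrent cell over strings, quantize its continuous hidden states into a few discrete "macro-states", and read off a transition table. The cell, its random weights, and all names (`step`, `quantize`, `transitions`) are hypothetical stand-ins for a trained network.

```python
# Illustrative sketch (hypothetical, not the method of any paper listed above):
# run a recurrent cell over strings, quantize its hidden states into a small
# number of bins, and record which bin each input symbol moves the network to.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a trained Elman-style recurrent cell over a binary alphabet {0, 1}.
# In practice W_h and W_x would come from training on strings of the target grammar.
HIDDEN = 4
W_h = rng.normal(scale=0.5, size=(HIDDEN, HIDDEN))
W_x = rng.normal(scale=0.5, size=(HIDDEN, 2))

def step(h, symbol):
    """One recurrent update: new hidden state from old state and a one-hot input."""
    x = np.eye(2)[symbol]
    return np.tanh(W_h @ h + W_x @ x)

def quantize(h, bins=2):
    """Map a continuous hidden state to a discrete 'macro-state' by thresholding
    each unit into `bins` equal intervals of (-1, 1)."""
    return tuple(np.digitize(h, np.linspace(-1, 1, bins + 1)[1:-1]))

# Run the network over some strings and record the discrete transitions observed.
transitions = {}  # (macro_state, symbol) -> macro_state
strings = [[0, 1, 1], [1, 0], [0, 0, 1, 0], [1, 1, 1]]
for s in strings:
    h = np.zeros(HIDDEN)
    state = quantize(h)
    for sym in s:
        h = step(h, sym)
        nxt = quantize(h)
        transitions[(state, sym)] = nxt
        state = nxt

# The recorded table is a (partial) finite-state transition function over the
# macro-states; with a trained network and an accept/reject readout per state,
# this is the kind of symbolic description grammar-extraction methods aim for.
for (st, sym), nxt in transitions.items():
    print(f"state {st} --{sym}--> {nxt}")
```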
Published In
Publisher
Morgan Kaufmann Publishers Inc.
San Francisco, CA, United States