In this paper, we integrate additional linguistic information into an RNNLM, yielding a factored RNNLM that can further improve the generalization of the RNNLM. This study extends the RNNLM by explicitly integrating additional linguistic information, including morphological, syntactic, or semantic factors.
Abstract. Among various neural network language models (NNLMs), recurrent neural network-based language models (RNNLMs) are very competitive in many cases.
In this study, we extend recurrent neural network-based language models (RNNLMs) by explicitly integrating morphological and syntactic factors (or features).
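The extension described above can be sketched minimally: at each time step, the input to the recurrent layer is the concatenation of the word embedding with embeddings of the word's linguistic factors (e.g. a POS tag). This is a hypothetical illustration, not the paper's implementation; all sizes, weights, and the toy sequence below are invented assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper).
V_WORD, V_POS = 10, 5          # vocabulary sizes: words and POS factors
D_WORD, D_POS, H = 8, 4, 16    # embedding and hidden dimensions

E_word = rng.normal(size=(V_WORD, D_WORD))   # word embedding table
E_pos = rng.normal(size=(V_POS, D_POS))      # POS-factor embedding table
W_xh = rng.normal(size=(D_WORD + D_POS, H))  # input-to-hidden weights
W_hh = rng.normal(size=(H, H))               # recurrent weights
W_hy = rng.normal(size=(H, V_WORD))          # hidden-to-output weights

def step(word_id, pos_id, h):
    """One Elman-RNN step over a factored input (word + POS factor)."""
    # Factored input: word embedding concatenated with factor embedding.
    x = np.concatenate([E_word[word_id], E_pos[pos_id]])
    h = np.tanh(x @ W_xh + h @ W_hh)          # recurrent hidden state
    logits = h @ W_hy
    p = np.exp(logits - logits.max())
    return h, p / p.sum()                     # next-word distribution

h = np.zeros(H)
for w, f in [(1, 2), (3, 0), (7, 4)]:         # toy (word, factor) sequence
    h, probs = step(w, f, h)
```

The only change relative to a plain RNNLM is the concatenation in `step`: the factor embeddings give the hidden layer direct access to morphological or syntactic information about the current word.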
Factored Language Model based on Recurrent Neural Network. COLING 2012 · Youzheng Wu, Xugang Lu, Hitoshi Yamamoto, Shigeki Matsuda, Chiori Hori, Hideki Kashioka.
When we interpolate the models linearly, we reduce the perplexity by 15.6% relative on the SEAME evaluation set. This is even slightly better than the result of ...
Apr 27, 2018 · This paper presents a recurrent neural network language model based on the tokenization of words into three parts: the prefix, the stem, and the suffix.
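A prefix/stem/suffix tokenization of this kind can be sketched with a simple rule-based splitter. The affix lists below are invented for illustration only; a real system would derive its sub-word units from a morphological analyzer or from data.

```python
# Hypothetical affix inventories (assumptions for illustration).
PREFIXES = ("un", "re", "pre")
SUFFIXES = ("ing", "ness", "ed", "s")

def split_word(word):
    """Split a word into (prefix, stem, suffix); empty string if absent.

    Strips at most one known prefix and one known suffix, requiring
    that a non-trivial stem (more than two characters) remains.
    """
    prefix = next(
        (p for p in PREFIXES
         if word.startswith(p) and len(word) > len(p) + 2), "")
    rest = word[len(prefix):]
    suffix = next(
        (s for s in SUFFIXES
         if rest.endswith(s) and len(rest) > len(s) + 2), "")
    stem = rest[: len(rest) - len(suffix)] if suffix else rest
    return prefix, stem, suffix

print(split_word("unhappiness"))  # ('un', 'happi', 'ness')
```

Each word then contributes up to three tokens to the language model's input sequence, which shrinks the vocabulary and lets the model share statistics across morphological variants.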
In this paper, we investigate the application of recurrent neural network language models (RNNLM) and factored language models (FLM) to the task of ...