
Recurrent Highway Networks



Research area: Natural Language Processing   arXiv 2017

Application area:

Method/principle:

Software implementation:

Paper abstract:

Many sequential processing tasks require complex nonlinear transition functions from one step to the next. However, recurrent neural networks with 'deep' transition functions remain difficult to train, even when using Long Short-Term Memory (LSTM) networks. We introduce a novel theoretical analysis of recurrent networks based on Geršgorin's circle theorem that illuminates several modeling and optimization issues and improves our understanding of the LSTM cell. Based on this analysis we propose Recurrent Highway Networks, which extend the LSTM architecture to allow step-to-step transition depths larger than one. Several language modeling experiments demonstrate that the proposed architecture results in powerful and efficient models. On the Penn Treebank corpus, solely increasing the transition depth from 1 to 10 improves word-level perplexity from 90.6 to 65.4 using the same number of parameters. On the larger Wikipedia datasets for character prediction (text8 and enwik8), RHNs outperform all previous results and achieve an entropy of 1.27 bits per character.
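The abstract's core idea, letting the hidden state pass through several stacked highway layers within a single time step, can be sketched in a few lines. Below is a minimal NumPy sketch, assuming the coupled-gate variant from the paper (carry gate = 1 − transform gate, with the external input feeding only the first layer); the weight names (W_H, W_T, R_H, R_T) and sizes are illustrative, not taken from the authors' implementation.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rhn_step(x, s, layers):
    # One RHN time step: the state s is refined by len(layers) highway
    # layers (the "transition depth"). Coupled-gate variant: the carry
    # gate is 1 - t. `layers` holds illustrative per-layer parameters
    # (W_H, W_T, R_H, R_T, b_H, b_T); these names are assumptions.
    for l, (W_H, W_T, R_H, R_T, b_H, b_T) in enumerate(layers):
        pre_h = R_H @ s + b_H
        pre_t = R_T @ s + b_T
        if l == 0:  # the external input x enters only the first layer
            pre_h += W_H @ x
            pre_t += W_T @ x
        h = np.tanh(pre_h)       # candidate state
        t = sigmoid(pre_t)       # transform gate
        s = h * t + s * (1 - t)  # highway update: carry what t leaves
    return s

# Toy usage with random weights: depth-3 transition, hidden size 8.
rng = np.random.default_rng(0)
n_in, n_hid, depth = 4, 8, 3
layers = [tuple(rng.normal(scale=0.1, size=shape)
                for shape in [(n_hid, n_in), (n_hid, n_in),
                              (n_hid, n_hid), (n_hid, n_hid),
                              (n_hid,), (n_hid,)])
          for _ in range(depth)]
s = np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):  # a 5-step input sequence
    s = rhn_step(x, s, layers)
print(s.shape)  # -> (8,)

Increasing `depth` deepens the per-step transition without adding recurrent connections across time, which is exactly the knob the abstract reports varying from 1 to 10 on Penn Treebank.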

Paper highlights:

Paper review

In the comments you can contribute an "abstract translation", "tag annotations", "key-point review", or "questions", and we will update the database accordingly!

Every paper sits "within a specific field", "builds on some academic principle", and "studies some application problem"; tags are therefore divided into field tags / application tags / principle tags / supplementary tags.