Learning Long-Term Dependencies Is Not as Difficult with NARX Recurrent Neural Networks

Authors

  • Tsungnan Lin
  • Bill G. Horne
Abstract

It has recently been shown that gradient descent learning algorithms for recurrent neural networks can perform poorly on tasks that involve long-term dependencies, i.e. those problems for which the desired output depends on inputs presented at times far in the past. In this paper we explore the long-term dependencies problem for a class of architectures called NARX recurrent neural networks, which have powerful representational capabilities. We have previously reported that gradient descent learning is more effective in NARX networks than in recurrent neural network architectures that have "hidden states" on problems including grammatical inference and nonlinear system identification. Typically, the network converges much faster and generalizes better than other networks. The results in this paper are an attempt to explain this phenomenon. We present some experimental results which show that NARX networks can often retain information for two to three times as long as conventional recurrent neural networks. We show that although NARX networks do not circumvent the problem of long-term dependencies, they can greatly improve performance on long-term dependency problems. We also describe in detail some of the assumptions regarding what it means to latch information robustly and suggest possible ways to loosen these assumptions.
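To make the architecture under discussion concrete, here is a minimal sketch of a NARX forward pass, with tapped delay lines on both the exogenous input and the fed-back output driving a small one-hidden-layer MLP. The delay orders, layer sizes, and initialization below are illustrative assumptions, not values taken from the paper.

import numpy as np

# Minimal NARX sketch: delay lines on the input u and the fed-back output y
# are concatenated and passed through a one-hidden-layer MLP.
rng = np.random.default_rng(0)
D_u, D_y, n_hidden = 3, 3, 8               # input/output delay orders, hidden units (illustrative)
n_in = (D_u + 1) + D_y                     # u(t), ..., u(t-D_u) and y(t-1), ..., y(t-D_y)
W1 = 0.1 * rng.standard_normal((n_hidden, n_in))
b1 = np.zeros(n_hidden)
W2 = 0.1 * rng.standard_normal(n_hidden)

def narx_step(u_taps, y_taps):
    """One time step: concatenate both delay lines and apply the MLP."""
    x = np.concatenate([u_taps, y_taps])
    return W2 @ np.tanh(W1 @ x + b1)

u = rng.standard_normal(50)                # an arbitrary exogenous input sequence
u_taps, y_taps = np.zeros(D_u + 1), np.zeros(D_y)
outputs = []
for t in range(len(u)):
    u_taps = np.concatenate(([u[t]], u_taps[:-1]))   # shift the input delay line
    y_t = narx_step(u_taps, y_taps)
    y_taps = np.concatenate(([y_t], y_taps[:-1]))    # feed the output back through its delay line
    outputs.append(y_t)

The feedback taps are the point of the architecture: past outputs re-enter the map directly, and this output memory is what the abstract credits with making gradient descent more effective on long-term dependency problems.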


Similar articles

How embedded memory in recurrent neural network architectures helps learning long-term temporal dependencies

Learning long-term temporal dependencies with recurrent neural networks can be a difficult problem. It has recently been shown that a class of recurrent neural networks called NARX networks perform much better than conventional recurrent neural networks for learning certain simple long-term dependency problems. The intuitive explanation for this behavior is that the output memories of a NARX ne...
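The truncated intuition above is usually spelled out as a path-length argument; the following is a generic version of it with illustrative symbols (D for the output delay order, Psi for the network map), not this snippet's own notation. In a conventional recurrent network with state update y(t) = f(y(t-1), u(t)), the sensitivity of the current output to the state tau steps back is a product of tau Jacobians,

\[ \frac{\partial y(t)}{\partial y(t-\tau)} \;=\; \prod_{k=0}^{\tau-1} \frac{\partial y(t-k)}{\partial y(t-k-1)}, \]

which typically shrinks toward zero as tau grows. In a NARX network,

\[ y(t) \;=\; \Psi\bigl(u(t), \ldots, u(t-D_u),\; y(t-1), \ldots, y(t-D)\bigr), \]

outputs delayed by up to D steps feed straight back into the map, so the unfolded network contains a path from y(t-tau) to y(t) with only about \(\lceil \tau / D \rceil\) Jacobian factors instead of tau. The output delays act like jump-ahead connections: they do not eliminate vanishing gradients, but they stretch the horizon over which a usable gradient survives, consistent with the two-to-three-fold retention improvement reported in the abstract above.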


Learning long-term dependencies in NARX recurrent neural networks

It has previously been shown that gradient-descent learning algorithms for recurrent neural networks can perform poorly on tasks that involve long-term dependencies, i.e. those problems for which the desired output depends on inputs presented at times far in the past. We show that the long-term dependencies problem is lessened for a class of architectures called nonlinear autoregressive models ...


Learning long-term dependencies is not as difficult with NARX networks

Bill G. Horne, NEC Research Institute, 4 Independence Way, Princeton, NJ 08540; C. Lee Giles, NEC Research Institute, 4 Independence Way, Princeton, NJ 08540. It has recently been shown that gradient descent learning algorithms for recurrent neural networks can perform poorly on tasks that involve long-term dependencies. In this paper we explore this problem for a class of architectures called NARX n...


Learning Long-Term Dependencies in NARX Recurrent Neural Networks (IEEE Transactions on Neural Networks)

It has recently been shown that gradient-descent learning algorithms for recurrent neural networks can perform poorly on tasks that involve long-term dependencies, i.e., those problems for which the desired output depends on inputs presented at times far in the past. We show that the long-term dependencies problem is lessened for a class of architectures called Nonlinear AutoRegressive models w...




Journal:

Volume   Issue

Pages  -

Publication year: 1996