Recurrent neural networks (RNNs) are an excellent fit for regression problems where sequential data is the norm, since their recurrent internal structure can analyse and process long sequences. However, RNNs are prone to the well-known vanishing gradient problem (VGP), which causes the network to stop learning and yields poor prediction accuracy, especially on long-term dependencies. Originally, gated units such as long short-...