Recurrent neural networks (RNNs) have been successfully applied to a variety of problems involving sequential data, but their optimization is sensitive to parameter initialization, architecture, and optimizer hyperparameters. Considering RNNs as dynamical systems, a natural way to capture stability, i.e., the growth and decay over long iterates, is through the Lyapunov Exponents (LEs), which form a spectrum. The LEs be...
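
As a point of reference for what the LE spectrum measures, here is a minimal sketch of the standard definition, stated for a generic RNN hidden-state update; the map $F$, Jacobians $J_t$, and singular values $\sigma_i$ are notation introduced here for illustration and need not match the paper's. Writing the recurrence as $h_{t+1} = F(h_t, x_{t+1})$, with Jacobians $J_t = \partial h_{t+1} / \partial h_t$ evaluated along a trajectory, the $i$-th Lyapunov exponent is the asymptotic exponential growth rate

\[
\lambda_i \;=\; \lim_{T \to \infty} \frac{1}{T} \log \sigma_i\!\left( J_T J_{T-1} \cdots J_1 \right),
\]

where $\sigma_i(\cdot)$ denotes the $i$-th singular value of the accumulated Jacobian product. A positive $\lambda_i$ indicates exponential growth of perturbations along the corresponding direction (instability), while a negative one indicates decay; the ordered collection $\lambda_1 \ge \lambda_2 \ge \dots$ is the spectrum referred to above.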