Incremental Adaptation Strategies for Neural Network Language Models

Authors

  • Aram Ter-Sarkisov
  • Holger Schwenk
  • Fethi Bougares
  • Loïc Barrault
Abstract

It is today acknowledged that neural network language models outperform backoff language models in applications like speech recognition or statistical machine translation. However, training these models on large amounts of data can take several days. We present efficient techniques to adapt a neural network language model to new data. Instead of training a completely new model or relying on mixture approaches, we propose two new methods: continued training on resampled data or insertion of adaptation layers. We present experimental results in a CAT environment where the post-edits of professional translators are used to improve an SMT system. Both methods are very fast and achieve significant improvements without over-fitting the small adaptation data.
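The adaptation-layer idea from the abstract can be illustrated with a minimal numpy sketch (not the paper's implementation; all shapes and names are illustrative). A new linear layer is inserted between two existing layers and initialised to the identity, so the adapted model starts out exactly equivalent to the original; during adaptation only this layer would be updated, leaving the base weights frozen:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frozen base model: one hidden layer (shapes are illustrative).
W1 = rng.normal(size=(8, 16))   # input -> hidden
W2 = rng.normal(size=(16, 4))   # hidden -> output

def base_forward(x):
    h = np.tanh(x @ W1)
    return h @ W2

# Adaptation layer inserted between hidden and output layers,
# initialised to the identity matrix: before any adaptation
# updates, the adapted model reproduces the base model exactly.
A = np.eye(16)

def adapted_forward(x):
    h = np.tanh(x @ W1)
    h = h @ A            # only A would be trained on the adaptation data
    return h @ W2

x = rng.normal(size=(3, 8))
# Identity initialisation: outputs match before adaptation begins.
assert np.allclose(base_forward(x), adapted_forward(x))
```

Because only the small inserted matrix is trained, adaptation touches far fewer parameters than full retraining, which is consistent with the abstract's claim of fast adaptation without over-fitting the small in-domain data.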


Related articles

Predicting Force in Single Point Incremental Forming by Using Artificial Neural Network

In this study, an artificial neural network was used to predict the minimum force required for single point incremental forming (SPIF) of thin sheets of Aluminium AA3003-O and calamine brass Cu67Zn33 alloy. Accordingly, the processing parameters, i.e., step depth, the feed rate of the tool, spindle speed, wall angle, thickness of metal sheets and type of material were selected as input and t...


Comparison of scheduling methods for the learning rate of neural network language models (Modèles de langue neuronaux: une comparaison de plusieurs stratégies d'apprentissage) [in French]

While neural networks play an increasingly important role in natural language processing, training issues still hinder their dissemination in the community. This paper studies different learning strategies for neural language models (including two new strategies), focusing on the adaptation of the learning rate. Experimental results show the impact of the design of such a strategy. Moreover, provide...


Incremental adaptive networks implemented by free space optical (FSO) communication

The aim of this paper is to fully analyze the effects of free space optical (FSO) communication links on the estimation performance of adaptive incremental networks. The FSO links in this paper are described by two turbulence models, namely the Log-normal and Gamma-Gamma distributions. In order to investigate the impact of these models, we produced the link coefficients using these distribu...


A Comparative Study on Regularization Strategies for Embedding-based Neural Networks

This paper aims to compare different regularization strategies to address a common phenomenon, severe overfitting, in embedding-based neural networks for NLP. We chose two widely studied neural models and tasks as our testbed. We tried several frequently applied or newly proposed regularization strategies, including penalizing weights (embeddings excluded), penalizing embeddings, reembedding wo...


Cooling Growing Grid: an incremental self-organizing neural network for data exploration

Fundamental self-organizing artificial neural networks, both static (with predefined number of neurons) and incremental, are presented and goals of competitive learning are enumerated. A novel incremental self-organizing ANN Cooling Growing Grid (CGG) is proposed, which combines the advantages of static and incremental approaches and overcomes their main drawbacks. The estimation of growth dire...




Journal:
  • CoRR

Volume abs/1412.6650  Issue 

Pages  -

Publication date 2014