CS224D Final Report: Deep Recurrent Attention Networks for LaTeX to Source

Authors

  • Keegan Go
  • Kenji Hata
Abstract

For our project, we explore the problem of recognizing LaTeX expressions and translating them to source using a deep neural network. Previous work involving attention models has improved sequence-to-sequence mappings and greatly helped digit recognition. Inspired by this previous work, we implemented an attention model to recognize simple LaTeX expressions and also tested it on a small subset of CROHME, a dataset of handwritten mathematical expressions. We thoroughly explain our network, training procedure, hyperparameter tuning, and results. We achieve a classification accuracy of 63.4% on an automatically generated dataset and an accuracy of XXXXXX on CROHME.
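To make the attention idea in the abstract concrete, the following is a minimal sketch of a single soft-attention step of the kind used in such sequence-to-sequence recognizers. It is not the authors' actual network: it assumes Bahdanau-style additive attention over a grid of encoder features extracted from the expression image, and all names and dimensions (attention_step, W_f, W_h, v, feature_dim) are illustrative.

# Minimal sketch (not the authors' implementation) of one soft-attention step,
# assuming Bahdanau-style additive attention over a grid of encoder features.
# All names and dimensions here are illustrative placeholders.
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    z = x - x.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attention_step(features, h_prev, W_f, W_h, v):
    """One decoding step: score each feature location against the previous
    decoder state, normalize the scores, and return the context vector.

    features : (num_locations, feature_dim) encoder features of the image
    h_prev   : (hidden_dim,) previous decoder hidden state
    """
    # Additive attention scores: e_i = v^T tanh(W_f f_i + W_h h_prev)
    scores = np.tanh(features @ W_f + h_prev @ W_h) @ v   # (num_locations,)
    alpha = softmax(scores)                               # attention weights
    context = alpha @ features                            # (feature_dim,)
    return context, alpha

# Toy usage with random values, only to show the shapes involved.
rng = np.random.default_rng(0)
num_locations, feature_dim, hidden_dim, attn_dim = 36, 64, 128, 32
features = rng.standard_normal((num_locations, feature_dim))
h_prev = rng.standard_normal(hidden_dim)
W_f = rng.standard_normal((feature_dim, attn_dim))
W_h = rng.standard_normal((hidden_dim, attn_dim))
v = rng.standard_normal(attn_dim)
context, alpha = attention_step(features, h_prev, W_f, W_h, v)
print(context.shape, alpha.sum())  # (64,) 1.0

In a full decoder of this kind, the context vector would typically be combined with the embedding of the previously emitted token and fed to a recurrent cell that predicts the next LaTeX token.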


Similar resources

CS224D Project Final Report Summarizing Reviews and Predicting Rating for Yelp Dataset

The report explores the use of Recurrent Neural Networks (RNN) and Convolutional Neural Networks (CNN) in summarising text reviews and predicting review rating for the Yelp dataset. I use the fact that the reviews are labelled (by rating) to extract important sentences/words, which are then used as the summary for the review. I use an interesting evaluation technique to measure the relevance of...


CS224d Project Final Report

We develop a Recurrent Neural Network (RNN) Language Model to extract sentences from Yelp Review Data for the purpose of automatic summarization. We compare these extracted sentences against user-generated tips in the Yelp Academic Dataset using ROUGE and BLEU metrics for summarization evaluation. The performance of a uni-directional RNN is compared against word-vector averaging.


Extensions to Tree-Recursive Neural Networks for Natural Language Inference

Understanding textual entailment and contradiction is considered fundamental to natural language understanding. Tree-recursive neural networks, which exploit valuable syntactic parse information, achieve state-of-the-art accuracy among pure sentence encoding models for this task. In this course project for CS224D, we explore two extensions to tree-recursive neural networks: deep TreeLSTMs and at...


2017 Formatting Instructions for Authors Using LaTeX

Deep convolutional and recurrent neural networks have delivered significant advancements in object detection and tracking. However, current models handle detection and tracking through separate networks, and deep-learning-based joint detection and tracking has not yet been explored despite its potential benefits to both tasks. In this study, we present an integrated neural model called the Recu...


Cross-Lingual Pronoun Prediction with Deep Recurrent Neural Networks

In this paper we present our winning system in the WMT16 Shared Task on Cross-Lingual Pronoun Prediction, where the objective is to predict a missing target language pronoun based on the target and source sentences. Our system is a deep recurrent neural network, which reads both the source-language and target-language context, with a softmax layer making the final prediction. Our system achieves ...




Publication date: 2016