
Natural language processing

Deep learning

  • Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond. [url]
  • AC-BLSTM: Asymmetric Convolutional Bidirectional LSTM Networks for Text Classification. [arxiv]
  • Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models. [arxiv] ⭐
  • A Character-level Decoder without Explicit Segmentation for Neural Machine Translation. [[pdf]](docs/2016/A Character-level Decoder without Explicit Segmentation for Neural Machine Translation.pdf) [url] ⭐
  • Achieving Human Parity in Conversational Speech Recognition. [arxiv]
  • A General Framework for Content-enhanced Network Representation Learning. [arxiv]
  • A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks. [url]
  • A Semisupervised Approach for Language Identification based on Ladder Networks. [[pdf]](docs/2016/A Semisupervised Approach for Language Identification based on Ladder Networks.pdf) [url]
  • A Simple, Fast Diverse Decoding Algorithm for Neural Generation. [arxiv]
  • Aspect Level Sentiment Classification with Deep Memory Network. [url]
  • Cached Long Short-Term Memory Neural Networks for Document-Level Sentiment Classification. [arxiv]
  • Character-Aware Neural Language Models. [pdf] ⭐
  • Character-based Neural Machine Translation. [arxiv] ⭐
  • Character-level and Multi-channel Convolutional Neural Networks for Large-scale Authorship Attribution. [arxiv]
  • Character-Level Language Modeling with Hierarchical Recurrent Neural Networks. [arxiv]
  • COCO-Text: Dataset and Benchmark for Text Detection and Recognition in Natural Images. [[pdf]](docs/2016/COCO-Text- Dataset and Benchmark for Text Detection and Recognition in Natural Images.pdf) [url]
  • Collaborative Recurrent Autoencoder: Recommend while Learning to Fill in the Blanks. [arxiv]
  • Context-aware Natural Language Generation with Recurrent Neural Networks. [arxiv]
  • Context-Dependent Word Representation for Neural Machine Translation. [arxiv]
  • [CLSTM] Contextual LSTM models for Large scale NLP tasks. [[pdf]](docs/2016/Contextual LSTM (CLSTM) models for Large scale NLP tasks.pdf) [url] ⭐
  • Convolutional Encoders for Neural Machine Translation. [url]
  • Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation. [url]
  • Deep Semi-Supervised Learning with Linguistically Motivated Sequence Labeling Task Hierarchies. [arxiv]
  • Detecting Text in Natural Image with Connectionist Text Proposal Network. [arxiv]
  • Diverse Beam Search: Decoding Diverse Solutions from Neural Sequence Models. [arxiv]
  • Dual Learning for Machine Translation. [url]
  • Efficient Character-level Document Classification by Combining Convolution and Recurrent Layers. [url]
  • End-to-End Answer Chunk Extraction and Ranking for Reading Comprehension. [arxiv]
  • End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF. [url] ⭐
  • Enhancing and Combining Sequential and Tree LSTM for Natural Language Inference. [arxiv]
  • Fast Domain Adaptation for Neural Machine Translation. [arxiv]
  • Fully Character-Level Neural Machine Translation without Explicit Segmentation. [url] ⭐
  • Joint Online Spoken Language Understanding and Language Modeling with Recurrent Neural Networks. [arxiv]
  • Generative Deep Neural Networks for Dialogue: A Short Review. [arxiv]
  • Generating Factoid Questions With Recurrent Neural Networks: The 30M Factoid Question-Answer Corpus. [[pdf]](docs/2016/Generating Factoid Questions With Recurrent Neural Networks- The 30M Factoid Question-Answer Corpus.pdf) [url] ⭐
  • Globally Normalized Transition-Based Neural Networks. [arxiv] [tensorflow] ⭐
  • Google's Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation. [[pdf]](docs/2016/Google's Multilingual Neural Machine Translation System- Enabling Zero-Shot Translation.pdf) [url]
  • Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation. [[pdf]](docs/2016/Google's Neural Machine Translation System- Bridging the Gap between Human and Machine Translation.pdf) [url] ⭐
  • How Grammatical is Character-level Neural Machine Translation? Assessing MT Quality with Contrastive Translation Pairs. [arxiv]
  • How NOT To Evaluate Your Dialogue System. [url] ⭐
  • Improving neural language models with a continuous cache. [url]
  • Inducing Multilingual Text Analysis Tools Using Bidirectional Recurrent Neural Networks. [pdf]
  • Key-Value Memory Networks for Directly Reading Documents. [[pdf]](docs/2016/Key-Value Memory Networks for Directly Reading Documents.pdf) [url]
  • Language Modeling with Gated Convolutional Networks. [arxiv] [tensorflow] ⭐
  • Learning Distributed Representations of Sentences from Unlabelled Data. [[pdf]](docs/2016/Learning Distributed Representations of Sentences from Unlabelled Data.pdf) [url] ⭐
  • Learning Recurrent Span Representations for Extractive Question Answering. [arxiv]
  • Learning to Compose Neural Networks for Question Answering. [[pdf]](docs/2016/Learning to Compose Neural Networks for Question Answering.pdf) [url] ⭐
  • Learning to Translate in Real-time with Neural Machine Translation. [[pdf]](docs/2016/Learning to Translate in Real-time with Neural Machine Translation.pdf) [url]
  • Linguistically Regularized LSTMs for Sentiment Classification. [arxiv]
  • Long Short-Term Memory-Networks for Machine Reading. [[pdf]](docs/2016/Long Short-Term Memory-Networks for Machine Reading.pdf) [url] ⭐
  • Modeling Coverage for Neural Machine Translation. [url] ⭐
  • Multilingual Part-of-Speech Tagging with Bidirectional Long Short-Term Memory Models and Auxiliary Loss. [[pdf]](docs/2016/Multilingual Part-of-Speech Tagging with Bidirectional Long Short-Term Memory Models and Auxiliary Loss.pdf) [url] ⭐
  • MultiNet: Real-time Joint Semantic Reasoning for Autonomous Driving. [arxiv] ⭐
  • Neural Architectures for Fine-grained Entity Type Classification. [url]
  • Neural Architectures for Named Entity Recognition. [[pdf]](docs/2016/Neural Architectures for Named Entity Recognition.pdf) [url] ⭐
  • Neural Emoji Recommendation in Dialogue Systems. [arxiv]
  • Neural Paraphrase Generation with Stacked Residual LSTM Networks. [arxiv]
  • Neural Machine Translation in Linear Time. [[pdf]](docs/2016/Neural Machine Translation in Linear Time.pdf) [url]
  • Neural Network Translation Models for Grammatical Error Correction. [url]
  • Neural Machine Translation with Latent Semantic of Image and Text. [arxiv]
  • Neural Machine Translation with Pivot Languages. [arxiv]
  • Neural Semantic Encoders. [url]
  • Neural Variational Inference for Text Processing. [arxiv] ⭐
  • Online Segment to Segment Neural Transduction. [arxiv]
  • On Random Weights for Texture Generation in One Layer Neural Networks. [arxiv]
  • Parallelizing Word2Vec in Shared and Distributed Memory. [arxiv] [github]
  • Phased LSTM: Accelerating Recurrent Network Training for Long or Event-based Sequences. [arxiv] [code]
  • Recurrent Memory Networks for Language Modeling. [url]
  • Recurrent Neural Machine Translation. [url]
  • Recurrent Neural Network Grammars. [url] ⭐
  • ReasoNet: Learning to Stop Reading in Machine Comprehension. [arxiv]
  • Scalable Bayesian Learning of Recurrent Neural Networks for Language Modeling. [arxiv]
  • Semi-Supervised Learning for Neural Machine Translation. [pdf]
  • Sentence Level Recurrent Topic Model: Letting Topics Speak for Themselves. [[pdf]](docs/2016/Sentence Level Recurrent Topic Model- Letting Topics Speak for Themselves.pdf) [url]
  • Sentence-Level Grammatical Error Identification as Sequence-to-Sequence Correction. [[pdf]](docs/2016/Sentence-Level Grammatical Error Identification as Sequence-to-Sequence Correction.pdf) [url]
  • Sentence Ordering using Recurrent Neural Networks. [arxiv]
  • Sequence to Backward and Forward Sequences: A Content-Introducing Approach to Generative Short-Text Conversation. [arxiv]
  • Sequential Match Network: A New Architecture for Multi-turn Response Selection in Retrieval-based Chatbots. [arxiv]
  • Structured Sequence Modeling with Graph Convolutional Recurrent Networks. [arxiv]
  • Tracking the World State with Recurrent Entity Networks. [arxiv] ⭐
  • Tweet2Vec: Learning Tweet Embeddings Using Character-level CNN-LSTM Encoder-Decoder. [arxiv] [code]
  • Unsupervised Learning of Sentence Representations using Convolutional Neural Networks. [url]
  • Unsupervised neural and Bayesian models for zero-resource speech processing. [arxiv]
  • Unsupervised Pretraining for Sequence to Sequence Learning. [url]
  • UTCNN: a Deep Learning Model of Stance Classification on Social Media Text. [arxiv]
  • Very Deep Convolutional Networks for Natural Language Processing. [[pdf]](docs/2016/Very Deep Convolutional Networks for Natural Language Processing.pdf) [url] ⭐
  • Wide & Deep Learning for Recommender Systems. [arxiv] [tensorflow] ⭐
  • Zero-Resource Translation with Multi-Lingual Neural Machine Translation. [[pdf]](docs/2016/Zero-Resource Translation with Multi-Lingual Neural Machine Translation.pdf) [url]
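
Many of the entries above are recurrent sequence models over characters or words. As a concrete reference point, here is a minimal character-level LSTM language model in PyTorch; it is an illustrative sketch of the common recipe (embed, recurrent encoder, softmax over the next token), not the architecture of any single paper, and the toy corpus and hyperparameters are assumptions.

```python
# Minimal character-level LSTM language model (illustrative sketch only;
# the toy corpus and hyperparameters are assumptions, not from any paper).
import torch
import torch.nn as nn

class CharLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, state=None):
        # x: (batch, seq_len) of character ids -> logits over the next character
        h, state = self.lstm(self.embed(x), state)
        return self.proj(h), state

text = "the quick brown fox jumps over the lazy dog"
stoi = {c: i for i, c in enumerate(sorted(set(text)))}
ids = torch.tensor([[stoi[c] for c in text]])

model = CharLM(vocab_size=len(stoi))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    logits, _ = model(ids[:, :-1])  # predict character t+1 from the prefix up to t
    loss = loss_fn(logits.reshape(-1, logits.size(-1)), ids[:, 1:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```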

Generative learning

  • Adversarial Training Methods for Semi-Supervised Text Classification. [arxiv]
  • Generative Adversarial Text to Image Synthesis. [arxiv] ⭐
  • Modeling documents with Generative Adversarial Networks. [arxiv]
  • [StackGAN] StackGAN: Text to Photo-realistic Image Synthesis with Stacked Generative Adversarial Networks. [url] [code] ⭐
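
The entries above share the adversarial training recipe: a generator is trained against a discriminator that scores real versus generated samples. A minimal sketch of that loop in PyTorch, on assumed toy 2-D data rather than text or images:

```python
# Minimal GAN training loop (a sketch of the generic adversarial recipe;
# the toy 2-D data and network sizes are illustrative assumptions).
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))  # noise -> sample
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))  # sample -> real/fake logit
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(64, 2) + 3.0          # stand-in for the "real" distribution
    fake = G(torch.randn(64, 8))

    # Discriminator step: push real samples toward 1, generated ones toward 0.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step: make the discriminator score fakes as real.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```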

Attention and memory

  • A Context-aware Attention Network for Interactive Question Answering. [url]
  • A Decomposable Attention Model for Natural Language Inference. [arxiv] [code]
  • A self-attentive sentence embedding. [url]
  • AttSum: Joint Learning of Focusing and Summarization with Neural Attention. [arxiv]
  • Attention-over-Attention Neural Networks for Reading Comprehension. [arxiv] [github]
  • Coherent Dialogue with Attention-based Language Models. [arxiv]
  • Collective Entity Resolution with Multi-Focal Attention. [aclweb]
  • Gated-Attention Readers for Text Comprehension. [[pdf]](docs/2016/Gated-Attention Readers for Text Comprehension.pdf) [url]
  • Hierarchical Attention Networks for Document Classification. [url] ⭐
  • Hierarchical Memory Networks for Answer Selection on Unknown Words. [arxiv]
  • Implicit Distortion and Fertility Models for Attention-based Encoder-Decoder NMT Model. [arxiv]
  • Improving Attention Modeling with Implicit Distortion and Fertility for Machine Translation. [pdf]
  • Iterative Alternating Neural Attention for Machine Reading. [url]
  • Interactive Attention for Neural Machine Translation. [arxiv]
  • Joint CTC-Attention based End-to-End Speech Recognition using Multi-task Learning. [arxiv]
  • Key-Value Memory Networks for Directly Reading Documents. [arxiv]
  • Knowledge as a Teacher: Knowledge-Guided Structural Attention Networks. [arxiv]
  • Language to Logical Form with Neural Attention. [[pdf]](docs/2016/Language to Logical Form with Neural Attention.pdf) [url] ⭐
  • Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention. [[pdf]](docs/2016/Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention.pdf) [url]
  • Lexicon Integrated CNN Models with Attention for Sentiment Analysis. [arxiv]
  • Memory-enhanced Decoder for Neural Machine Translation. [url]
  • Multimodal Attention for Neural Machine Translation. [arxiv]
  • Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism. [url] ⭐
  • Neural Language Correction with Character-Based Attention. [url]
  • Neural Machine Translation with Recurrent Attention Modeling. [url]
  • Neural Machine Translation with Supervised Attention. [pdf]
  • Temporal Attention Model for Neural Machine Translation. [url]
  • Visualizing and Understanding Curriculum Learning for Long Short-Term Memory Networks. [arxiv]
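
Nearly every paper in this section composes the same primitive: score a query against a set of memory or encoder states, normalize with a softmax, and read out a weighted sum. Below is a minimal dot-product version in PyTorch; note that many of the 2016 papers use an additive (MLP) score instead, and the shapes and inputs here are illustrative assumptions.

```python
# Minimal content-based attention read (dot-product scoring for brevity;
# a generic sketch, not any single paper's mechanism).
import torch
import torch.nn.functional as F

def attend(query, keys, values):
    # query: (batch, d); keys, values: (batch, seq_len, d)
    scores = torch.bmm(keys, query.unsqueeze(2)).squeeze(2)       # (batch, seq_len)
    weights = F.softmax(scores / keys.size(-1) ** 0.5, dim=1)     # attention distribution
    context = torch.bmm(weights.unsqueeze(1), values).squeeze(1)  # (batch, d)
    return context, weights

enc_states = torch.randn(2, 5, 16)  # e.g. 5 encoder positions
dec_state = torch.randn(2, 16)      # current decoder state as the query
context, weights = attend(dec_state, enc_states, enc_states)
print(weights.sum(dim=1))           # each attention row sums to 1
```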