About Research+Code Blog

NLP Papers


The common theme across these papers is general-purpose text representations; they vary along several dimensions:

1. Levels of granularity, i.e., words, sentences, paragraphs, and documents (see the sketch after this list).

2. Objective functions.

3. Downstream tasks they are best suited to.

4. Model architectures used.
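
To make the first dimension concrete, here is a minimal sketch of building representations at increasing levels of granularity, using toy vectors and plain averaging as the composition function. Real systems, including several papers below, learn far better composition functions; everything here is illustrative.

```python
import numpy as np

# Hypothetical pretrained word vectors; real ones would come from
# e.g. word2vec or GloVe (both papers appear in the list below).
rng = np.random.default_rng(0)
word_vecs = {w: rng.normal(size=4)
             for w in ["the", "cat", "sat", "dogs", "run", "fast"]}

def sentence_vec(tokens):
    """Sentence representation: mean of its word vectors."""
    return np.mean([word_vecs[t] for t in tokens], axis=0)

def document_vec(sentences):
    """Document representation: mean of its sentence vectors."""
    return np.mean([sentence_vec(s) for s in sentences], axis=0)

doc = [["the", "cat", "sat"], ["dogs", "run", "fast"]]
print(sentence_vec(doc[0]).shape)  # (4,)
print(document_vec(doc).shape)     # (4,)
```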

There is no particular order. Happy reading!



Deep contextualized word representations. Peters et al., NAACL 2018.

Context is Everything: Finding Meaning Statistically in Semantic Spaces. Eric Zelikman.

Skip-Thought Vectors. Kiros et al., NIPS 2015.

Recursive deep models for semantic compositionality over a sentiment treebank.
Socher et al., EMNLP 2013.

Fine-grained analysis of sentence embeddings using auxiliary prediction tasks.
Adi et al., ICLR 2017.

A simple but tough-to-beat baseline for sentence embeddings.
Arora et al., ICLR 2017.
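
This baseline is simple enough to sketch directly: a frequency-weighted average of word vectors, followed by removing the first principal component of the resulting sentence matrix. A minimal version, assuming pretrained word vectors and corpus counts are available (variable names are illustrative):

```python
import numpy as np

def sif_embeddings(sentences, word_vecs, word_counts, a=1e-3):
    """Smooth-inverse-frequency (SIF) sentence embeddings:
    weight each word by a / (a + p(w)), average, then remove
    the projection onto the first singular vector."""
    total = sum(word_counts.values())
    embs = []
    for sent in sentences:
        weights = [a / (a + word_counts[w] / total) for w in sent]
        vecs = np.array([word_vecs[w] for w in sent])
        embs.append(np.average(vecs, axis=0, weights=weights))
    X = np.array(embs)
    u = np.linalg.svd(X, full_matrices=False)[2][0]  # 1st principal dir.
    return X - np.outer(X @ u, u)                    # common-component removal
```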

A convolutional neural network for modelling sentences.
Kalchbrenner et al., ACL 2014.

An Efficient Framework for Learning Sentence Representations.
Logeswaran and Lee, ICLR 2018.

Efficient estimation of word representations in vector space.
Mikolov et al., ICLR 2013.
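
An easy way to try the CBOW and skip-gram objectives from this paper today is the gensim library rather than the original C code; a minimal sketch, assuming gensim 4.x (corpus and hyperparameters are toy values):

```python
from gensim.models import Word2Vec

# Tiny toy corpus; real training needs far more text.
corpus = [["the", "cat", "sat", "on", "the", "mat"],
          ["dogs", "and", "cats", "are", "pets"]]

model = Word2Vec(corpus, vector_size=50, window=3,
                 min_count=1, sg=1)    # sg=1 selects skip-gram
vec = model.wv["cat"]                  # 50-d word vector
print(model.wv.most_similar("cat", topn=2))
```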

Siamese CBOW: Optimizing word embeddings for sentence representations.
Kenter et al., ACL 2016.

Distributed representations of sentences and documents.
Le and Mikolov, ICML 2014.
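
gensim also ships a Doc2Vec implementation of the paragraph-vector model from this paper; a minimal sketch (toy corpus, illustrative hyperparameters):

```python
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

docs = [TaggedDocument(["the", "cat", "sat", "on", "the", "mat"], [0]),
        TaggedDocument(["dogs", "run", "fast"], [1])]

model = Doc2Vec(docs, vector_size=50, min_count=1, epochs=40)
# Infer a vector for an unseen document.
vec = model.infer_vector(["a", "cat", "runs"])
print(vec.shape)  # (50,)
```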

A model of coherence based on distributed sentence representation.
Li and Hovy, EMNLP 2014.

Distributed representations of words and phrases and their compositionality.
Mikolov et al., NIPS 2013.

GloVe: Global vectors for word representation.
Pennington et al., EMNLP 2014.
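
Pretrained GloVe vectors are distributed as plain text, one token per line followed by its vector components; a small loader sketch (the file name in the comment is just an example from the Stanford release):

```python
import numpy as np

def load_glove(path):
    """Parse the plain-text GloVe format into a dict of vectors."""
    vecs = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vecs[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vecs

# e.g. vectors = load_glove("glove.6B.100d.txt")
```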

Towards universal paraphrastic sentence embeddings.
Wieting et al., ICLR 2016.

Self-adaptive hierarchical sentence model. Zhao et al., IJCAI 2015.

Learning composition models for phrase embeddings.
Yu and Dredze, TACL 2015.

Enriching word vectors with subword information.
Bojanowski et al., TACL 2017.

A structured self-attentive sentence embedding.
Lin et al., ICLR 2017.
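
The core of this paper fits in a few lines: attention weights A = softmax(W2 tanh(W1 H^T)) over the hidden states H, giving a matrix sentence embedding M = A H. A numpy sketch of just that step, omitting the paper's penalization term (the weights would normally be learned):

```python
import numpy as np

def self_attentive_embedding(H, W1, W2):
    """H: (n, d) hidden states; W1: (da, d); W2: (r, da).
    Returns the (r, d) matrix sentence embedding M = A @ H."""
    scores = W2 @ np.tanh(W1 @ H.T)              # (r, n)
    scores -= scores.max(axis=1, keepdims=True)  # numerically stable softmax
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)            # each row sums to 1
    return A @ H

rng = np.random.default_rng(0)
H = rng.normal(size=(10, 8))        # 10 tokens, 8-d hidden states
M = self_attentive_embedding(H, rng.normal(size=(6, 8)),
                             rng.normal(size=(4, 6)))
print(M.shape)  # (4, 8): r = 4 attention hops over 8-d states
```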

Learning Distributed Representations of Sentences from Unlabelled Data.
Hill et al., NAACL 2016.

Learning to understand phrases by embedding the dictionary.
Hill et al., TACL 2016.

Supervised Learning of Universal Sentence Representations from Natural Language Inference Data.
Conneau et al., EMNLP 2017.