Zhilin Yang

Archives

01/2019
Transformer-XL updated and released, along with code and pretrained models. SOTA on enwik8, text8, One Billion Word, and WikiText-103.

09/2018
HotpotQA dataset released, along with code, blog posts, and a website.

09/2018
Paper accepted at NIPS 2018.
[PDF]

08/2018
Two papers accepted at EMNLP 2018.
[PDF] [PDF]

05/2018
Starting my internship at Google Brain, mentored by Quoc V. Le.

05/2018
Paper on unsupervised relational graph learning released.
[PDF]

01/2018
One oral (acceptance rate 2%) and one poster at ICLR 2018.
[PDF] [PDF]

11/2017
Paper on Mechanical Turker Descent released.
[PDF]

11/2017
Paper and code on language modeling released. SOTA on Penn Treebank and WikiText-2.
[PDF] [Code]

11/2017
Code on GAN-based semi-supervised learning released. SOTA on MNIST, SVHN, and CIFAR-10 with standard architectures.

09/2017
Two papers accepted at NIPS 2017.
[PDF] [PDF]

09/2017
Code and data on transfer learning released.

05-08/2017
Internship at Facebook AI Research, mentored by Jason Weston.

06/2017
Dataset on Semi-Supervised QA released.

05/2017
Paper on Semi-Supervised Learning with Bad GANs released. SOTA on MNIST, SVHN, and CIFAR-10 with standard architectures.
[PDF]

05/2017
Paper on Differentiable Rule Learning updated. SOTA on WordNet, Freebase, and WikiMovies.
[PDF]

03/2017
Paper on Semi-Supervised QA with Generative Domain-Adaptive Nets accepted by ACL 2017.
[PDF]

03/2017
Paper on Gated Attention Readers accepted by ACL 2017.
[PDF]

02/2017
Two papers accepted at ICLR 2017.

11/2016
Two papers on reading comprehension released/updated.

10/2016
Code and data for our NIPS 2016 paper released.

08/2016
Paper on a novel encoder-decoder architecture accepted by NIPS 2016.
[PDF]

06/2016
Code for our ICML 2016 paper released.

05/2016
Code and data for our IJCAI 2016 paper released.

04/2016
Paper on semi-supervised learning accepted by ICML 2016.
[PDF]

04/2016
Paper on multi-modal Bayesian embeddings accepted by IJCAI 2016.

03/2016
Paper on sequence tagging released.
[PDF]