Intuitively Explained: Multi-Task Learning — ERNIE 2.0 | State-of-the-Art NLP Architecture

The tech giant Baidu unveiled its state-of-the-art NLP architecture ERNIE 2.0 earlier this year, which scored »

Order Matters: Alibaba's Transformer-based Recommender System

Another step towards Alibaba's Recommender System domination. Alibaba, the largest e-commerce platform in »

Generating High-Resolution Images Using Deep Autoregressive Models

Going beyond GANs and capturing the diversity of the true data distribution. Generating High Fidelity Images »

Bayesian deep learning and near-term quantum computers: A cautionary tale in quantum machine learning

This blog post is an overview of quantum machine learning written by the author of the paper Bayesian deep »

Pre-training, Transformers, and Bi-directionality

Bidirectional Encoder Representations from Transformers (BERT; Devlin et al., 2018) is a language »

Large-Scale Evolution of Image Classifiers

Deep neural networks excel in many difficult tasks, given large amounts of training data and enough »

Interpolation in Autoencoders via an Adversarial Regularizer

Adversarially Constrained Autoencoder Interpolation (ACAI; Berthelot et al., 2018) is a regularization »

GANs Need Some Attention, Too

Self-Attention Generative Adversarial Networks (SAGAN; Zhang et al., 2018) are convolutional neural networks »