I am a PhD student (since 2015) in the Department of Computer Science and Technology at Tsinghua University, advised by Jun Zhu. I have broad interests in probabilistic methods and approximate Bayesian inference, including but not limited to: probabilistic kernel methods (e.g., Gaussian processes), spectral methods, variational inference, generative models, and Bayesian deep learning.

I’m currently a research intern at DeepMind, London. Previously I was an intern at RIKEN-AIP, Tokyo. I was awarded the Microsoft Research Asia Fellowship for 2018. I received my B.E. from the Department of Computer Science and Technology at Tsinghua University.

[github] [twitter]

Research Highlights


Scalable Training of Inference Networks for Gaussian-Process Models

Jiaxin Shi, Mohammad Emtiyaz Khan, and Jun Zhu.

International Conference on Machine Learning (ICML), 2019. [pdf] [arxiv] [code]

Functional Variational Bayesian Neural Networks

Shengyang Sun*, Guodong Zhang*, Jiaxin Shi*, and Roger Grosse.

International Conference on Learning Representations (ICLR), 2019. [pdf] [arxiv] [code]

A Spectral Approach to Gradient Estimation for Implicit Distributions

Jiaxin Shi, Shengyang Sun, and Jun Zhu.

International Conference on Machine Learning (ICML), 2018. [pdf] [arxiv] [code]

Sliced Score Matching: A Scalable Approach to Density and Score Estimation

Yang Song*, Sahaj Garg*, Jiaxin Shi, and Stefano Ermon.

Conference on Uncertainty in Artificial Intelligence (UAI), 2019. [pdf] [arxiv] [code]

Semi-crowdsourced Clustering with Deep Generative Models

Yucen Luo, Tian Tian, Jiaxin Shi, Jun Zhu, and Bo Zhang.

Neural Information Processing Systems (NeurIPS), 2018. [pdf] [arxiv] [code]

Software

I’m currently leading the development of ZhuSuan, a probabilistic programming library built on TensorFlow with a particular focus on Bayesian deep learning.

[github] [docs] [white paper]
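To give a flavor of the kind of workflow a library like ZhuSuan automates, here is a minimal, self-contained sketch of Bayesian linear regression trained with mean-field variational inference and reparameterized ELBO gradients. It is written in plain NumPy purely for illustration and does not use the ZhuSuan API; all names (e.g., `w_true`, `log_std`) and hyperparameters are illustrative assumptions.

```python
# Minimal sketch: Bayesian linear regression with mean-field variational
# inference, the kind of model a Bayesian deep learning library automates.
# Plain NumPy for illustration only -- this is NOT the ZhuSuan API.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = x @ w_true + noise (all quantities are illustrative)
n, d, noise_std = 200, 3, 0.5
x = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = x @ w_true + noise_std * rng.normal(size=n)

# Variational posterior q(w) = N(mu, diag(std**2)), std = exp(log_std);
# prior p(w) = N(0, I)
mu = np.zeros(d)
log_std = np.zeros(d)
lr = 1e-3

for step in range(2000):
    # Reparameterization trick: w = mu + exp(log_std) * eps, eps ~ N(0, I)
    eps = rng.normal(size=d)
    std = np.exp(log_std)
    w = mu + std * eps

    # Gradient of the Gaussian log-likelihood log p(y | x, w) w.r.t. w
    grad_w = x.T @ (y - x @ w) / noise_std**2

    # Chain rule through the reparameterization
    grad_mu_ll = grad_w
    grad_log_std_ll = grad_w * eps * std

    # Closed-form gradients of KL(q || p) for Gaussian prior and posterior
    grad_mu_kl = mu
    grad_log_std_kl = std**2 - 1.0

    # Stochastic gradient ascent on ELBO = E_q[log p(y | x, w)] - KL(q || p)
    mu += lr * (grad_mu_ll - grad_mu_kl)
    log_std += lr * (grad_log_std_ll - grad_log_std_kl)

print("variational posterior mean:", np.round(mu, 2))
print("true weights:              ", np.round(w_true, 2))
```

A probabilistic programming library removes the need to derive and implement such gradients by hand: the model is written once as a generative program, and inference algorithms such as stochastic variational inference are applied to it automatically.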

Curriculum Vitae

My CV can be downloaded from this link: [pdf]