Preprints & Working Papers

Reconstructing probabilistic trees of cellular differentiation from single-cell RNA-seq data

arXiv:1811.11790 [q-bio.QM], 2018.

Preprint PDF

Practical bounds on the error of Bayesian posterior approximations: A nonasymptotic approach

arXiv:1809.09505 [stat.TH], 2018.

Preprint PDF

Publications

Data-dependent compression of random features for large-scale kernel approximation

In Proc. of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019.

Preprint PDF

Scalable Gaussian process inference with finite-data mean and variance guarantees

In Proc. of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019.

Preprint PDF

Truncated random measures

Bernoulli, in press.

Preprint PDF Slides

Random feature Stein discrepancies

In Proc. of the 32nd Annual Conference on Neural Information Processing Systems (NeurIPS), 2018.

Preprint PDF Code

PASS-GLM: polynomial approximate sufficient statistics for scalable Bayesian GLM inference

In Proc. of the 31st Annual Conference on Neural Information Processing Systems (NeurIPS), 2017.

Preprint PDF Code Slides

Quantifying the accuracy of approximate diffusions and Markov chains

In Proc. of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS), 2017.

Preprint PDF

Coresets for scalable Bayesian logistic regression

In Proc. of the 30th Annual Conference on Neural Information Processing Systems (NeurIPS), 2016.

Preprint PDF

JUMP-Means: small-variance asymptotics for Markov jump processes

In Proc. of the 32nd International Conference on Machine Learning (ICML), 2015.

Preprint PDF Code

Risk and regret of hierarchical Bayesian learners

In Proc. of the 32nd International Conference on Machine Learning (ICML), 2015.

Preprint PDF

Fast Kalman filtering and forward-backward smoothing via a low-rank perturbative approach

Journal of Computational and Graphical Statistics, 23(2): 316–339, 2014.

PDF Supplementary Material

A statistical learning theory framework for supervised pattern discovery

In Proc. of SIAM International Conference on Data Mining (SDM), 2014.

PDF

Fast state-space methods for inferring dendritic synaptic connectivity

Journal of Computational Neuroscience, 36(3): 415–443, 2014.

PDF Supplementary Material

Optimal experimental design for sampling voltage on dendritic trees in the low-SNR regime

Journal of Computational Neuroscience, 32(2): 347–366, 2012.

PDF

Thesis

Scaling Bayesian inference: theoretical foundations and practical methods

Ph.D. thesis, Massachusetts Institute of Technology, 2018.

PDF

Miscellanea

Detailed derivations of small-variance asymptotics for some hierarchical Bayesian nonparametric models

arXiv:1501.00052 [stat.ML], 2014.

Preprint PDF

Infinite structured hidden semi-Markov models

arXiv:1407.0044 [stat.ME], 2014.

Preprint PDF

Short Bio

Jonathan Huggins is a Postdoctoral Research Fellow in the Department of Biostatistics at Harvard University. He is also affiliated with the Dana-Farber Cancer Institute and the Broad Institute of MIT and Harvard. He completed his Ph.D. in Computer Science at the Massachusetts Institute of Technology in 2018. Previously, he received a B.A. in Mathematics from Columbia University and an S.M. in Computer Science from the Massachusetts Institute of Technology. His recent research focuses on developing reliable approximate Bayesian inference methods that scale to large datasets and complex models.

Contact