Research

Interests

My interests include Scientific Machine Learning (SciML), Gaussian Processes, and Probabilistic Programming. I am also interested in the role of Automatic Differentiation in advancing SciML capabilities.

Publications

Google Scholar

  • Composable and reusable neural surrogates to predict system response of causal model components
    Ranjan Anantharaman, Anas Abdelrehim, Francesco Martinuzzi, Sharan Yalburgi, Elliot Saba, Keno Fischer, Glen Hertz, Pepijn de Vos, Chris Laughman, Yingbo Ma, Viral Shah, Alan Edelman, Chris Rackauckas.
    AAAI 2022 Workshop on AI for Design and Manufacturing (ADAM)

  • Assessing Inference Quality for Probabilistic Programs using Multivariate Simulation Based Calibration
    Sharan Yalburgi, Jameson Quinn, Veronica Weiner, Sam A. Witty, Vikash Mansinghka, Cameron Freer.
    International Conference on Probabilistic Programming (PROBPROG)

  • An Empirical Study of Iterative Knowledge Distillation for Neural Network Compression
    Sharan Yalburgi, Tirtharaj Dash, Ramya Hebbalaguppe, Srinidhi Hegde, Ashwin Srinivasan.
    28th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, 2020

  • Bijectors.jl: Flexible transformations of distributions
    Tor Erlend Fjelde, Kai Xu, Mohamed Tarek, Sharan Yalburgi, Hong Ge.
    2nd Symposium on Advances in Approximate Bayesian Inference, co-located with NeurIPS, 2019
    GitHub | PDF