About Me

I am a Research Assistant Professor and the IEMS Alumni Fellow at the McCormick School of Engineering at Northwestern University. Here is my CV.

(I am temporarily visiting Purdue University as a visiting assistant professor.)

Contact

imon dot banerjee at northwestern dot edu


I love learning about new topics, and I find research extremely fun. Before joining Northwestern, I was a PhD student under Vinayak Rao and Harsha Honnappa at Purdue University. Here is a link to my thesis. Prior to that, I completed my Bachelor's and Master's in Mathematical Statistics at the Indian Statistical Institute.

My Erdős number is 3: Imon Banerjee → Diego Klabjan → Craig A. Tovey → Paul Erdős.

Spring 2026: The new year started well with two acceptances at AISTATS. One was “Meta Sparse Principal Component Analysis” (we pushed it for a really long time!) and the other was “Nonparametric Multi Change Point Detection for Markov Chains via Adaptive Clustering”. I also worked with Ramkrishna Samanta and Sayak Chakraborty to submit a paper on the characterisation of rare events in stationary MDPs and some downstream applications. This should wrap up all conference submissions for 2025-2026. In the future, I hope to work on more statistical topics by going back to my roots in variational inference. I also realised that Fuk–Nagaev inequalities for the empirical suprema of Markov chains are not available! (I am happy to describe this more if you reach out to me.)

Fall 2025: In the fall, I was reading about geometric interpretations of statistics, pioneered by Efron in papers 1 and 2, and developed further more recently in 3. The book by Kass and Vos (link) provides a good introduction. I am also generally interested in geometry. In addition, I am learning about new and upcoming methods in nonparametric statistics, namely $\rho$-estimators 1. I have worked on two projects related to their predecessors, T-estimators, in Markov chains (preprint for one of those). So if you have ideas and think I can help, shoot me an email. I would be most happy to discuss.

Summer 2025

Spring and summer of 2025 went by really quickly while I worked on two statistical problems involving regenerating Markov chains. One was spent exploring the m-out-of-n counterpart of an interesting phenomenon in the Edgeworth expansion of bootstrapped Studentized sample quantiles, first documented by Hall and Martin. Along the way, we also proved guarantees beyond i.i.d. data, namely when it is generated from a regenerating Markov chain. This paper was accepted at NeurIPS 2025 (arXiv).
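For readers unfamiliar with the m-out-of-n idea, here is a minimal sketch: instead of resampling all n observations, each bootstrap replicate draws only m < n points. This is an illustrative toy on i.i.d. data only, not the Markov-chain setting or the Studentized statistic analysed in the paper; the function name and the choice m ≈ √n are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def m_out_of_n_bootstrap_quantile(x, q=0.5, m=None, n_boot=2000, rng=rng):
    """m-out-of-n bootstrap for a sample quantile: each replicate
    resamples only m < n points with replacement and records the
    resampled quantile."""
    n = len(x)
    if m is None:
        m = int(n ** 0.5)  # an illustrative choice: m grows slower than n
    boots = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(x, size=m, replace=True)
        boots[b] = np.quantile(sample, q)
    return boots

x = rng.exponential(size=1000)
boots = m_out_of_n_bootstrap_quantile(x, q=0.5)
# percentile interval for the median from the bootstrap distribution
lo, hi = np.quantile(boots, [0.025, 0.975])
```

The only change from the ordinary bootstrap is the subsample size m; the m-out-of-n variant is known to restore consistency in settings where the full bootstrap misbehaves.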

We also derived a new method for offline change point detection (preprint) when the data arrive from a regenerating Markov chain. I am pretty excited about this work, especially since we found a mixed integer linear program that efficiently solves this objective, which is quite different from existing methods like PELT or binary segmentation. Along the way, we also found out that Bennett-type inequalities for the empirical suprema of Markov chains have been an open problem for nearly two decades!
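To fix ideas on what offline change point detection optimises, here is a toy baseline, not the MILP formulation from the paper: a brute-force search for a single change in the mean by minimising the total within-segment squared error. Everything here (function name, cost, simulated data) is my own illustrative assumption.

```python
import numpy as np

def best_single_changepoint(x):
    """Brute-force single change point in the mean: pick the split
    that minimises the total within-segment sum of squared errors."""
    n = len(x)
    best_tau, best_cost = None, np.inf
    for tau in range(1, n):
        left, right = x[:tau], x[tau:]
        cost = ((left - left.mean()) ** 2).sum() \
             + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_tau, best_cost = tau, cost
    return best_tau

rng = np.random.default_rng(1)
# mean shifts from 0 to 3 at index 100
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
tau_hat = best_single_changepoint(x)
```

Methods like PELT and binary segmentation extend this kind of segment-cost objective to many change points; the appeal of an MILP formulation is that the whole segmentation can be handed to an off-the-shelf exact solver.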

Spring 2025

I worked with Itai Gurvich on a non-Gaussian extension of the Kalman filter (see this), where we showed that the Kalman filter (the best linear unbiased estimator) is actually sub-optimal for estimating the state when the noise is non-Gaussian. We then proposed a modified estimator, based on a prescription by Eimear Goggin, which performs an order of magnitude better than the Kalman filter.

Prior Work

The main thrust of my research during my PhD was Markov chains, with both Bayesian and frequentist flavours. In a trio of papers, one published and two upcoming, I rigorously developed statistics on controlled Markov chains over both finite and continuous state spaces. I am currently collecting these three works into a monograph, which I plan to make available on my website.