I am a Postdoctoral Scholar in the Departments of Computer Science and Statistics at Stanford University, where I work with Chris Ré and Lester Mackey. I got my PhD at The University of Texas at Austin with Constantine Caramanis and Sriram Vishwanath, and I also worked with Alex Dimakis there.

I am currently on the academic job market: CV and research statement.

Research Interests

Machine Learning, High-Dimensional Statistics, Large-Scale Distributed Computation.

Recent interests include asynchronous optimization and scan orders for Gibbs sampling. I have also worked on resource-limited problems, such as memory-limited streaming PCA and streaming PCA when most of the entries of each sample are missing.
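To give a flavor of the streaming PCA setting: samples arrive one at a time and we maintain a subspace estimate without ever storing the data. Below is a minimal single-pass sketch in the spirit of Oja's algorithm (illustrative only; it is not the memory-limited block algorithm or the missing-entries estimator from the papers, and the step size `eta` is an arbitrary choice here).

```python
import numpy as np

def streaming_pca(samples, k, eta=0.01):
    """One-pass estimate of the top-k principal subspace via an
    Oja-style update. Memory usage is O(d*k), independent of the
    number of samples."""
    d = samples[0].shape[0]
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))  # random orthonormal start
    for x in samples:
        Q += eta * np.outer(x, x @ Q)  # Oja update: Q <- Q + eta * x x^T Q
        Q, _ = np.linalg.qr(Q)         # re-orthonormalize
    return Q
```

With enough samples and a well-chosen step size, the columns of `Q` align with the top principal directions of the data's covariance.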

I also like breaking things in a way that makes them better/stronger/faster. FrogWild!, our work on PageRank approximation, is a good example: working both on top of GraphLab and under the hood, we achieved a 7x improvement.
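For reference, here is the exact computation that FrogWild! approximates: PageRank by power iteration. This is a minimal sketch of the standard algorithm, not our system; the damping factor `d = 0.85` and the dict-of-out-neighbors representation are conventional choices.

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-8):
    """Exact PageRank by power iteration.
    `adj` maps each node to the list of its out-neighbors."""
    nodes = sorted(adj)
    n = len(nodes)
    idx = {v: i for i, v in enumerate(nodes)}
    r = np.full(n, 1.0 / n)  # start from the uniform distribution
    while True:
        r_new = np.full(n, (1.0 - d) / n)  # teleportation mass
        for v, outs in adj.items():
            if outs:
                share = d * r[idx[v]] / len(outs)
                for w in outs:
                    r_new[idx[w]] += share
            else:  # dangling node: spread its mass uniformly
                r_new += d * r[idx[v]] / n
        if np.abs(r_new - r).sum() < tol:
            return dict(zip(nodes, r_new))
        r = r_new
```

Each iteration touches every edge, which is exactly the cost that approximation schemes like FrogWild! avoid paying in full.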

I have worked with most large-scale computation platforms, from MPI to MapReduce and from GraphLab/Giraph to Spark. Lately, though, I have been looking at multi-core implementations with renewed interest.

Throughout my work, I strive to bring together real implementations and the theory that guarantees we -- most of the time! -- get a good result.

That's me.

Recent Projects

  • In a recent note, we show that asynchrony in SGD introduces momentum. In the companion systems paper, we use this theory to train deep networks faster.
  • Our paper dispelling some common beliefs about the scan order in Gibbs sampling.
  • Does periodic model averaging always help? Recent results.

Recent News

  • January 2017: Visiting Microsoft Research, Cambridge.
  • December 2016: In Barcelona for NIPS.
  • November 2016: Visiting Microsoft Research New England.
  • November 2016: Full version of our asynchrony paper is out.
  • September 2016: Talk at Allerton.
  • August 2016: Had the pleasure of giving a talk at MIT Lincoln Laboratory.
  • August 2016: Gave an asynchronous optimization talk at Google.
  • August 2016: Blog post on our momentum work.
  • July 2016: Scan order paper accepted at NIPS 2016!
  • July 2016: Invited to give a talk at NVIDIA.
  • June 2016: Poster at the non-convex optimization workshop at ICML.
  • June 2016: Poster at the OptML 2016 workshop.
  • Excited to start my postdoc at Stanford University, working with Lester Mackey and Chris Ré.
  • Successfully defended my PhD thesis!
  • SILO seminar talk at the Wisconsin Institute for Discovery. Loved both Madison and the WID!
  • Densest k-Subgraph work picked up by NVIDIA!
  • Our latest work has been accepted for presentation at VLDB 2015!
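The asynchrony-begets-momentum result mentioned above says that, in expectation, asynchronous SGD behaves like SGD with an extra heavy-ball momentum term. Here is a minimal sketch of that implicit update form (illustrative only; the paper derives the effective momentum coefficient from the number of workers, whereas `mu` below is just a fixed constant):

```python
import numpy as np

def heavy_ball_sgd(grad, x0, lr=0.1, mu=0.9, steps=200):
    """Heavy-ball (Polyak) momentum:
        x_{t+1} = x_t - lr * grad(x_t) + mu * (x_t - x_{t-1}).
    Asynchronous SGD implicitly performs an update of this form."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)  # accumulated momentum (the x_t - x_{t-1} term)
    for _ in range(steps):
        v = mu * v - lr * grad(x)
        x = x + v
    return x
```

One practical consequence: if asynchrony already supplies implicit momentum, stacking a large explicit momentum on top can over-accelerate and destabilize training.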


Publications

Asynchrony begets Momentum, with an Application to Deep Learning
Ioannis Mitliagkas, Ce Zhang, Stefan Hadjis, and Christopher Ré. Allerton, arXiv:1605.09774v2 (2016). [ .pdf ]
Scan Order in Gibbs Sampling: Models in Which it Matters and Bounds on How Much
Bryan He, Christopher De Sa, Ioannis Mitliagkas, and Christopher Ré. NIPS 2016, arXiv:1606.03432 (2016). [ .pdf ]
Omnivore: An Optimizer for Multi-device Deep Learning on CPUs and GPUs
Stefan Hadjis, Ce Zhang, Ioannis Mitliagkas, and Christopher Ré. arXiv preprint arXiv:1606.04487 (2016). [ .pdf ]
Parallel SGD: When does averaging help?
Jian Zhang, Christopher De Sa, Ioannis Mitliagkas, and Christopher Ré. OptML workshop at ICML 2016, arXiv:1606.07365 (2016). [ .pdf ]
FrogWild! Fast PageRank Approximations on Graph Engines
Ioannis Mitliagkas, Michael Borokhovich, Alex Dimakis, and Constantine Caramanis. VLDB 2015 (Earlier version at NIPS 2014 workshop). [ bib | .pdf ]
Streaming PCA with Many Missing Entries
Ioannis Mitliagkas, Constantine Caramanis, and Prateek Jain. Preprint, 2015. [ bib | .pdf ]
Finding Dense Subgraphs via Low-rank Bilinear Optimization
Dimitris S Papailiopoulos, Ioannis Mitliagkas, Alexandros G Dimakis, and Constantine Caramanis. ICML, 2014. [ bib | .pdf ]
Memory Limited, Streaming PCA
Ioannis Mitliagkas, Constantine Caramanis, and Prateek Jain. NIPS 2013 (arXiv:1307.0032), 2013. [ bib | .pdf ]
User Rankings from Comparisons: Learning Permutations in High Dimensions
I. Mitliagkas, A. Gopalan, C. Caramanis, and S. Vishwanath. In Proc. of Allerton Conf. on Communication, Control and Computing, Monticello, USA, 2011. [ bib | .pdf ]
Joint Power and Admission Control for Ad-hoc and Cognitive Underlay Networks: Convex Approximation and Distributed Implementation
I. Mitliagkas, N. D. Sidiropoulos, and A. Swami. IEEE Transactions on Wireless Communications, 2011. [ bib ]
Strong Information-Theoretic Limits for Source/Model Recovery
I. Mitliagkas and S. Vishwanath. In Proc. of Allerton Conf. on Communication, Control and Computing, Monticello, USA, 2010. [ bib | .pdf ]
Distributed joint power and admission control for ad-hoc and cognitive underlay networks
I. Mitliagkas, N. D. Sidiropoulos, and A. Swami. In Proc. of IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pages 3014-3017. IEEE, 2010. [ bib ]
Convex approximation-based joint power and admission control for cognitive underlay networks
I. Mitliagkas, N. D. Sidiropoulos, and A. Swami. In Proc. of International Wireless Communications and Mobile Computing Conference (IWCMC'08), pages 28-32. IEEE, 2008. [ bib ]


Git repositories for the projects I'm working on.

subscribe via RSS