NIPS Proceedings

Michael I. Jordan

22 Papers

  • Fast Black-box Variational Inference through Stochastic Trust-Region Optimization (2017)
  • Gradient Descent Can Take Exponential Time to Escape Saddle Points (2017)
  • Kernel Feature Selection via Conditional Covariance Minimization (2017)
  • Non-convex Finite-Sum Optimization Via SCSG Methods (2017)
  • Online control of the false discovery rate with decaying memory (2017)
  • Cyclades: Conflict-free Asynchronous Machine Learning (2016)
  • Local Maxima in the Likelihood of Gaussian Mixture Models: Structural Results and Algorithmic Consequences (2016)
  • Unsupervised Domain Adaptation with Residual Transfer Networks (2016)
  • Linear Response Methods for Accurate Covariance Estimates from Mean Field Variational Bayes (2015)
  • On the Accuracy of Self-Normalized Log-Linear Models (2015)
  • Parallel Correlation Clustering on Big Graphs (2015)
  • Variational Consensus Monte Carlo (2015)
  • Communication-Efficient Distributed Dual Coordinate Ascent (2014)
  • On the Convergence Rate of Decomposable Submodular Function Minimization (2014)
  • Parallel Double Greedy Submodular Maximization (2014)
  • Spectral Methods meet EM: A Provably Optimal Algorithm for Crowdsourcing (2014)
  • A Comparative Framework for Preconditioned Lasso Algorithms (2013)
  • Estimation, Optimization, and Parallelism when Data is Sparse (2013)
  • Information-theoretic lower bounds for distributed statistical estimation with communication constraints (2013)
  • Local Privacy and Minimax Bounds: Sharp Rates for Probability Estimation (2013)
  • Optimistic Concurrency Control for Distributed Unsupervised Learning (2013)
  • Streaming Variational Bayes (2013)