ML@GT Seminar: Mike Rabbat

Bio: Mike Rabbat is a Research Scientist in the Facebook AI Research group. He is currently on leave from McGill University, where he is an Associate Professor of Electrical and Computer Engineering. Mike received his master's degree from Rice University in 2003 and his PhD from the University of Wisconsin in 2006, both under the supervision of Robert Nowak. Mike’s research interests are in the areas of networks, statistical signal processing, and machine learning. Currently, he is working on gossip algorithms for distributed processing, distributed tracking, and algorithms and theory for signal processing on graphs.

Title: Asynchronous Subgradient-Push

Abstract: We consider a multi-agent framework for distributed optimization where each agent in the network has access to a local convex function and the collective goal is to achieve consensus on the parameters that minimize the sum of the agents' local functions. We propose an algorithm wherein each agent operates asynchronously and independently of the other agents in the network. When the local functions are strongly convex with Lipschitz-continuous gradients, we show that a subsequence of the iterates at each agent converges to a neighbourhood of the global minimum, where the size of the neighbourhood depends on the degree of asynchrony in the multi-agent network. When the agents work at the same rate, convergence to the global minimizer is achieved. Numerical experiments demonstrate that Asynchronous Subgradient-Push can minimize the global objective faster than state-of-the-art synchronous first-order methods, is more robust to failing or stalling agents, and scales better with the network size. This is joint work with Mahmoud Assran.
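
For readers curious about the mechanics, the sketch below simulates a synchronous subgradient-push update (push-sum mixing followed by a local gradient step) on a small directed ring. The topology, the quadratic local objectives, the step-size schedule, and all variable names are illustrative assumptions rather than details from the talk or the paper; the asynchronous variant in the abstract additionally lets each agent proceed at its own rate.

```python
# A minimal, synchronous sketch of the subgradient-push idea (push-sum mixing plus a
# local gradient step), simulated on one machine. The directed ring graph, the
# quadratic local objectives, and the step-size schedule are illustrative assumptions,
# not details from the talk; the asynchronous variant discussed in the abstract
# further removes the requirement that agents update in lockstep.
import numpy as np

n, d = 5, 3                       # number of agents, parameter dimension
rng = np.random.default_rng(0)

# Assumed local objectives: f_i(z) = 0.5 * ||A_i z - b_i||^2.
A = rng.standard_normal((n, 4, d))
b = rng.standard_normal((n, 4))

def grad(i, z):
    """Gradient of agent i's local objective at z."""
    return A[i].T @ (A[i] @ z - b[i])

# Directed ring: agent i pushes to itself and to agent (i + 1) % n, so out-degree 2.
out_deg = np.full(n, 2)

x = np.zeros((n, d))              # push-sum numerators, one row per agent
w = np.ones(n)                    # push-sum weights

for t in range(1, 2001):
    alpha = 0.1 / np.sqrt(t)      # diminishing step size
    x_new = np.zeros_like(x)
    w_new = np.zeros(n)
    for i in range(n):            # each agent pushes scaled copies to its out-neighbours
        for j in (i, (i + 1) % n):
            x_new[j] += x[i] / out_deg[i]
            w_new[j] += w[i] / out_deg[i]
    z = x_new / w_new[:, None]    # push-sum ratio de-biases the directed mixing
    for i in range(n):            # local gradient step on the de-biased estimate
        x_new[i] -= alpha * grad(i, z[i])
    x, w = x_new, w_new

# Compare against the centralized minimizer of sum_i f_i.
z_star = np.linalg.solve(sum(A[i].T @ A[i] for i in range(n)),
                         sum(A[i].T @ b[i] for i in range(n)))
print("centralized minimizer:", np.round(z_star, 3))
print("agents' estimates:\n", np.round(z, 3))
```

Dividing the pushed numerators by the pushed weights (the push-sum ratio) corrects the bias that column-stochastic, directed mixing would otherwise introduce, which is what lets each agent's estimate track the network average between gradient steps.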

Event Details

Date/Time:

  • Wednesday, April 4, 2018
    1:00 pm - 2:30 pm

For More Information Contact

Kyla Hanson: khanson@cc.gatech.edu