Mehrdad Farajtabar, Ph.D.
Senior Research Scientist
Title: Neural networks on real-world data: the case of continual learning and knowledge distillation
Abstract: Neural networks are achieving state-of-the-art, and sometimes superhuman, performance on learning tasks across a variety of domains. However, are they ready to be fully utilized in real-world scenarios where data distributions evolve or resources are constrained? When required to learn in a continual or sequential manner, neural networks suffer from the problem of catastrophic forgetting: they forget how to solve previous tasks after being trained on a new task, despite having the capacity to solve both tasks had they been trained on them simultaneously. In this talk, we introduce this phenomenon and propose a few methods to address it from the perspective of neural network training dynamics. Next, we study knowledge distillation, a way to transfer the knowledge of large neural networks to smaller ones for faster inference or reduced computation.
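As background for the second part of the talk, a minimal sketch of the classic knowledge-distillation objective (the temperature-softened KL divergence between teacher and student outputs, in the style of Hinton et al.); this illustrates the general idea only, not the specific methods the talk presents:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer probabilities.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    # KL divergence between the softened teacher and student
    # distributions, scaled by T^2 so gradients stay comparable
    # across temperatures. Temperature T=2.0 is an illustrative choice.
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's soft predictions
    return float(T**2 * np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term is usually combined with the ordinary cross-entropy on the ground-truth labels, weighted by a mixing coefficient.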
About Mehrdad Farajtabar: Mehrdad Farajtabar is a senior research scientist at Google DeepMind working on machine learning and its applications. His recent research interests are continual learning of neural networks, learning under evolving data distributions, and reinforcement learning. Before joining DeepMind, he earned his Ph.D. in computational science and engineering from Georgia Tech in 2018; he holds M.Sc. and B.Sc. degrees in Artificial Intelligence and Software Engineering from Sharif University of Technology. Website: https://farajtabar.github.io/
Department of Industrial & Systems Engineering