Oğuz Kaan Yüksel

Ph.D. student

EPFL

Biography

I am a second-year Ph.D. student at EPFL in the Theory of Machine Learning Lab, advised by Nicolas Flammarion. I am currently working on the theory of self-supervised learning. In my first year, I worked on the dynamics of pretraining with meta-learning.

Previously, I was an M.Sc. student in Data Science with a minor in Mathematics at EPFL. In my thesis, at the Chair of Statistical Field Theory under the supervision of Berfin Şimşek and Arthur Jacot, I extended random matrix theory results on kernel ridge regression to the ridgeless and negative-ridge regimes. During my studies, I also worked on low-rank matrix completion at the Chair of Continuous Optimization under the supervision of Nicolas Boumal and Quentin Rebjock, and on latent-space perturbations at the Machine Learning and Optimization Lab, advised by Tatjana Chavdarova and Sebastian U. Stich.

Before my M.Sc., I finished my Bachelor’s degree in Computer Engineering and Mathematics at Boğaziçi University, where I worked on adversarial robustness and orthogonality in neural networks, advised by İnci M. Baytaş, and on normalizing flows and unsupervised discovery of control for GANs, advised by Pınar Yanardağ.

Interests
  • Self-supervised learning
  • Meta-learning
Education
  • Ph.D. in EDIC, 2022-

    École Polytechnique Fédérale de Lausanne

  • M.Sc. in Data Science & Mathematics (minor), 2020-2022

    École Polytechnique Fédérale de Lausanne

  • B.Sc. in Computer Engineering & Mathematics, 2015-2020

    Boğaziçi University

Recent Publications

(2021). Semantic Perturbations with Normalizing Flows for Improved Generalization. ICCV.

(2021). Semantic Perturbations with Normalizing Flows for Improved Generalization. INNF+ Workshop, ICML.

(2021). [Re] Can gradient clipping mitigate label noise? ReScience C.

(2020). Adversarial Training with Orthogonal Regularization. SIU.