Jikai Jin

Ph.D. student

Stanford University


Welcome to my personal website! I am a first-year Ph.D. student at the Institute for Computational and Mathematical Engineering (ICME), Stanford University. Prior to joining Stanford, I obtained my bachelor's degree in computational mathematics from the School of Mathematical Sciences, Peking University, where I was fortunate to have Prof. Liwei Wang as my research advisor. My research is highly interdisciplinary, spanning machine learning, statistics, and operations research. While I currently focus on theoretical aspects, the ultimate goal of my research is to develop state-of-the-art solutions for important real-world problems. If you share similar interests, feel free to contact me via email or WeChat. Download my résumé.

Interests

  • Statistics (nonparametric estimation and causal inference)
  • Operations research (experimental design, uncertainty in decision-making)
  • Machine learning (theory, algorithms, societal aspects, and causal ML)

Education

  • Ph.D. in Computational and Mathematical Engineering, 2023 – 2028 (expected)

    Stanford University

  • B.Sc. in Computational Mathematics, 2019 – 2023

    Peking University

  • High School Diploma, 2017 – 2019

    No.2 High School of East China Normal University

Recent News


Nov. 2023 New paper Dichotomy of Early and Late Phase Implicit Biases Can Provably Induce Grokking with Kaifeng Lyu, Zhiyuan Li, Simon S. Du, Jason D. Lee and Wei Hu posted on arXiv! Any feedback or comments are welcome.

Nov. 2023 New paper Learning Causal Representations from General Environments: Identifiability and Intrinsic Ambiguity with Vasilis Syrgkanis posted on arXiv! Any feedback or comments are welcome.

Oct. 2023 Gave a talk at the 2023 INFORMS Annual Meeting based on our incremental learning paper. Had a wonderful time in Phoenix!

Sept. 2023 Arrived at Stanford and started my Ph.D. journey!

Jul. 2023 Graduated from Peking University! I will miss the time spent there.

Recent Publications

(2023). Dichotomy of Early and Late Phase Implicit Biases Can Provably Induce Grokking. In arXiv preprint arXiv:2311.18817.


(2023). Understanding Incremental Learning of Gradient Descent -- A Fine-grained Analysis of Matrix Sensing. In ICML 2023.


(2022). Minimax Optimal Kernel Operator Learning via Multilevel Training. In ICLR 2023 (spotlight).


(2022). Why Robust Generalization in Deep Learning is Difficult: Perspective of Expressive Power. In NeurIPS 2022.


(2021). Non-convex Distributionally Robust Optimization: Non-asymptotic Analysis. In NeurIPS 2021.


(2020). Improved analysis of clipping algorithms for non-convex optimization. In NeurIPS 2020.



Experience

Undergraduate Research Intern
advised by Prof. Liwei Wang (Peking University)
Feb 2020 – Present, Beijing, China
Worked on machine learning theory.

Awards and Honors


Jun. 2023 Huaixin Scholar, BICMR

Jun. 2023 Peking University Excellent Graduate.

Jan. 2023 SenseTime Scholarship (awarded to 30 Chinese undergraduate students in the field of AI).

Oct. 2022 Qin Wanshun-Jin Yunhui Scholarship, Peking University.

Oct. 2022 Merit Student, Peking University.

Oct. 2021 Exceptional Award for Academic Innovation, Peking University.

Jun. 2021 Elite Undergraduate Training Program of Applied Mathematics and Statistics.

May 2021 Bronze Medal, S.T. Yau College Student Mathematics Competition, Probability and Statistics (individual).

Dec. 2020 Qin Wanshun-Jin Yunhui Scholarship, Peking University.

Oct. 2020 Yizheng Scholarship, Peking University.

Feb. 2019 Silver Medal, 11th Romanian Master of Mathematics.

Oct. 2018 First Prize (ranked No.6), Chinese Mathematical Olympiad (CMO).

Oct. 2017 First Prize (ranked No.13), Chinese Mathematical Olympiad (CMO).