Ting-Han Fan

I am a Member of Technical Staff on the Grok pre-training team at xAI. Previously, I was a Research Scientist at ByteDance Seed, working on LLM function calling across mid-training and post-training. Before that, I was a Machine Learning Engineer on the TikTok E-commerce Recommendation team, a Summer Associate at Goldman Sachs, and a Research Intern at Siemens and Microsoft.

I completed my M.A. and Ph.D. in Electrical and Computer Engineering at Princeton University, where I was very fortunate to be advised by Prof. Peter J. Ramadge. Prior to Princeton, I received my B.S. in Electrical Engineering from National Taiwan University.

email: tinghanfan at gmail dot com

Find me on Google Scholar and LinkedIn.



Publications:

At ByteDance Seed

Length Extrapolation of Transformers

Regular Language Reasoning

Reparameterization for Discrete Deep Generative Models

Reinforcement Learning for Power Distribution System Controls

Model-based Reinforcement Learning

(* denotes equal contribution)


Patent Applications:



Academic Activities:

Reviewer

  • NeurIPS 2022-2025, ICLR 2024-2026, ICML 2022-2025, ISIT 2024, L4DC 2023, AISTATS 2021 & 2024
  • ACL Rolling Review: December 2023 - December 2024

Teaching Assistant at Princeton University

  • ECE 435/535: Machine Learning and Pattern Recognition, Fall 2019, Fall 2020, Fall 2022 (head TA)
  • EGR 154: Foundations of Engineering: Linear Systems, Spring 2022
  • COS 302: Mathematics for Numerical Computing and Machine Learning, Fall 2021 (head TA)
  • SML 310: Research Projects in Data Science, Spring 2021 (head TA)
  • ECE 201: Information Signals, Spring 2020, Spring 2023