Jiacheng Chen

Welcome!

I am an incoming PhD student at the University of Toronto. I am currently a senior undergraduate student in the School of Computer Science and Technology, South China University of Technology. I am honored to be advised by Prof. Yue-Jiao Gong, and fortunate to work closely with Zeyuan Ma and Hongshu Guo. We co-founded the MetaEvo research team, which aims to further explore Meta-Black-Box Optimization.

I was a SURF fellow in the Computing + Mathematical Sciences (CMS) Department at Caltech, where I was honored to be advised by Prof. Yisong Yue and Dr. Kaiyu Yang and worked on an AI4Math research project.

My research interests include:

  • AI for Math (Formal Theorem Proving; Reasoning; …)
  • Reinforcement Learning
  • Learning to Optimize

🔥 News

๐Ÿ“ Publications

Preprint

The Entropy Mechanism of Reinforcement Learning for Reasoning Language Models

Ganqu Cui*, Yuchen Zhang*, Jiacheng Chen*, Lifan Yuan, Zhi Wang, Yuxin Zuo, Haozhan Li, Yuchen Fan, Huayu Chen, Weize Chen, Zhiyuan Liu, Hao Peng, Lei Bai, Wanli Ouyang, Yu Cheng, Bowen Zhou, Ning Ding.

  • We conducted empirical and theoretical analyses of the "entropy collapse" phenomenon and proposed a new approach to entropy control.
NeurIPS 2024 Workshop MATH-AI

Reasoning in Reasoning: A Hierarchical Framework for Neural Theorem Proving (NeurIPS 2024 Workshop MATH-AI)

Ziyu Ye, Jiacheng Chen, Jonathan Light, Yifei Wang, Jiankai Sun, Mac Schwager, Philip Torr, Guohao Li, Yuxin Chen, Kaiyu Yang, Yisong Yue, Ziniu Hu.

ICLR 2024

SYMBOL: Generating Flexible Black-Box Optimizers through Symbolic Equation Learning (ICLR 2024)

Jiacheng Chen*, Zeyuan Ma*, Hongshu Guo, Yining Ma, Jie Zhang, Yue-Jiao Gong.

Project

  • Unlike previous methods that incrementally auto-configure existing black-box algorithms, SYMBOL directly generates stepwise update rules in the form of symbolic equations, achieving more flexible and interpretable optimization behaviour.
Preprint

LLaMoCo: Instruction Tuning of Large Language Models for Optimization Code Generation

Zeyuan Ma, Hongshu Guo, Jiacheng Chen, Guojun Peng, Zhiguang Cao, Yining Ma, Yue-Jiao Gong.

  • We propose fine-tuning large language models to generate executable code for optimization tasks. In this paper, we introduce a dataset containing diverse optimization problems paired with corresponding algorithms, apply several techniques during training, and release a fine-tuned LM for optimization tasks.
NeurIPS 2023

MetaBox: A Benchmark Platform for Meta-Black-Box Optimization with Reinforcement Learning (NeurIPS 2023 Oral)

Zeyuan Ma, Hongshu Guo, Jiacheng Chen, Zhenrui Li, Guojun Peng, Yue-Jiao Gong, Yining Ma, Zhiguang Cao.

Project

  • We released MetaBox, a benchmark platform for Meta-Black-Box Optimization. It integrates three test suites, about 20 baselines covering both traditional black-box methods and Meta-Black-Box methods, and new evaluation metrics tailored to Meta-Black-Box Optimization. The codebase can be found here.
ICLR 2025

Neural Exploratory Landscape Analysis (ICLR 2025)

Zeyuan Ma, Jiacheng Chen, Hongshu Guo, Yue-Jiao Gong.

  • We developed a neural-network-based landscape analyser (NeurELA) to replace the feature-extraction components of Meta-Black-Box works, which are usually designed manually. To ensure its generalization ability, NeurELA operates in a multi-task setting and is trained with neuroevolution.
GECCO 2024

Auto-configuring Exploration-Exploitation Tradeoff in Evolutionary Computation via Deep Reinforcement Learning (GECCO 2024)

Zeyuan Ma*, Jiacheng Chen*, Hongshu Guo, Yining Ma, Yue-Jiao Gong.

  • We explore how to trade off exploration and exploitation in black-box optimization through a learning-based method. In this work, we designed a framework built on a Transformer-based model that leverages exploration-exploitation features tailored to the black-box optimization scenario.

🎖 Honors and Awards

  • 2024, Caltech Summer Undergraduate Research Fellowship (SURF).

📖 Education

  • 2021.09 - present, School of Computer Science and Technology, South China University of Technology.

💻 Research Experience

  • 2022.03 - 2023.03, SRP, SCUT.