I am Futing Wang, a second-year Ph.D. student at LINs Lab, Westlake University, through a joint program with Zhejiang University, under the supervision of Prof. Tao Lin. Before that, I earned my Bachelor's degree at Shanghai University.
My research focuses on advancing large language models (LLMs), with particular emphasis on modular capabilities for language models, representation engineering, and efficient model adaptation and training techniques.
📝 Publications (* denotes equal contribution)
ICLR 2025

ELICIT: LLM Augmentation via External In-context Capability
Futing Wang*, Jianhao Yan*, Yue Zhang, Tao Lin
- We propose ELICIT, a framework that harnesses models' latent abilities without substantial additional computational cost, improving language models' performance, versatility, adaptability, and scalability.
📖 Education
- 2023/09 - present, Westlake University, School of Engineering.
- 2019/09 - 2023/06, Shanghai University, Computer Science and Technology.