
Zhilin Yang

SQZ PI (July 2021 to present)

Biography

Shanghai Qi Zhi Research Institute PI.

Zhilin Yang focuses on achieving cognitive intelligence through natural language processing. As first author, he has published influential works such as Transformer-XL and XLNet, which have had a major impact on the field of natural language processing, and his work has over 9,600 citations on Google Scholar. He completed his undergraduate studies at Tsinghua University and earned his Ph.D. from Carnegie Mellon University in the United States.

Personal Honors:

Forbes China 30 Under 30

Forbes 30 Under 30 Asia


Research Direction

Large-scale pretraining

Natural language processing

Few-shot learning / Zero-shot learning

Generative models

Sequence models

Multimodal learning


Paper/Publication

8. Jing Zhou, Zongyu Lin, Yanan Zheng, Jian Li, Zhilin Yang, Not All Tasks Are Born Equal: Understanding Zero-Shot Generalization, International Conference on Learning Representations (ICLR), 2023 [View PDF]


7. Nan Shao, Zefan Cai, Chonghua Liao, Yanan Zheng, Zhilin Yang, Compositional Task Representations for Large Language Models, International Conference on Learning Representations (ICLR), 2023 [View PDF]


6. Haike Xu, Zongyu Lin, Jing Zhou, Yanan Zheng, Zhilin Yang, A Universal Discriminator for Zero-Shot Generalization, Annual Meeting of the Association for Computational Linguistics (ACL), 2023 [View PDF]


5. Xingcheng Yao, Yanan Zheng, Xiaocong Yang, Zhilin Yang, NLP From Scratch Without Large-Scale Pretraining: A Simple and Efficient Framework, International Conference on Machine Learning (ICML), 2023 [View PDF]


4. Zhengxiao Du, Yujie Qian, Xiao Liu, Ming Ding, Jiezhong Qiu, Zhilin Yang, Jie Tang, GLM: General Language Model Pretraining with Autoregressive Blank Infilling, Annual Meeting of the Association for Computational Linguistics (ACL), 2022 [View PDF]


3. Xiao Liu, Kaixuan Ji, Yicheng Fu, Weng Lam Tam, Zhengxiao Du, Zhilin Yang, Jie Tang, P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks, Annual Meeting of the Association for Computational Linguistics (ACL), 2022 [View PDF]


2. Yanan Zheng, Jing Zhou, Yujie Qian, Ming Ding, Chonghua Liao, Jian Li, Ruslan Salakhutdinov, Jie Tang, Sebastian Ruder, Zhilin Yang, FewNLU: Benchmarking State-of-the-Art Methods for Few-Shot Natural Language Understanding, Annual Meeting of the Association for Computational Linguistics (ACL), 2022 [View PDF]


1. Jing Zhou, Yanan Zheng, Jie Tang, Jian Li, Zhilin Yang, FlipDA: Effective and Robust Data Augmentation for Few-Shot Learning, Annual Meeting of the Association for Computational Linguistics (ACL), 2022 [View PDF]