Peng Qi
Research, Learn, Code, and Play.

Brief Bio

I am currently a Ph.D. student at the Computer Science Department of Stanford University. I work in the NLP group, and I am advised by Prof. Chris Manning.

I am interested in building machine learning models that understand natural language in a similar fashion to humans. Outside of NLP research, I am broadly interested in presenting data in a more understandable manner, making technology appear less boring (to students, for example), and processing data with more efficient computation. I have previously worked on speech recognition and computer vision.

[CV (slightly outdated)]    [Publications]


Education & Professional Experience

2015.9 - Present    Ph.D. Student & Research Assistant, Computer Science Department, Stanford University
2016.4 - 2017.3    Master of Science, Department of Statistics, Stanford University
2013.9 - 2015.6    Master of Science & Research Assistant, Computer Science Department, Stanford University
2012.7 - 2013.6    Research Assistant, State Key Laboratory of Intelligent Technology & Systems, Department of Computer Science and Technology, Tsinghua University
2008.8 - 2012.7    Bachelor of Engineering, School of Software, Tsinghua University (Excellent Graduate)

Selected Publications

(See all)

  • Yuhao Zhang*, Peng Qi*, and Christopher D. Manning. Graph Convolution over Pruned Dependency Trees Improves Relation Extraction. Preprint. [PDF]  
  • Urvashi Khandelwal, He He, Peng Qi, and Dan Jurafsky. Sharp Nearby, Fuzzy Far Away: How Neural Language Models Use Context. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL), 2018. [PDF]   [BibTeX]   [Code]  
  • Peng Qi and Christopher D. Manning. Arc-swift: A Novel Transition System for Dependency Parsing. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL), 2017. [PDF]   [BibTeX]   [Code]  
  • Timothy Dozat, Peng Qi, and Christopher D. Manning. Stanford's Graph-based Neural Dependency Parser at the CoNLL 2017 Shared Task. In Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies. 1st place. [PDF]   [BibTeX]   [Code]  
  • Yuhao Zhang*, Arun Chaganty*, Ashwin Paranjape*, Danqi Chen*, Jason Bolton*, Peng Qi, and Christopher D. Manning. Stanford at TAC KBP 2016: Sealing Pipeline Leaks and Understanding Chinese. In Text Analysis Conference (TAC) Proceedings, 2016. [PDF]   [BibTeX]  
  • Andrew L. Maas, Peng Qi, Ziang Xie, Awni Y. Hannun, Christopher T. Lengerich, Daniel Jurafsky, and Andrew Y. Ng. Building DNN Acoustic Models for Large Vocabulary Speech Recognition. Computer Speech & Language, 2016. [PDF]   [BibTeX]  
  • Peng Qi and Xiaolin Hu. Learning Nonlinear Regularities in Natural Images by Modeling the Outer Product of Image Intensities. Neural Computation, 2013. [PDF]   [BibTeX]   [Code]  
  • Xiaolin Hu, Peng Qi, and Bo Zhang. Hierarchical K-Means Algorithm for Modeling Visual Area V2 Neurons. In Neural Information Processing, pp. 373–381, 2012. Best Paper Award. [PDF]   [BibTeX]  

Honors and Awards

  • 2012 Outstanding Graduate of Tsinghua University
  • 2011 National Scholarship (top 3% students)
  • 2010 Citibank Scholarship (for overall excellence)
  • 2009 Ge-Ru Zheng’s Scholarship (for study excellence)
  • 2008-2011 Freshman Scholarship (Ranked 3rd of Guizhou Province in National College Entrance Exam)


Last Updated: Jan. 21, 2015