Peng Qi

齐鹏

(pinyin: qí péng; IPA: /tɕʰǐ pʰə̌ŋ/)

I am a researcher at JD AI Research working on natural language processing and machine learning.

My research is driven by the goal of bringing the world’s knowledge to the user’s assistance, which manifests in two main directions:

  • How to effectively organize and use knowledge. This involves tasks like question answering (where I have co-led the development of benchmarks for complex reasoning: HotpotQA and BeerQA), information extraction, syntactic analysis for many languages (check out Stanza, my go-to procrastination project), etc.
  • How to effectively communicate knowledge. This mainly concerns interactive NLP systems such as conversational systems, where I am interested in theory-of-mind reasoning under information asymmetry, offline-to-online transfer, multi-modal interactions, etc.

Across these tasks, I am also excited to explore data-efficient models and training techniques, model explainability, and self-supervised learning methods that help us address these problems.

Before joining JD, I obtained my Ph.D. in Computer Science at Stanford University, advised by Prof. Chris Manning, where I was a member of the NLP group. I also obtained two Master’s degrees at Stanford (CS & Statistics) and my Bachelor’s degree at Tsinghua University.

[CV (slightly outdated)]

selected publications

(*=equal contribution)

  1. EMNLP
    Answering Open-Domain Questions of Varying Reasoning Steps from Text
    Peng Qi*, Haejun Lee*, Oghenetegiri "TG" Sido*, and Christopher D. Manning
    In Empirical Methods in Natural Language Processing (EMNLP), 2021.
  2. AKBC
    Open Temporal Relation Extraction for Question Answering
    Chao Shang, Peng Qi, Guangtao Wang, Jing Huang, Youzheng Wu, and Bowen Zhou
    In the 3rd Conference on Automated Knowledge Base Construction (AKBC), 2021.
  3. arXiv
    Conversational AI Systems for Social Good: Opportunities and Challenges
    Peng Qi, Jing Huang, Youzheng Wu, Xiaodong He, and Bowen Zhou
    arXiv preprint arXiv:2105.06457, 2021.
  4. NAACL
    Graph Ensemble Learning over Multiple Dependency Trees for Aspect-level Sentiment Classification
    Xiaochen Hou, Peng Qi, Guangtao Wang, Rex Ying, Jing Huang, Xiaodong He, and Bowen Zhou
    In the 2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), 2021.
  5. EACL
    Do Syntax Trees Help Pre-trained Transformers Extract Information?
    Devendra Singh Sachan, Yuhao Zhang, Peng Qi, and William Hamilton
    In the 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2021.
  6. ACL (Demo)
    Stanza: A Python Natural Language Processing Toolkit for Many Human Languages
    Peng Qi*, Yuhao Zhang*, Yuhui Zhang, Jason Bolton, and Christopher D. Manning
    In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations, 2020.
  7. Findings
    Stay Hungry, Stay Focused: Generating Informative and Specific Questions in Information-Seeking Conversations
    Peng Qi, Yuhao Zhang, and Christopher D. Manning
    In Findings of the Association for Computational Linguistics: EMNLP 2020, 2020.
  8. EMNLP
    HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering
    Zhilin Yang*, Peng Qi*, Saizheng Zhang*, Yoshua Bengio, William W. Cohen, Ruslan Salakhutdinov, and Christopher D. Manning
    In Empirical Methods in Natural Language Processing (EMNLP), 2018.
  9. EMNLP
    Graph Convolution over Pruned Dependency Trees Improves Relation Extraction
    Yuhao Zhang*, Peng Qi*, and Christopher D. Manning
    In Empirical Methods in Natural Language Processing (EMNLP), 2018.