I am a researcher at JD AI Research working on natural language processing and machine learning. If you are interested in exploring research opportunities with us (internship or full-time), don’t hesitate to reach out!
My research goal is to build explainable machine learning systems that help us solve problems efficiently using textual knowledge. I believe that AI systems should be able to explain their computational decisions in a human-understandable manner, so as to build trust in their application to real-world problems. To this end, I have recently been working on natural language processing (NLP) techniques that help us answer complex questions from textual knowledge through explainable multi-step reasoning, as well as models that reason pragmatically about the knowledge of their interlocutors for efficient communication in dialogues, among other directions.
Outside of NLP research, I am broadly interested in presenting data in a more understandable manner, making technology appear less boring (to students, for example), and processing data with more efficient computation. I have also worked on speech recognition and computer vision previously.
When I procrastinate in my research life, I write code for Stanza, a Python natural language processing toolkit available for a few dozen (human) languages.
publications
- [RepL4NLP] Entity and Evidence Guided Document-Level Relation Extraction. In 6th Workshop on Representation Learning for NLP (RepL4NLP) at ACL 2021, 2021.
- [TextGraphs] Selective Attention Based Graph Convolutional Networks for Aspect-Level Sentiment Classification. In TextGraphs-15 at NAACL 2021, 2021.
- [arXiv] Conversational AI Systems for Social Good: Opportunities and Challenges. arXiv preprint arXiv:2105.06457, 2021.
- [JAMIA] Biomedical and Clinical English Model Packages for the Stanza Python NLP Library. Journal of the American Medical Informatics Association, 2021.
- [NAACL] Graph Ensemble Learning over Multiple Dependency Trees for Aspect-level Sentiment Classification. In 2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), 2021.
- [EACL] Do Syntax Trees Help Pre-trained Transformers Extract Information? In The 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2021.
- [arXiv] Retrieve, Read, Rerank, then Iterate: Answering Open-Domain Questions of Varying Reasoning Steps from Text. arXiv preprint arXiv:2010.12527, 2020.
- [PhD thesis] Explainable and Efficient Knowledge Acquisition from Text. Stanford University, 2020.
- [ACL (Demo)] Stanza: A Python Natural Language Processing Toolkit for Many Human Languages. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations, 2020.
- [Findings] Stay Hungry, Stay Focused: Generating Informative and Specific Questions in Information-Seeking Conversations. In Findings of the Association for Computational Linguistics: EMNLP 2020, 2020.
- [AlexaPrize] Neural Generation Meets Real People: Towards Emotionally Engaging Mixed-Initiative Conversations. In 3rd Proceedings of Alexa Prize (Alexa Prize 2019), 2020.
- [EMNLP-IJCNLP] Answering Complex Open-domain Questions Through Iterative Query Generation. In 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019.
- [ACL] Sharp Nearby, Fuzzy Far Away: How Neural Language Models Use Context. In Annual Meeting of the Association for Computational Linguistics (ACL), 2018.
- [EMNLP] HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering. In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2018.
- [EMNLP] Graph Convolution over Pruned Dependency Trees Improves Relation Extraction. In Empirical Methods in Natural Language Processing (EMNLP), 2018.