Lunch at 12:30pm, (virtual) talk at 1pm, in 148 Fitzpatrick

Title: Deep Learning On Graphs for Natural Language Processing

Abstract: Owing to its power in modeling non-Euclidean data such as graphs and manifolds, deep learning on graphs (i.e., Graph Neural Networks (GNNs)) has opened a new door to solving challenging graph-related NLP problems. There has been a surge of interest in applying these techniques to NLP, with considerable success across many tasks, ranging from classification tasks such as sentence classification, semantic role labeling, and relation extraction, to generation tasks such as machine translation, question generation, and summarization. Despite these successes, deep learning on graphs for NLP still faces many challenges, including automatically transforming the original text sequence data into highly graph-structured data, and effectively modeling mappings between graph-based inputs and other highly structured outputs such as sequences, trees, and graphs with multiple node and edge types. In this talk, I will cover relevant and interesting topics on applying deep learning on graph techniques to NLP, ranging from foundations to applications.
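To make the abstract's pipeline concrete, here is a minimal sketch (not the speaker's actual method) of the two ideas it mentions: representing a sentence as a graph over tokens, and running one round of GNN-style message passing over it. The tokens, edges, and layer sizes are hypothetical, chosen purely for illustration.

```python
import numpy as np

# A toy sentence as a graph: nodes are tokens, edges are hand-written
# dependency-style links (hypothetical; a real system would derive these
# from a parser or learn them from the text automatically).
tokens = ["graphs", "help", "model", "language"]
edges = [(0, 1), (2, 1), (3, 2)]  # (child, head) pairs

# Build a symmetric adjacency matrix with self-loops.
n = len(tokens)
A = np.eye(n)
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Random initial node features (in practice, word embeddings).
rng = np.random.default_rng(0)
H = rng.normal(size=(n, 8))

# One GNN layer: mean-aggregate each node's neighborhood (including
# itself via the self-loop), then apply a linear map and a ReLU.
W = rng.normal(size=(8, 8))
deg = A.sum(axis=1, keepdims=True)
H_next = np.maximum((A @ H / deg) @ W, 0.0)

# Each token now carries a neighborhood-aware representation that a
# downstream classifier or decoder could consume.
print(H_next.shape)
```

Stacking several such layers lets information propagate along longer graph paths, which is the basic mechanism behind the classification and generation applications listed above.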

Bio: I am currently a Principal Scientist at the JD.COM Silicon Valley Research Center, leading a team of 30+ machine learning, natural language processing, and recommendation system scientists and software engineers to build next-generation intelligent e-commerce systems for a personalized and interactive online shopping experience at JD.COM. Previously, I was a research staff member at IBM Research, where I led a research team developing novel Graph Neural Networks for various AI tasks; this work led to the #1 AI Challenge Project in IBM Research and multiple IBM awards, including an Outstanding Technical Achievement Award.

My research interests lie at the intersection of Machine Learning (Deep Learning), Representation Learning, and Natural Language Processing, with a particular emphasis on the fast-growing subject of Graph Neural Networks and their extensions to new application domains. I have published more than 90 papers at top-ranked AI/ML/NLP conferences and in journals, including NIPS, ICML, ICLR, KDD, ACL, EMNLP, NAACL, IJCAI, and AAAI. I am also a co-inventor on more than 40 filed US patents. Because of the commercial value of my patents, I received several invention achievement awards and was named an IBM Master Inventor, class of 2020. I was the recipient of Best Paper and Best Student Paper Awards at several conferences, including IEEE ICC’19, the AAAI workshop on DLGMA’20, and the KDD workshop on DLG’19. My research has been featured in numerous media outlets, including Nature News, Yahoo News, VentureBeat, TechTalks, SyncedReview, Leiphone, QbitAI, MIT News, IBM Research News, and SIAM News.