Jad is a research scientist at CCC. He received his PhD in Computer Science in May 2022 from McGill University and the Montreal Institute for Learning Algorithms (Mila). Before that, he received his Master's degree from McGill University in 2014 and his Bachelor's degree from the American University of Beirut in 2011. His PhD research was in the broad area of Natural Language Processing, specifically at the intersection of computational pragmatics, natural language generation, and natural language understanding.
In his PhD, Jad worked on the computational modeling of presuppositions in natural language. Presuppositions are shared assumptions and facts that are taken for granted but not explicitly stated in the context (whether in texts or conversations). For example, if we say in a conversation "Roger Federer won the match", we presuppose that he played a match (which he won), even though that fact is not explicitly stated. In his PhD, Jad presented various neural models for learning presupposition effects in language (e.g., definite descriptions, adverbial presuppositions) and showed how such models can be used to improve the quality of extractive summaries. He also investigated large transformer-based models (e.g., BERT, RoBERTa) in the context of NLI to understand how well they perform on hard cases of presupposition, and presented learning frameworks to help improve their performance on such cases. His work was recognized with the ACL 2018 Best Paper Award and the COLING 2022 Best Short Paper Award.
When not at CCC, you can probably find him at the MIT Zesiger Center lifting weights! Jad is also a big fan of The Office and thinks it is the best show ever (yes, better than Friends!).