Publications
Clustering-Oriented Representation Learning with Attractive-Repulsive Loss. Kian Kenyon-Dean, Andre Cianflone, Lucas Page-Caccia, Guillaume Rabusseau, Jackie Chi Kit Cheung, Doina Precup. AAAI 2019 Workshop on Network Interpretability for Deep Learning, Honolulu, Hawaii, January 28, 2019. Link to the paper and conference/workshop website.
Clusterable Latent Spaces in Neural Networks. Kian Kenyon-Dean, Jackie Cheung, Doina Precup. 2018 Montreal AI Symposium, Montreal, August 28, 2018. See my poster here, and the list of accepted papers here.
Resolving Event Coreference with Supervised Representation Learning and Clustering-Oriented Regularization. Kian Kenyon-Dean, Jackie Cheung, Doina Precup. Proceedings of the 7th Joint Conference on Lexical and Computational Semantics (*SEM 2018), New Orleans, Louisiana, June 5-6, 2018. Link to paper; for more details, see conference website.
Sentiment Analysis: It's Complicated! Kian Kenyon-Dean*, Eisha Ahmed*, Scott Fujimoto, Jeremy Georges-Filteau, Christopher Glasz, Barleen Kaur, Auguste Lalande, Shruti Bhanderi, Robert Belfer, Nirmal Kanagasabai, Roman Sarrazingendron, Rohit Verma, and Derek Ruths. Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2018), New Orleans, Louisiana, June 1-6, 2018. Link to paper here. (* equal contribution)
Verb Phrase Ellipsis Resolution Using Discriminative and Margin-Infused Algorithms. Kian Kenyon-Dean, Jackie Cheung, and Doina Precup. Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pages 1734–1743, Austin, Texas, November 1-5, 2016. Link to paper.
Current Research Interests
My current research investigates the relationship between matrix factorization and language modelling as two routes to producing word embeddings. More generally, I am interested in how different loss functions impose different structures on latent feature spaces; this interest runs through much of my research, including my recently accepted workshop paper on clusterable latent spaces. A minimal sketch of the matrix-factorization view of word embeddings appears below.
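The sketch below is only an illustration of this connection, not the method from any of the publications above. It follows the well-known observation (Levy and Goldberg, 2014) that skip-gram with negative sampling implicitly factorizes a shifted PMI matrix; here, a PPMI matrix is factorized explicitly with truncated SVD. The toy corpus, window size, and embedding dimension are hypothetical choices made for the example.

```python
import numpy as np
from collections import Counter

# Hypothetical toy corpus; any tokenized text works.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "cats and dogs are animals".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count symmetric word-context co-occurrences within a +/-2 token window.
window = 2
pair_counts = Counter()
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                pair_counts[(idx[w], idx[sent[j]])] += 1

C = np.zeros((len(vocab), len(vocab)))
for (i, j), count in pair_counts.items():
    C[i, j] = count

# Positive PMI: PPMI(w, c) = max(0, log P(w, c) / (P(w) P(c))).
total = C.sum()
Pw = C.sum(axis=1, keepdims=True) / total
Pc = C.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((C / total) / (Pw @ Pc))
ppmi = np.maximum(pmi, 0.0)
ppmi[~np.isfinite(ppmi)] = 0.0  # zero out cells with no co-occurrence

# Truncated SVD of the PPMI matrix; rows of U * sqrt(S) serve as embeddings.
U, S, Vt = np.linalg.svd(ppmi)
k = 4  # embedding dimension (hypothetical choice)
embeddings = U[:, :k] * np.sqrt(S[:k])

print({w: embeddings[idx[w]].round(3) for w in ["cat", "dog"]})
```

In this view, the choice of matrix (PPMI vs. shifted PMI) and of factorization objective plays the role that the loss function plays in a neural language model, which is one way the structure of the resulting latent space gets determined.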