McCourt School of Public Policy
Massive Data Institute
Linkage Seminar

AI and duplicate detection to leverage external data sources – David Beauchemin

David Beauchemin discussed how duplicate detection, combined with AI and NLP techniques, can extract information from external data sources, using text distance metrics (e.g., Jaro similarity) and classification algorithms. He demonstrated the proposed approach with a case study from insurance.
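To illustrate the kind of text distance metric mentioned above, here is a minimal, self-contained sketch of the Jaro similarity and a simple threshold-based duplicate check. This is a generic implementation for illustration only, not David Beauchemin's code; the 0.9 threshold is an arbitrary assumption, not a value from the talk.

```python
def jaro(s1: str, s2: str) -> float:
    """Jaro similarity between two strings, in [0, 1]."""
    if s1 == s2:
        return 1.0
    if not s1 or not s2:
        return 0.0
    # Characters match if equal and within this window of each other.
    window = max(max(len(s1), len(s2)) // 2 - 1, 0)
    m1 = [False] * len(s1)  # matched flags for s1
    m2 = [False] * len(s2)  # matched flags for s2
    matches = 0
    for i, c in enumerate(s1):
        lo, hi = max(0, i - window), min(len(s2), i + window + 1)
        for j in range(lo, hi):
            if not m2[j] and s2[j] == c:
                m1[i] = m2[j] = True
                matches += 1
                break
    if matches == 0:
        return 0.0
    # Count transpositions: matched characters out of order.
    t, k = 0, 0
    for i in range(len(s1)):
        if m1[i]:
            while not m2[k]:
                k += 1
            if s1[i] != s2[k]:
                t += 1
            k += 1
    t //= 2
    return (matches / len(s1) + matches / len(s2)
            + (matches - t) / matches) / 3


def looks_like_duplicate(a: str, b: str, threshold: float = 0.9) -> bool:
    """Flag two records as probable duplicates (threshold is illustrative)."""
    return jaro(a.lower(), b.lower()) >= threshold
```

In a full pipeline such similarity scores would typically be computed over several record fields and fed to a classification algorithm, rather than compared against a single fixed threshold.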

David Beauchemin: An actuary by training, David continued his studies at the master's level in computer science to become familiar with machine learning. His master's research in natural language processing (NLP) focused on extracting information from external sources in an insurance business process. He is now pursuing a Ph.D., where he is interested in personalizing automatically generated content from insurance contracts.

Watch the Zoom recording here.
