Leveraging Distant Supervision for Word Representations
Abbas Ghaddar (abbassgh_1993 <at> hotmail (dot) com)
RALI, DIRO
April 10, 2018, at 9:30 a.m. (note the unusual date AND time!)
Room 3195, Pavillon André-Aisenstadt
The scarcity of annotated data in Natural Language Processing (NLP) has tied the learning of high-quality word representations to unsupervised tasks such as context prediction and language modeling. The main reason is the absence of large-scale manually labelled data for most NLP tasks. Distant supervision techniques are promising because they can be used to overcome this lack of large-scale labelled data in NLP applications. Although models trained on distantly supervised data improve general-domain performance, they perform poorly in domain-specific evaluations. We therefore propose to leverage distant supervision to learn word representations that can later be used as features. We argue that, given a massive amount of labelled data, good representations can be learned on downstream supervised tasks such as entity typing and relation extraction. To the best of our knowledge, this exact subject has not yet been explored, so we will tackle it in our research. Looking at future directions, we identified two main tasks: (1) generating large-scale labelled data following distant supervision approaches, where the objective is to gather as many annotations as possible while introducing as little noise as possible; (2) leveraging this massive amount of distantly supervised data to learn word representations via supervised tasks (both tasks are sketched below). We will focus on three downstream tasks: Named Entity Recognition, Fine-Grained Entity Typing and Relation Extraction. Our preliminary experimental results show that representations learned with distant supervision data are complementary to those learned by other methods.
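As an illustration of task (1), the sketch below shows the core idea behind distant supervision for sequence labelling: project a knowledge-base dictionary of typed entities onto raw text to obtain BIO tags without any manual annotation. The toy dictionary, entity types and sentence are illustrative assumptions, not the speaker's actual resources or pipeline.

# Minimal sketch of distant supervision for sequence labelling:
# match a (toy) knowledge-base dictionary of typed entity mentions
# against raw tokens to produce BIO tags, with no human annotation.
# The dictionary and sentence below are illustrative assumptions.

from typing import Dict, List, Tuple

# Hypothetical entity dictionary: surface form -> entity type.
KB: Dict[Tuple[str, ...], str] = {
    ("Montreal",): "LOC",
    ("University", "of", "Montreal"): "ORG",
}

def distant_tag(tokens: List[str]) -> List[str]:
    """Greedily tag the longest dictionary match at each position."""
    tags = ["O"] * len(tokens)
    i = 0
    while i < len(tokens):
        best = None
        for surface, etype in KB.items():
            n = len(surface)
            if tuple(tokens[i:i + n]) == surface:
                if best is None or n > len(best[0]):
                    best = (surface, etype)
        if best is not None:
            surface, etype = best
            tags[i] = "B-" + etype
            for j in range(i + 1, i + len(surface)):
                tags[j] = "I-" + etype
            i += len(surface)
        else:
            i += 1
    return tags

sentence = "He studies at the University of Montreal in Montreal".split()
print(list(zip(sentence, distant_tag(sentence))))

In practice such dictionary projection is noisy (ambiguous mentions, missing entities), which is precisely the annotation-versus-noise trade-off mentioned in task (1).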
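For task (2), the following minimal sketch (an assumed PyTorch setup, not the speaker's actual model) shows how word representations can emerge as a by-product of a supervised tagging task trained on distantly supervised labels: the embedding layer is optimized through the tagging objective, and its weights can then be extracted and reused as features.

# Minimal sketch (assumed setup): learn word embeddings as a
# by-product of a supervised tagging task trained on distant labels.
# All sizes and data below are toy assumptions for illustration.

import torch
import torch.nn as nn

VOCAB_SIZE, EMB_DIM, NUM_TAGS = 1000, 50, 5  # toy sizes

class Tagger(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB_SIZE, EMB_DIM)  # representations being learned
        self.out = nn.Linear(EMB_DIM, NUM_TAGS)       # per-token tag classifier

    def forward(self, token_ids):
        return self.out(self.emb(token_ids))

model = Tagger()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One toy training step on fake distantly labelled data.
tokens = torch.randint(0, VOCAB_SIZE, (32, 20))  # batch of token ids
labels = torch.randint(0, NUM_TAGS, (32, 20))    # distant-supervision tags
optim.zero_grad()
logits = model(tokens)
loss = loss_fn(logits.view(-1, NUM_TAGS), labels.view(-1))
loss.backward()
optim.step()

# After training, the embedding matrix holds task-informed word
# representations that can be reused as features in other models.
embeddings = model.emb.weight.detach()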
To receive the weekly announcements by email, visit http://rali.iro.umontreal.ca/rali/?q=fr/node/1631