An investigation of neural embeddings for coreference resolution

    Research output: Chapter in Book / Conference paper › Conference paper › peer-review

    1 Citation (Scopus)

    Abstract

    Coreference resolution is an important task in Natural Language Processing (NLP): it involves finding all the phrases in a document that refer to the same real-world entity, with applications in question answering and document summarisation. Work in deep learning has enabled the training of neural embeddings of words and sentences from unlabelled text. Word embeddings have been shown to capture syntactic and semantic properties of words and have been used in POS tagging and NER tagging to achieve state-of-the-art performance. The key contribution of this paper is therefore to investigate whether neural embeddings can be leveraged to overcome challenges associated with the scarcity of labelled coreference resolution datasets for benchmarking. We show, as a preliminary result, that neural embeddings improve the performance of a coreference resolver when compared to a baseline.
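
    To illustrate the general idea described in the abstract (not the authors' actual system), the sketch below uses word embeddings as features for a mention-pair coreference classifier. The toy embedding table, mention pairs, labels, and the logistic-regression classifier are all illustrative assumptions; in practice the vectors would come from a model such as word2vec trained on unlabelled text.

    ```python
    # Minimal sketch: word embeddings as mention-pair features for coreference.
    # All data below is hypothetical and for illustration only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Assumed pre-trained word vectors (toy 3-dimensional examples).
    EMB = {
        "obama": np.array([0.9, 0.1, 0.3]),
        "president": np.array([0.8, 0.2, 0.4]),
        "he": np.array([0.7, 0.1, 0.5]),
        "paris": np.array([0.1, 0.9, 0.2]),
        "city": np.array([0.2, 0.8, 0.3]),
    }

    def pair_features(antecedent, mention):
        """Concatenate the two mention vectors with their cosine similarity."""
        a, m = EMB[antecedent], EMB[mention]
        cos = float(a @ m / (np.linalg.norm(a) * np.linalg.norm(m)))
        return np.concatenate([a, m, [cos]])

    # Hypothetical labelled mention pairs: 1 = coreferent, 0 = not coreferent.
    pairs = [("obama", "he"), ("obama", "president"), ("paris", "he"), ("city", "obama")]
    labels = [1, 1, 0, 0]

    X = np.stack([pair_features(a, m) for a, m in pairs])
    clf = LogisticRegression().fit(X, labels)

    # Probability that a new mention pair is coreferent.
    print(clf.predict_proba([pair_features("paris", "city")])[0, 1])
    ```

    Any reasonable pairwise classifier could stand in for the logistic regression here; the point is only that embedding-based features supply lexical-semantic signal without requiring additional labelled coreference data.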
    Original language: English
    Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Place of publication: Berlin, Germany
    Publisher: Springer-Verlag London Ltd.
    Pages: 241-251
    Volume: 9041
    ISSN (Print): 0302-9743
    Publication status: Published - 2015
    Event: An investigation of neural embeddings for coreference resolution - Cairo, Egypt
    Duration: 1 Jan 2015 → …

    Conference

    Conference: An investigation of neural embeddings for coreference resolution
    Period: 1/01/15 → …
