News

Andrei's Talk on Fast Dependency Parsing
13 Sep 2017

In the lab meeting on September 14 (Thursday), Andrei will talk about his thesis work on fast dependency parsing using neural networks and transition-based parsing systems.
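
For readers new to the area, here is a minimal sketch of the arc-standard transition system that transition-based dependency parsers are commonly built on (a generic illustration, not necessarily the exact system in Andrei's thesis). In a neural parser, a trained network scores the candidate actions at each step instead of reading them from a gold sequence:

```python
# Minimal arc-standard transition system for dependency parsing.
# A trained classifier (e.g. a feedforward net over stack/buffer
# features) would choose the action at each step; here the action
# sequence is supplied by the caller.

def parse(words, actions):
    """Apply SHIFT / LEFT-ARC / RIGHT-ARC actions; return (head, dep) arcs."""
    stack, buffer, arcs = [], list(range(len(words))), []
    for action in actions:
        if action == "SHIFT":            # move the next word onto the stack
            stack.append(buffer.pop(0))
        elif action == "LEFT-ARC":       # top of stack heads second-from-top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif action == "RIGHT-ARC":      # second-from-top heads top of stack
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# "economic news affects markets": a plausible gold action sequence
words = ["economic", "news", "affects", "markets"]
actions = ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "LEFT-ARC",
           "SHIFT", "RIGHT-ARC"]
print(parse(words, actions))  # [(1, 0), (2, 1), (2, 3)]
```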

Vivian Kou MSc Thesis Defence
12 Sep 2017

On September 12th at 2pm in TASC1 9204 West, Vivian Kou will defend her MSc thesis on the topic of “Speed versus Accuracy in Neural Sequence Tagging for Natural Language Processing”.

Abstract:

Sequence tagging, including part-of-speech tagging and named entity recognition, is an important task in NLP. Recurrent neural network models such as Bidirectional LSTMs have produced impressive results on sequence tagging. In this work, we first present a simple and fast greedy sequence tagging system using different types of feedforward neural network models. Then we compare the speed and accuracy of Bidirectional LSTMs and feedforward models. Besides the feedforward and Bidirectional LSTM models, we propose two new models based on Mention2Vec by Stratos (2016): Feedforward-Mention2Vec for named entity recognition and BPE-Mention2Vec for part-of-speech tagging. Feedforward-Mention2Vec predicts named entity boundaries first and then predicts the types of the named entities. BPE-Mention2Vec first uses the Byte Pair Encoding algorithm to segment the words in a sequence and then predicts the part-of-speech tags for the subword spans. We carefully design the experiments to demonstrate the speed and accuracy trade-off in the different models. The empirical results reveal that feedforward models can achieve accuracy comparable to recurrent models at faster speeds for part-of-speech tagging, and that Feedforward-Mention2Vec is competitive with the fully structured BiLSTM model for named entity recognition while being more scalable in the number of named entity types.
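
For readers unfamiliar with Byte Pair Encoding, which BPE-Mention2Vec builds on, here is a toy sketch of the BPE merge loop (a generic illustration of the algorithm, not code from the thesis):

```python
# Toy Byte Pair Encoding (in the general style of Sennrich et al., 2016):
# repeatedly merge the most frequent adjacent symbol pair in the corpus.
from collections import Counter

def merge_word(word, pair):
    """Replace every occurrence of `pair` in a symbol tuple with one symbol."""
    out, i = [], 0
    while i < len(word):
        if i + 1 < len(word) and (word[i], word[i + 1]) == pair:
            out.append(word[i] + word[i + 1])
            i += 2
        else:
            out.append(word[i])
            i += 1
    return tuple(out)

def learn_bpe(words, num_merges):
    vocab = Counter(tuple(w) for w in words)   # words as character tuples
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for word, count in vocab.items():
            for pair in zip(word, word[1:]):
                pairs[pair] += count
        if not pairs:
            break
        best = max(pairs, key=pairs.get)       # most frequent adjacent pair
        merges.append(best)
        new_vocab = Counter()
        for word, count in vocab.items():
            new_vocab[merge_word(word, best)] += count
        vocab = new_vocab
    return merges

print(learn_bpe(["low", "lower", "lowest", "newest", "widest"], 4))
# e.g. [('l', 'o'), ('lo', 'w'), ('e', 's'), ('es', 't')]
```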

M.Sc. Examining Committee:

  • Dr. Anoop Sarkar, Senior Supervisor
  • Dr. Fred Popowich, Supervisor
  • Dr. Jiannan Wang, Internal Examiner
  • Dr. Arrvindh Shriraman, Chair

Vivian's Talk on Transition-Based Feed-Forward Neural Networks for Sequence Tagging
10 Apr 2017

In this week’s lab meeting, Vivian will talk about using transition-based feed-forward neural networks for sequence tagging (part-of-speech tagging, chunking, and named entity recognition). While existing LSTM networks have produced impressive results on such tasks, simple feed-forward neural networks can achieve comparable results. Finding the right balance between speed and accuracy is a central interest of this project.
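
As a rough illustration of the greedy, transition-based setup (a hypothetical sketch, not Vivian's actual model), the tagger below moves left to right and scores each position's tags from a word window plus the previously predicted tag:

```python
# Hypothetical greedy feedforward tagger: tag left to right, scoring
# each position from a 3-word window plus the previous predicted tag.
# Weights are random stand-ins; a real tagger would train them.
import numpy as np

rng = np.random.default_rng(0)
vocab = {"<pad>": 0, "the": 1, "dog": 2, "barks": 3}
tags = ["DET", "NOUN", "VERB"]
D, H = 8, 16                                  # embedding / hidden sizes

E_word = rng.normal(size=(len(vocab), D))     # word embedding table
E_tag = rng.normal(size=(len(tags) + 1, D))   # prev-tag embeddings (+1 = "none")
W1 = rng.normal(size=(4 * D, H))              # 3-word window + 1 prev tag
W2 = rng.normal(size=(H, len(tags)))

def tag_sentence(words):
    padded = [0] + [vocab[w] for w in words] + [0]   # pad one word each side
    prev_tag, output = len(tags), []                 # start with the "none" tag
    for i in range(len(words)):
        x = np.concatenate([E_word[padded[i]],       # previous word
                            E_word[padded[i + 1]],   # current word
                            E_word[padded[i + 2]],   # next word
                            E_tag[prev_tag]])        # previous decision
        scores = np.tanh(x @ W1) @ W2
        prev_tag = int(np.argmax(scores))            # greedy: commit, move on
        output.append(tags[prev_tag])
    return output

print(tag_sentence(["the", "dog", "barks"]))  # arbitrary with random weights
```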

Andrei's Presentation on Sentence Embeddings for Paraphrase Detection
03 Apr 2017

In the lab meeting today, Andrei will be talking about the Skip-Thoughts, SDAE, and FastSent models for sentence vector embeddings as well as their applications to paraphrase detection on the MSRP and Quora datasets.
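
As background, here is a minimal sketch of how sentence embeddings are typically applied to paraphrase detection: embed each sentence, then threshold a similarity score. The additive composition below is only in the spirit of FastSent's bag-of-words model, and the vectors and threshold are placeholders:

```python
# Sketch of embedding-based paraphrase detection: embed each sentence
# by averaging word vectors and threshold the cosine similarity.
# The vectors here are random placeholders; real systems use trained
# embeddings (e.g. from Skip-Thoughts, SDAE, or FastSent).
import numpy as np

rng = np.random.default_rng(1)
word_vec = {}  # stand-in for a trained word embedding table

def embed(sentence):
    words = sentence.lower().split()
    for w in words:
        if w not in word_vec:
            word_vec[w] = rng.normal(size=50)
    return np.mean([word_vec[w] for w in words], axis=0)

def is_paraphrase(s1, s2, threshold=0.8):
    a, b = embed(s1), embed(s2)
    cosine = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return cosine >= threshold

print(is_paraphrase("the cat sat on the mat", "a cat sat on a mat"))
```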

Logan's Talk on Grammar Lexicalization
27 Mar 2017

In the next lab meeting, Logan will be talking about grammar lexicalization. Here is a brief description of his talk:

I’ll introduce the double Greibach normal form (2GNF) for CFGs, and discuss some applications of prefix lexicalized grammars. I’ll then demonstrate that 2GNF cannot be used to lexicalize certain synchronous grammars, as the lexicalization algorithm would destroy some of the alignments in the original grammar. Finally, I will briefly introduce a different grammar formalism which may be able to retain more alignments when lexicalized. Time permitting, I’ll close by sharing some of the properties of these grammars which Anoop and I are investigating.
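
For context, my own illustration (not from the talk): in Greibach normal form every production begins with a terminal, so every derivation is prefix lexicalized, and the double variant (2GNF) is lexicalized at both ends. A small example:

```latex
\begin{align*}
&\text{GNF: every production has the form } A \to a\,\alpha,
  \quad a \in \Sigma,\ \alpha \in N^{*}\\
&\text{Left-recursive grammar for } b\,a^{*}: \quad S \to S\,a \mid b\\
&\text{An equivalent GNF grammar:} \quad S \to b\,A \mid b,
  \qquad A \to a\,A \mid a\\
&\text{2GNF: productions have the form } A \to a\,\alpha\,b
  \text{ or } A \to a,\\
&\text{e.g. } S \to a\,S\,b \mid a\,b
  \text{ generates } \{a^{n}b^{n} : n \ge 1\}
\end{align*}
```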

Recent Publications