In this week’s lab meeting, Vivian will talk about using transition-based feed-forward neural networks for sequence tagging (part-of-speech tagging, chunking, and named entity recognition). While LSTM networks have produced impressive results on these tasks, simple feed-forward networks can achieve comparable accuracy. Balancing speed and accuracy is a central interest of this project.
In today’s lab meeting, Andrei will talk about the Skip-Thoughts, SDAE, and FastSent models for sentence vector embeddings, as well as their application to paraphrase detection on the MSRP and Quora datasets.
In the next lab meeting, Logan will be talking about grammar lexicalization. Here is a brief description of his talk:
I’ll introduce Greibach normal form (2GNF) for CFGs and discuss some applications of prefix lexicalized grammars. I’ll then demonstrate that 2GNF cannot be used to lexicalize certain synchronous grammars, as the lexicalization algorithm would destroy some of the alignments in the original grammar. Finally, I will briefly introduce a different grammar formalism which may be able to retain more alignments when lexicalized. Time permitting, I’ll close by sharing some of the properties of these grammars which Anoop and I are investigating.
Today in the lab meeting, Dan Fass will be leading the discussion. The title of his talk is “A Selected Review of N-gram Research and Two Proposals”. Here is a brief description of his talk:
N-grams are widely used in NLP applications such as statistical machine translation, speech recognition, optical character recognition, and spelling correction. Three research groups are notable for developing n-grams that combine lexical and syntactic information of various types. A representation for n-gram information is proposed that attempts to reconcile the differences between these n-grams, along with a classification that underpins both the proposed representation and others. Potential applications and extensions of the representation are briefly described.
Today in the lab meeting, Golnar will give a talk about named entity recognition with LSTM-X. Here is a brief description of her talk: “We will be discussing variations of LSTMs tailored for named entity recognition and plans for integrating them into GraphNER.”