Please use this identifier to cite or link to this item: http://hdl.handle.net/123456789/454
Title: Neural Network Methods for Natural Language Processing
Authors: Goldberg, Yoav
Keywords: natural language processing, machine learning, supervised learning, deep learning, neural networks, word embeddings, recurrent neural networks, sequence to sequence models
Issue Date: 2017
Publisher: Morgan & Claypool
Abstract: Neural networks are a family of powerful machine learning models. This book focuses on the application of neural network models to natural language data. The first half of the book (Parts I and II) covers the basics of supervised machine learning and feed-forward neural networks, the basics of working with machine learning over language data, and the use of vector-based rather than symbolic representations for words. It also covers the computation-graph abstraction, which allows one to easily define and train arbitrary neural networks, and is the basis behind the design of contemporary neural network software libraries. The second part of the book (Parts III and IV) introduces more specialized neural network architectures, including 1D convolutional neural networks, recurrent neural networks, conditioned-generation models, and attention-based models. These architectures and techniques are the driving force behind state-of-the-art algorithms for machine translation, syntactic parsing, and many other applications. Finally, we also discuss tree-shaped networks, structured prediction, and the prospects of multi-task learning.
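To make the computation-graph abstraction mentioned in the abstract concrete, here is a minimal sketch in plain Python (illustrative only, not code from the book): each node records its forward value, its parent nodes, and the local-gradient rules, so that gradients for an arbitrary expression can be obtained by traversing the graph in reverse. The Node class and its methods are assumptions made for this sketch, not the book's API.

    # Minimal computation-graph sketch (illustrative; not from the book).
    # Each Node stores a forward value plus enough information to
    # backpropagate gradients through the expression that built it.
    class Node:
        def __init__(self, value, parents=(), grad_fns=()):
            self.value = value        # forward value
            self.parents = parents    # nodes this node depends on
            self.grad_fns = grad_fns  # local-gradient functions, one per parent
            self.grad = 0.0           # accumulated d(output)/d(this node)

        def __add__(self, other):
            # d(a+b)/da = 1, d(a+b)/db = 1
            return Node(self.value + other.value, (self, other),
                        (lambda g: g, lambda g: g))

        def __mul__(self, other):
            # d(a*b)/da = b, d(a*b)/db = a
            return Node(self.value * other.value, (self, other),
                        (lambda g: g * other.value, lambda g: g * self.value))

        def backward(self, grad=1.0):
            # Accumulate the incoming gradient, then push it to the parents.
            self.grad += grad
            for parent, grad_fn in zip(self.parents, self.grad_fns):
                parent.backward(grad_fn(grad))

    # Build the graph for y = (a * b) + a, then backpropagate.
    a, b = Node(2.0), Node(3.0)
    y = a * b + a
    y.backward()
    print(y.value)  # 8.0
    print(a.grad)   # dy/da = b + 1 = 4.0
    print(b.grad)   # dy/db = a = 2.0

Contemporary neural network libraries are built around this same idea, with tensors in place of scalars and a much larger set of differentiable operations.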
URI: http://hdl.handle.net/123456789/454
ISBN: 9781627052955
Appears in Collections: E-Books

Files in This Item:
File: Neural Network Methods (1).pdf
Size: 2.06 MB
Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.