Please use this identifier to cite or link to this item: http://hdl.handle.net/123456789/454
Full metadata record
DC Field | Value | Language
dc.contributor.author | Goldberg, Yoav | -
dc.date.accessioned | 2026-02-10T06:12:27Z | -
dc.date.available | 2026-02-10T06:12:27Z | -
dc.date.issued | 2017 | -
dc.identifier.issn | 9781627052955 | -
dc.identifier.uri | http://hdl.handle.net/123456789/454 | -
dc.description.abstract | Neural networks are a family of powerful machine learning models. This book focuses on the application of neural network models to natural language data. The first half of the book (Parts I and II) covers the basics of supervised machine learning and feed-forward neural networks, the basics of working with machine learning over language data, and the use of vector-based rather than symbolic representations for words. It also covers the computation-graph abstraction, which allows one to easily define and train arbitrary neural networks, and which underlies the design of contemporary neural network software libraries. The second part of the book (Parts III and IV) introduces more specialized neural network architectures, including 1D convolutional neural networks, recurrent neural networks, conditioned-generation models, and attention-based models. These architectures and techniques are the driving force behind state-of-the-art algorithms for machine translation, syntactic parsing, and many other applications. Finally, we also discuss tree-shaped networks, structured prediction, and the prospects of multi-task learning. | en_US
dc.language.iso | en | en_US
dc.publisher | Morgan & Claypool | en_US
dc.subject | natural language processing, machine learning, supervised learning, deep learning, neural networks, word embeddings, recurrent neural networks, sequence to sequence models | en_US
dc.title | Neural Network Methods for Natural Language Processing | en_US
dc.type | Book | en_US
Appears in Collections: E-Books

Files in This Item:
File | Description | Size | Format
Neural Network Methods (1).pdf | | 2.06 MB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.