Natural language processing with TensorFlow : teach language to machines using Python's deep learning library

TensorFlow is the leading framework for deep learning algorithms critical to artificial intelligence, and natural language processing (NLP) makes much of the data used by deep learning applications accessible to them. This book brings the two together and teaches deep learning developers how to work with today's vast amount of unstructured data.


Bibliographic Details
Main Author: Ganegedara, Thushan (Author)
Format: Electronic eBook
Language: English
Published: Birmingham, UK : Packt, [2018]
Subjects:
Online Access: CONNECT

MARC

LEADER 00000cam a2200000Ki 4500
001 mig00005560272
005 20240626140934.5
006 m o d
007 cr cnu---unuuu
008 180609s2018 enk ob 001 0 eng d
015 |a GBB8A8385  |2 bnb 
016 7 |a 018897090  |2 Uk 
019 |a 1175633515 
020 |a 9781788477758  |q (electronic bk.) 
020 |a 1788477758  |q (electronic bk.) 
020 |z 9781788478311 
035 |a (OCoLC)1039700926  |z (OCoLC)1175633515 
035 |a 1wrldshron1039700926 
037 |a 6C52D1D4-6E23-433E-B2ED-09E5671203F8  |b OverDrive, Inc.  |n http://www.overdrive.com 
040 |a EBLCP  |b eng  |e rda  |e pn  |c EBLCP  |d N$T  |d MERUC  |d OCLCF  |d IDB  |d NLE  |d TEFOD  |d OCLCQ  |d UKMGB  |d LVT  |d UKAHL  |d OCLCQ  |d UX1  |d K6U 
049 |a TXMM 
050 4 |a Q325.5 
082 0 4 |a 006.31  |2 23 
100 1 |a Ganegedara, Thushan,  |e author. 
245 1 0 |a Natural language processing with TensorFlow :  |b teach language to machines using Python's deep learning library /  |c Thushan Ganegedara. 
264 1 |a Birmingham, UK :  |b Packt,  |c [2018] 
264 4 |c ©2018 
300 |a 1 online resource (472 pages) 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
504 |a Includes bibliographical references and index. 
588 0 |a Print version record. 
505 0 |a Cover; Copyright; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Introduction to Natural Language Processing; What is Natural Language Processing?; Tasks of Natural Language Processing; The traditional approach to Natural Language Processing; Understanding the traditional approach; Example -- generating football game summaries; Drawbacks of the traditional approach; The deep learning approach to Natural Language Processing; History of deep learning; The current state of deep learning and NLP; Understanding a simple deep model -- a Fully Connected Neural Network. 
505 8 |a The roadmap -- beyond this chapter; Introduction to the technical tools; Description of the tools; Installing Python and scikit-learn; Installing Jupyter Notebook; Installing TensorFlow; Summary; Chapter 2: Understanding TensorFlow; What is TensorFlow?; Getting started with TensorFlow; TensorFlow client in detail; TensorFlow architecture -- what happens when you execute the client?; Cafe Le TensorFlow -- understanding TensorFlow with an analogy; Inputs, variables, outputs, and operations; Defining inputs in TensorFlow; Feeding data with Python code; Preloading and storing data as tensors. 
505 8 |a Building an input pipeline; Defining variables in TensorFlow; Defining TensorFlow outputs; Defining TensorFlow operations; Comparison operations; Mathematical operations; Scatter and gather operations; Neural network-related operations; Reusing variables with scoping; Implementing our first neural network; Preparing the data; Defining the TensorFlow graph; Running the neural network; Summary; Chapter 3: Word2vec -- Learning Word Embeddings; What is a word representation or meaning?; Classical approaches to learning word representation. 
505 8 |a WordNet -- using an external lexical knowledge base for learning word representations; Tour of WordNet; Problems with WordNet; One-hot encoded representation; The TF-IDF method; Co-occurrence matrix; Word2vec -- a neural network-based approach to learning word representation; Exercise: is queen = king -- he + she?; Designing a loss function for learning word embeddings; The skip-gram algorithm; From raw text to structured data; Learning the word embeddings with a neural network; Formulating a practical loss function; Efficiently approximating the loss function. 
505 8 |a Implementing skip-gram with TensorFlow; The Continuous Bag-of-Words algorithm; Implementing CBOW in TensorFlow; Summary; Chapter 4: Advanced Word2vec; The original skip-gram algorithm; Implementing the original skip-gram algorithm; Comparing the original skip-gram with the improved skip-gram; Comparing skip-gram with CBOW; Performance comparison; Which is the winner, skip-gram or CBOW?; Extensions to the word embeddings algorithms; Using the unigram distribution for negative sampling; Implementing unigram-based negative sampling; Subsampling -- probabilistically ignoring the common words. 
500 |a Implementing subsampling. 
520 |a TensorFlow is the leading framework for deep learning algorithms critical to artificial intelligence, and natural language processing (NLP) makes much of the data used by deep learning applications accessible to them. This book brings the two together and teaches deep learning developers how to work with today's vast amount of unstructured data. 
500 |a EBSCO eBook Academic Comprehensive Collection North America 
650 0 |a Machine learning. 
650 0 |a Artificial intelligence. 
650 0 |a Python (Computer program language) 
730 0 |a WORLDSHARE SUB RECORDS 
776 0 8 |i Print version:  |a Ganegedara, Thushan.  |t Natural Language Processing with TensorFlow : Teach language to machines using Python's deep learning library.  |d Birmingham : Packt Publishing, ©2018 
856 4 0 |u https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1823678&authtype=ip,sso&custid=s4672406  |z CONNECT  |3 EBSCO  |t 0 
907 |a 4856450  |b 05-27-21  |c 09-30-20 
994 |a 92  |b TXM 
998 |a wi  |b 05-27-21  |c m  |d z   |e -  |f eng  |g enk  |h 0  |i 2 
999 f f |i 44318c1d-12fd-4c63-88a1-5e3469f01a38  |s 4d559ef9-31fc-47a5-8952-f3927326bdb7  |t 0 
952 f f |a Middle Tennessee State University  |b Main  |c James E. Walker Library  |d Electronic Resources  |t 0  |e Q325.5   |h Library of Congress classification