Natural Language Processing with Transformers (Revised Edition)

Authors:

Lewis Tunstall, Leandro von Werra, Thomas Wolf

Type: Hardcover / Paperback / Loose Leaf
Condition: Used/New

In Stock: 2 Left

Shipment time: Expected shipping within 2-3 days

List Price: $14.75

Book details

ISBN-13: 978-1098136796 (ASIN: B09S3X8T2C)

Book publisher: O'Reilly Media


Description: Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book, now revised in full color, shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.

Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve:

- Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
- Learn how transformers can be used for cross-lingual transfer learning
- Apply transformers in real-world scenarios where labeled data is scarce
- Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
- Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments
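
To give a flavor of the hands-on approach described above, here is a minimal sketch (not an excerpt from the book) of the Hugging Face Transformers pipeline API applied to text classification, one of the core tasks listed. It assumes the transformers library and a PyTorch or TensorFlow backend are installed; no model name is specified, so the pipeline falls back to its default sentiment-analysis checkpoint.

```python
from transformers import pipeline

# Load a text-classification pipeline. Without an explicit model argument,
# the library downloads and uses its default sentiment-analysis checkpoint.
classifier = pipeline("text-classification")

# Classify a single sentence; the pipeline returns a list of label/score dicts.
result = classifier("Transformers have quickly become the dominant NLP architecture.")
print(result)
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

The same pipeline interface covers other tasks mentioned above (for example "ner" for named entity recognition or "question-answering"), which is the style of workflow the book builds on before moving to training and deployment.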