Natural Language Processing: NLP With Transformers in Python

Learn next-generation NLP with transformers for sentiment analysis, Q&A, similarity search, NER, and more

What you'll learn:

  • Industry-standard NLP using transformer models
  • Build full-stack question-answering transformer models
  • Perform sentiment analysis with transformer models in PyTorch and TensorFlow (see the sketch after this list)
  • Advanced search technologies like Elasticsearch and Facebook AI Similarity Search (FAISS)
  • Create fine-tuned transformer models for specialized use-cases
  • Measure performance of language models using advanced metrics like ROUGE
  • Retrieval and vector-building techniques such as BM25 and dense passage retrieval (DPR)
  • An overview of recent developments in NLP
  • Understand attention and other key components of transformers
  • Learn about key transformer models such as BERT
  • Preprocess text data for NLP
  • Named entity recognition (NER) using spaCy and transformers
  • Fine-tune language classification models
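
As a taste of what this looks like in practice, here is a minimal sentiment-analysis sketch in Python using the Hugging Face transformers pipeline API. The model checkpoint named below is an illustrative assumption (the library's stock English sentiment model), not necessarily the one used in the course.

    # Minimal sentiment analysis with a pretrained transformer model.
    # Requires: pip install transformers torch
    from transformers import pipeline

    # Illustrative checkpoint: the library's stock English sentiment model.
    # Any sequence-classification checkpoint fine-tuned for sentiment works.
    classifier = pipeline(
        "sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",
    )

    print(classifier("Transformers have completely changed NLP."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]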

Requirements:

  • Knowledge of Python
  • Experience in data science a plus
  • Experience in NLP a plus

Description:

Transformer models are the de-facto standard in modern NLP. They have proven themselves to be the most expressive and powerful models for language by a large margin, consistently topping the major language benchmarks.

In this course, we cover everything you need to get started building state-of-the-art NLP applications with transformer models like Google AI's BERT or Facebook AI's DPR.

We cover several key NLP frameworks including:

  • HuggingFace's Transformers
  • TensorFlow 2
  • PyTorch
  • spaCy
  • NLTK
  • Flair

We also learn how to apply transformers to some of the most popular NLP use-cases:

  • Language classification/sentiment analysis
  • Named entity recognition (NER)
  • Question answering (Q&A), sketched briefly after this list
  • Similarity/comparative learning
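
The question-answering use-case can be previewed with the short extractive Q&A sketch below, again using the Hugging Face transformers pipeline. The SQuAD-trained checkpoint is an illustrative assumption; the course's open-domain project puts a retriever (such as DPR or Elasticsearch) in front of a reader like this one.

    # Extractive question answering: the model selects the answer span
    # from the supplied context. Requires: pip install transformers torch
    from transformers import pipeline

    # Illustrative SQuAD-trained reader; other checkpoints can be swapped in.
    qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

    result = qa(
        question="Which models are the de-facto standard in modern NLP?",
        context=(
            "Transformer models such as BERT and RoBERTa are the de-facto "
            "standard in modern natural language processing."
        ),
    )
    print(result["answer"], round(result["score"], 3))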

Throughout each of these use-cases we work through a variety of examples that show what transformers are, how they work, and why they are so important. Alongside these sections we also work through two full-size NLP projects: one on sentiment analysis of financial Reddit data, and another covering a fully fledged open-domain question-answering application.

All of this is supported by several other sections that help us learn how to better design, implement, and measure the performance of our models, including:

  • History of NLP and where transformers come from
  • Common preprocessing techniques for NLP
  • The theory behind transformers
  • How to fine-tune transformers (a brief sketch follows this list)
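
To give a feel for the fine-tuning workflow, here is a hedged sketch built on the Hugging Face datasets and Trainer APIs. The IMDB dataset, BERT checkpoint, subset sizes, and hyperparameters are illustrative assumptions for brevity, not the course's exact setup.

    # Fine-tuning BERT for binary sentiment classification.
    # Requires: pip install transformers datasets torch
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments)

    dataset = load_dataset("imdb")  # assumed example dataset
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

    def tokenize(batch):
        # Tokenize and pad/truncate each review to the model's max length.
        return tokenizer(batch["text"], truncation=True, padding="max_length")

    tokenized = dataset.map(tokenize, batched=True)

    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2)

    args = TrainingArguments(
        output_dir="bert-imdb",          # where checkpoints are written
        num_train_epochs=1,              # illustrative hyperparameters
        per_device_train_batch_size=8,
    )

    trainer = Trainer(
        model=model,
        args=args,
        # Small subsets keep the sketch quick to run; use the full splits
        # for real fine-tuning.
        train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
        eval_dataset=tokenized["test"].shuffle(seed=42).select(range(500)),
    )
    trainer.train()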

We cover all this and more. I look forward to seeing you in the course!

Course Details:

  • 11.5 hours on-demand video
  • 1 article
  • Full lifetime access
  • Access on mobile and TV
  • Assignments
  • Certificate of completion

Course link: https://www.udemy.com/course/nlp-with-transformers/