Archive | Hugging Face Transformers

The transformer is a machine learning model architecture that uses the attention mechanism to process data. Many models are based on this architecture, such as GPT, BERT, T5, and Llama, and many of them closely resemble one another. While you can build your own models in Python using PyTorch or TensorFlow, Hugging Face released […]

Understanding the DistilBart Model and ROUGE Metric
DistilBart is a typical encoder-decoder model for NLP tasks. In this tutorial, you will learn how such a model is constructed and how you can check its architecture so that you can compare it with other models. You will also learn how to use the pretrained DistilBart model to generate summaries and how to control […]
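
As a minimal sketch of that architecture check, assuming the public sshleifer/distilbart-cnn-12-6 checkpoint on the Hugging Face Hub (the excerpt above does not name a specific checkpoint):

```python
# Load a pretrained DistilBart checkpoint and inspect its architecture.
# "sshleifer/distilbart-cnn-12-6" has 12 encoder and 6 decoder layers.
from transformers import AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained("sshleifer/distilbart-cnn-12-6")

# The config records the high-level shape of the network
print(model.config.encoder_layers)   # 12
print(model.config.decoder_layers)   # 6
print(model.config.d_model)          # hidden size, 1024 for this checkpoint

# Printing the model itself dumps the full module tree (embeddings,
# encoder/decoder stacks, attention blocks), handy for comparing models
print(model)
```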

Text Summarization with DistilBart Model
Text summarization is a sophisticated form of text generation, requiring a deep understanding of content and context. With encoder-decoder transformer models like DistilBart, you can create summaries that capture the essence of longer text while maintaining coherence and relevance. In this tutorial, you’ll discover how to implement text summarization using DistilBart. You’ll learn through […]
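
A minimal sketch of that workflow, again assuming the public sshleifer/distilbart-cnn-12-6 checkpoint (fine-tuned for CNN/DailyMail-style news summaries); the input text here is illustrative:

```python
# Summarize text with DistilBart via the transformers summarization pipeline
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

article = (
    "The transformer architecture has reshaped natural language processing. "
    "Encoder-decoder variants such as BART can be distilled into smaller, "
    "faster models like DistilBart while retaining most of the quality."
)

# max_length / min_length bound the summary length in tokens
result = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```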

Text Generation using Contrastive Search with GPT-2 Model
Text generation is one of the most fascinating applications of deep learning. With the advent of large language models like GPT-2, we can now generate human-like text that’s coherent, contextually relevant, and surprisingly creative. In this tutorial, you’ll discover how to implement text generation using GPT-2. You’ll learn through hands-on examples that you can run […]
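
As a hedged sketch of the technique: in recent versions of the transformers library, passing penalty_alpha together with top_k to generate() switches on contrastive search. The prompt and parameter values below are illustrative.

```python
# Contrastive search with GPT-2 via model.generate()
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Deep learning is", return_tensors="pt")

# penalty_alpha trades model confidence against degeneration (repetition);
# top_k is the number of candidate tokens considered at each step
outputs = model.generate(
    **inputs,
    penalty_alpha=0.6,
    top_k=4,
    max_new_tokens=50,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```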

Auto-Completion Style Text Generation with GPT-2 Model
Generating gibberish text is a simple programming exercise for beginners, but completing a sentence meaningfully requires far more work. The landscape of auto-completion technology has transformed dramatically with the introduction of neural approaches. With Hugging Face’s transformers library, implementing text completion takes only a few lines of code. In this comprehensive tutorial, you […]
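
A minimal sketch of that few-lines-of-code completion, using the text-generation pipeline with GPT-2 (the prompt and sampling settings are illustrative):

```python
# Auto-completion style generation with GPT-2
from transformers import pipeline, set_seed

set_seed(42)  # for reproducible sampling
generator = pipeline("text-generation", model="gpt2")

completions = generator(
    "Machine learning is a field that",
    max_new_tokens=30,
    num_return_sequences=3,  # several alternatives, as an auto-complete UI might offer
    do_sample=True,
)
for c in completions:
    print(c["generated_text"])
```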

How to Do Named Entity Recognition (NER) with a BERT Model
Named Entity Recognition (NER) is one of the fundamental building blocks of natural language understanding. When humans read text, we naturally identify and categorize named entities based on context and world knowledge. For instance, in the sentence “Microsoft’s CEO Satya Nadella spoke at a conference in Seattle,” we effortlessly recognize the organizational, personal, and geographical […]
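
A minimal sketch of that idea in code, assuming the public dslim/bert-base-NER checkpoint (a BERT model fine-tuned for NER; the excerpt does not name a specific checkpoint):

```python
# NER with a fine-tuned BERT model via the token-classification pipeline
from transformers import pipeline

# aggregation_strategy="simple" merges word pieces back into whole entities
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

sentence = "Microsoft's CEO Satya Nadella spoke at a conference in Seattle."
for entity in ner(sentence):
    print(entity["word"], entity["entity_group"], round(entity["score"], 3))
# Expected groups: ORG (Microsoft), PER (Satya Nadella), LOC (Seattle)
```
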
A Complete Introduction to Using BERT Models
BERT was one of the first applications of the transformer architecture in natural language processing (NLP). Its architecture is simple, but it is sufficient for the tasks it was designed for. In the following, we’ll explore BERT models from the ground up — understanding what they are, how they work, and most importantly, how to […]
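
As a starting-point sketch, here is a plain pretrained BERT doing the task it was trained on, masked-word prediction, using the standard bert-base-uncased checkpoint:

```python
# Masked-word prediction with pretrained BERT via the fill-mask pipeline
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the token hidden behind [MASK] from bidirectional context
for pred in unmasker("The capital of France is [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```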