Getting Started with Hugging Face Transformers in Python
Author: Venkata Sudhakar
Hugging Face Transformers is an open-source Python library that provides thousands of pre-trained models for natural language processing, computer vision, and audio tasks. It is built on top of PyTorch and TensorFlow and offers a simple, high-level API, the pipeline, that lets developers run inference in just a few lines of code. At ShopMax India, the data science team uses Hugging Face models for tasks such as product review sentiment analysis and customer query classification.

The Transformers library can be installed with pip. It requires Python 3.8 or higher and either PyTorch or TensorFlow as the backend. The example below shows how to install both the library and PyTorch together.
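A minimal installation, assuming a pip-based environment (both package names are the official ones on PyPI):

```shell
# Install Transformers together with the PyTorch backend.
# Use "tensorflow" instead of "torch" if you prefer the TensorFlow backend.
pip install transformers torch
```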
Once installed, you can use the pipeline function to load a model and run inference. The example below classifies the sentiment of product reviews.
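A minimal sketch of such a pipeline call. The full review texts here are assumptions, since the article shows only truncated snippets:

```python
from transformers import pipeline

# Load the default sentiment-analysis pipeline; model weights are
# downloaded and cached locally on first use.
classifier = pipeline("sentiment-analysis")

# Illustrative product reviews (assumed full texts; the article
# shows only truncated versions).
reviews = [
    "This laptop exceeded all my expectations, great performance.",
    "Battery drains too fast, very disappointing purchase.",
    "Decent value for money, works as expected.",
]

# The pipeline accepts a list of strings and returns one dict per
# input, each with a "label" and a "score" field.
for review, result in zip(reviews, classifier(reviews)):
    snippet = review if len(review) <= 40 else review[:40] + "..."
    print(f"Review: {snippet}")
    print(f"Sentiment: {result['label']}, Confidence: {result['score']:.2f}")
```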
This produces output similar to the following:
Review: This laptop exceeded all my expectations...
Sentiment: POSITIVE, Confidence: 1.00
Review: Battery drains too fast, very disappoint...
Sentiment: NEGATIVE, Confidence: 1.00
Review: Decent value for money, works as expected...
Sentiment: POSITIVE, Confidence: 0.96
The pipeline function automatically downloads the model weights on first use and caches them locally. By default, the sentiment-analysis pipeline uses the distilbert-base-uncased-finetuned-sst-2-english model, a lightweight model fine-tuned for binary sentiment classification. The score field represents the model's confidence in the predicted label. In production, ShopMax India can use this score to automatically flag low-confidence predictions for human review.
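The flagging step can be sketched as a simple threshold check on the score field. The 0.75 threshold below is an assumed value for illustration, not one from the article:

```python
# Assumed threshold; in practice this would be tuned on labeled data.
CONFIDENCE_THRESHOLD = 0.75

def needs_human_review(prediction: dict) -> bool:
    """Flag a pipeline prediction whose confidence falls below the threshold."""
    return prediction["score"] < CONFIDENCE_THRESHOLD

# Example predictions in the pipeline's output format.
print(needs_human_review({"label": "POSITIVE", "score": 0.60}))  # True
print(needs_human_review({"label": "NEGATIVE", "score": 0.99}))  # False
```

Low-confidence items could then be routed to a review queue instead of being acted on automatically.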