Tiny BERT Domain Advertising Classifier

Notebook: https://huggingface.co/ansi-code/bert-domain-advertising-classifier/blob/main/bert_domain_advertising_classifier.ipynb

Overview

AdTargetingBERTClassifier is a small-scale BERT-based classifier for ad-targeting classification. Given a domain name, the model predicts one of the multi-class ad-targeting labels provided in the DAC693K dataset.

Model Architecture

The classifier is built on the BERT (Bidirectional Encoder Representations from Transformers) architecture. It takes domain text as input and outputs logits for each class, enabling multi-class classification for ad targeting.
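
The number of classes and their names travel with the checkpoint's configuration. A quick way to inspect them, assuming the repository ships a standard Transformers config.json with num_labels and an id2label mapping, as fine-tuned checkpoints normally do:

from transformers import AutoConfig

# Inspect the classification head's label space (assumes a standard
# config.json with num_labels and id2label)
config = AutoConfig.from_pretrained("ansi-code/bert-domain-advertising-classifier")
print(config.num_labels)  # number of ad-targeting classes
print(config.id2label)    # class index -> label name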

Model Training

The model is trained on the DAC693K dataset in a supervised fashion, minimizing the categorical cross-entropy loss while fine-tuning on the ad-targeting classes associated with each domain.
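
A minimal fine-tuning sketch in plain PyTorch follows. The class count and the (domain, label) pairs are placeholders rather than the actual DAC693K preprocessing, which lives in the linked notebook; note that BertForSequenceClassification computes the cross-entropy loss internally when labels are supplied.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

NUM_LABELS = 10  # placeholder; use the real DAC693K class count
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=NUM_LABELS
)

# Placeholder (domain, class id) pairs standing in for DAC693K
train_pairs = [("google.com", 3), ("example.org", 7)]
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for epoch in range(3):
    for domain, label in train_pairs:
        inputs = tokenizer(domain, return_tensors="pt")
        # Passing labels makes the model compute the cross-entropy loss
        loss = model(**inputs, labels=torch.tensor([label])).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()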

Usage

Loading the Model

To use the trained classifier in your Python environment, you can load it using the following code:

from transformers import BertTokenizer, BertForSequenceClassification
import torch

# Load the fine-tuned classifier and the base BERT tokenizer
model = BertForSequenceClassification.from_pretrained("ansi-code/bert-domain-advertising-classifier")
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model.eval()

# Example inference on a single domain name
text = "google.com"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
logits = outputs.logits

Prediction

To make predictions with the loaded model, convert the logits to a probability distribution with softmax and take the class with the highest probability:

# Softmax turns the logits into a probability distribution over the classes
probabilities = torch.softmax(logits, dim=-1)
predicted_class = torch.argmax(probabilities, dim=-1).item()
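
The integer index can then be mapped back to a label name via the id2label mapping shown under Model Architecture:

# Map the predicted index to its human-readable label
label = model.config.id2label[predicted_class]
print(label)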

Model Evaluation

The model's performance can be assessed using standard evaluation metrics such as accuracy, precision, recall, and F1-score on a separate validation set or through cross-validation.
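
As a sketch, these metrics can be computed with scikit-learn, where y_true and y_pred stand in for the validation labels and the model's argmax predictions (both are placeholder values here):

from sklearn.metrics import classification_report

# Placeholder class ids; in practice, collect these by running the
# model over a held-out validation split
y_true = [3, 7, 3, 1]
y_pred = [3, 7, 1, 1]

# Reports per-class precision, recall, F1-score, and overall accuracy
print(classification_report(y_true, y_pred))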

License

This model is released under the Apache 2.0 License.

Citation

If you use this model in your work, please cite it using the following BibTeX entry:

@misc{silvi_2023_bert-domain-advertising-classifier,
  title  = {bert-domain-advertising-classifier},
  author = {Andrea Silvi},
  year   = {2023},
  url    = {https://huggingface.co/ansi-code/bert-domain-advertising-classifier},
}

Acknowledgements

We would like to thank the developers of the Hugging Face Transformers library for providing the BERT model implementation.
