A1-DeBERTaV3-Small

Model Description

A1-DeBERTaV3-Small is a hybrid model that pairs the DeBERTa v3 small encoder with Waveform A1, a transformer-based classification head. The DeBERTa v3 component receives tokenized text and outputs a last hidden state for each token. These token embeddings are then passed to Waveform A1, which aggregates the token representations, applies multi-head self-attention and a position-wise feed-forward network, and produces joint predictions for both sentiment and market outlook.
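
The card does not ship a reference implementation of the Waveform A1 head, so the sketch below only illustrates the described shape in PyTorch. The layer sizes, the masked mean pooling, and the ordering of pooling relative to the attention block are assumptions; only the class counts (3 sentiment labels, 5 market labels) are taken from the decode maps in the usage example below.

import torch
import torch.nn as nn

class WaveformA1HeadSketch(nn.Module):
    """Illustrative joint sentiment/market head over DeBERTa token embeddings.
    Hyperparameters and pooling strategy are assumptions, not the released config."""

    def __init__(self, hidden_size=768, num_heads=8, ffn_dim=2048,
                 num_sentiment=3, num_market=5):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(hidden_size, ffn_dim),
            nn.GELU(),
            nn.Linear(ffn_dim, hidden_size),
        )
        self.norm1 = nn.LayerNorm(hidden_size)
        self.norm2 = nn.LayerNorm(hidden_size)
        self.sentiment_head = nn.Linear(hidden_size, num_sentiment)
        self.market_head = nn.Linear(hidden_size, num_market)

    def forward(self, hidden_states, attention_mask):
        # Self-attention over the encoder's token embeddings; padded positions are masked out
        pad_mask = attention_mask == 0
        attn_out, _ = self.attn(hidden_states, hidden_states, hidden_states,
                                key_padding_mask=pad_mask)
        x = self.norm1(hidden_states + attn_out)
        x = self.norm2(x + self.ffn(x))
        # Aggregate token representations (masked mean pooling assumed here)
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (x * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-6)
        # Joint predictions: one logit vector per task
        return self.sentiment_head(pooled), self.market_head(pooled)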

Intended Use Cases

  • Crypto Discussion Analysis: Automatic categorization of large volumes of messages into sentiment and market outlook.
  • Real-Time Monitoring: Scalable pipeline for near-real-time classification of crypto-related chatter.
  • Research & Development: A testbed for exploring semi-supervised or domain-specific language modeling strategies.

Example Usage

from transformers import AutoTokenizer
import onnxruntime
import numpy as np

def decode_sentiment(idx: int) -> str:
    sentiment_map = {0: 'positive', 1: 'neutral', 2: 'negative'}
    return sentiment_map[idx]

def decode_market(idx: int) -> str:
    market_map = {
        0: 'strong bullish',
        1: 'bullish',
        2: 'neutral',
        3: 'bearish',
        4: 'strong bearish'
    }
    return market_map[idx]

def softmax(x, axis=1):
    exp_x = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return exp_x / np.sum(exp_x, axis=axis, keepdims=True)

# Load the tokenizer that matches the DeBERTa v3 small encoder
tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-small")

text = "input-text-goes-here"

# Tokenize to a fixed-length sequence of 512 tokens
inputs = tokenizer(
    text,
    return_tensors="pt",
    padding="max_length",
    truncation=True,
    max_length=512
)
input_ids = inputs["input_ids"]
attention_mask = inputs["attention_mask"]

# ONNX Runtime consumes numpy arrays
ort_inputs = {
    "input_ids": input_ids.cpu().numpy(),
    "attention_mask": attention_mask.cpu().numpy()
}

# Load the exported ONNX graph
session = onnxruntime.InferenceSession("a1-debertav3.onnx")

sentiment_logits, market_logits = session.run(None, ort_inputs)

sentiment_probs = softmax(sentiment_logits, axis=1)
market_probs = softmax(market_logits, axis=1)

sentiment_pred = np.argmax(sentiment_probs, axis=1)
market_pred = np.argmax(market_probs, axis=1)

decoded_sentiment = decode_sentiment(sentiment_pred.item())
decoded_market = decode_market(market_pred.item())
print(f"Sentiment: {decoded_sentiment}")
print(f"Market: {decoded_market}")

Community

This model is actively maintained and open to community contributions via pull requests or collaboration inquiries.
