Emotions Classifier - Deep MLP

A text emotion classification model based on a multi-layer perceptron (MLP) with regularization.

Model Details

  • Architecture: Deep MLP (5 linear layers: 4 hidden + output)
  • Input: TF-IDF vectors (5000 features; see the vectorizer sketch after this list)
  • Output: 6 emotion classes (sadness, joy, love, anger, fear, surprise)
  • Accuracy: 88.69% on the test set
  • Overfitting Gap: 6.02%
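
A minimal sketch of how the 5000-feature TF-IDF input could be produced with scikit-learn. The training corpus shown here is a placeholder; loading the actual dataset is not part of this card.

from sklearn.feature_extraction.text import TfidfVectorizer

# Placeholder corpus standing in for the Kaggle training texts.
train_texts = ["i feel so sad today", "what a lovely surprise", "this makes me angry"]

# Cap the vocabulary at 5000 TF-IDF features, matching the model's input dimension.
vectorizer = TfidfVectorizer(max_features=5000)
X_train = vectorizer.fit_transform(train_texts).toarray()  # shape: (n_samples, up to 5000)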

Architecture:

Input (5000) 
  → Linear(1024) + BatchNorm + ReLU + Dropout(0.5)
  → Linear(512) + BatchNorm + ReLU + Dropout(0.4)
  → Linear(256) + BatchNorm + ReLU + Dropout(0.3)
  → Linear(128) + BatchNorm + ReLU + Dropout(0.2)
  → Linear(6)
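
A minimal PyTorch sketch of a module consistent with this diagram. The class and attribute names are illustrative; the released checkpoint may use different names, in which case the state_dict keys would have to match its actual definition.

import torch.nn as nn

class DeepMLP(nn.Module):
    def __init__(self, input_dim=5000, num_classes=6):
        super().__init__()
        # Progressively smaller layers with BatchNorm, ReLU and decreasing dropout.
        self.net = nn.Sequential(
            nn.Linear(input_dim, 1024), nn.BatchNorm1d(1024), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(1024, 512), nn.BatchNorm1d(512), nn.ReLU(), nn.Dropout(0.4),
            nn.Linear(512, 256), nn.BatchNorm1d(256), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(256, 128), nn.BatchNorm1d(128), nn.ReLU(), nn.Dropout(0.2),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        return self.net(x)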

Regularization techniques:

  • Progressive dropout: 0.5 → 0.4 → 0.3 → 0.2
  • Batch normalization after every hidden layer
  • L2 Regularization (weight decay = 1e-4)

Usage

import torch
import pickle
from sklearn.feature_extraction.text import TfidfVectorizer

# Load the trained model (requires the DeepMLP class definition; see the sketch above)
model = DeepMLP()
model.load_state_dict(torch.load("model.pth", map_location="cpu"))
model.eval()

# Load the fitted TF-IDF vectorizer and the index-to-emotion mapping
with open("vectorizer.pkl", "rb") as f:
    vectorizer = pickle.load(f)
with open("emotion_map.pkl", "rb") as f:
    emotion_map = pickle.load(f)

# Classify a single text
text = "I am so happy today!"
X = vectorizer.transform([text]).toarray()
X_tensor = torch.FloatTensor(X)

with torch.no_grad():
    outputs = model(X_tensor)
    _, predicted = torch.max(outputs, 1)
    emotion = emotion_map[predicted.item()]

print(f"Emotion: {emotion}")

Training Details

  • Dataset: Emotions Dataset (Kaggle) - 150,000 examples
  • Train/Val/Test split: 70% / 15% / 15%
  • Optimizer: Adam (lr=0.001, weight_decay=1e-4)
  • Batch size: 256
  • Epochs: 10
  • Loss function: CrossEntropyLoss
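
A minimal training-loop sketch matching these settings. Dataset loading and the train/val/test split are omitted; X_train and y_train are placeholders for the TF-IDF features and integer labels.

import torch
from torch import nn, optim
from torch.utils.data import DataLoader, TensorDataset

train_ds = TensorDataset(torch.FloatTensor(X_train), torch.LongTensor(y_train))
train_loader = DataLoader(train_ds, batch_size=256, shuffle=True)

model = DeepMLP()
criterion = nn.CrossEntropyLoss()
# weight_decay applies the L2 penalty listed under regularization techniques.
optimizer = optim.Adam(model.parameters(), lr=0.001, weight_decay=1e-4)

for epoch in range(10):
    model.train()
    for xb, yb in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(xb), yb)
        loss.backward()
        optimizer.step()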

Performance

Emotion     Precision  Recall  F1-Score
sadness     0.95       0.90    0.92
joy         0.89       0.93    0.91
love        0.81       0.73    0.77
anger       0.88       0.91    0.89
fear        0.83       0.86    0.84
surprise    0.72       0.77    0.74
Macro avg   0.85       0.85    0.85
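
Per-class metrics of this kind can be reproduced on the test split with scikit-learn. A sketch, assuming X_test and y_test (TF-IDF features and integer labels) are available:

from sklearn.metrics import classification_report

model.eval()
with torch.no_grad():
    preds = model(torch.FloatTensor(X_test)).argmax(dim=1).numpy()

# Label names ordered by their integer index in emotion_map.
target_names = [emotion_map[i] for i in range(6)]
print(classification_report(y_test, preds, target_names=target_names, digits=2))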

Citation

@misc{emotions-classifier-mlp,
  author = {Hubert Brzozowski},
  title = {Emotions Classifier - Deep MLP},
  year = {2026},
  publisher = {Hugging Face}
}

License

MIT License
