Documentation (English)

BERT Base

Bidirectional transformer for text classification tasks

BERT Base Uncased is a bidirectional transformer pre-trained on large text corpora. At inference it takes a text string and returns predicted class labels and their probabilities. Requires a fine-tuned checkpoint.

When to use:

  • Sentiment analysis (positive / negative / neutral)
  • Topic or intent classification
  • Spam detection or content moderation

Input: Text string to classify + optional fine-tuned checkpoint
Output: Predicted class labels (CSV) and class probabilities per prediction
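The CSV label output can be sketched as follows. This is a minimal illustration of the documented output shape only: the column names, the sentiment labels, and the example texts are assumptions, not part of the model's contract.

```python
import csv
import io

# Hypothetical (text, label, probability) predictions -- illustrative only.
predictions = [
    ("Great product, works perfectly!", "positive", 0.97),
    ("Terrible support experience.", "negative", 0.91),
]

# Write one row per prediction, matching the documented "labels + probabilities" output.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["text", "label", "probability"])
for text, label, prob in predictions:
    writer.writerow([text, label, f"{prob:.2f}"])

csv_output = buf.getvalue()
print(csv_output)
```

In practice the serving layer would fill `predictions` from the model's output rather than a literal list.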

Model Settings

Max Seq Length (default: 512)
Maximum number of tokens the model processes. Texts longer than this are truncated.

  • 128: Fast, suitable for short sentences
  • 256: Good for paragraphs
  • 512: Full document understanding (default, slower)
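The truncation behavior described above can be sketched like this. A plain whitespace split stands in for BERT's WordPiece tokenizer (which this sketch does not load), and the two reserved positions for the `[CLS]` and `[SEP]` special tokens follow standard BERT conventions; the function name is illustrative.

```python
# Sketch of Max Seq Length truncation, assuming the standard BERT convention
# of reserving two positions for the [CLS] and [SEP] special tokens.
def truncate_tokens(text: str, max_seq_length: int) -> list[str]:
    tokens = text.split()  # stand-in tokenizer; real BERT uses WordPiece subwords
    budget = max_seq_length - 2  # leave room for [CLS] and [SEP]
    return tokens[:budget]

long_text = " ".join(f"word{i}" for i in range(300))
print(len(truncate_tokens(long_text, 128)))  # only 126 tokens survive at 128
print(len(truncate_tokens(long_text, 512)))  # all 300 tokens fit under 512
```

Note that WordPiece splits rare words into multiple subword tokens, so the real token count of a text is usually higher than its word count.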

Inference Settings

No dedicated inference-time settings. The model classifies text deterministically using the loaded checkpoint.
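The deterministic classification step can be sketched as a softmax over the model's output logits followed by an argmax: the same input always yields the same probabilities, with no sampling or temperature involved. The logit values below are hypothetical, and the three-class head mirrors the sentiment example from the use cases above.

```python
import math

def softmax(logits: list[float]) -> list[float]:
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits from a 3-class head (positive / negative / neutral).
logits = [2.1, -0.4, 0.3]
probs = softmax(logits)

# Deterministic: repeated calls on the same logits give identical probabilities.
assert softmax(logits) == probs

label_index = probs.index(max(probs))  # argmax selects the predicted class
print(label_index, [round(p, 3) for p in probs])
```

The probabilities sum to 1, so they can be reported directly as the per-prediction class probabilities in the output.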

