Guides

Non-Generative Models

BERT

BERT (Bidirectional Encoder Representations from Transformers) is a general-purpose language model used for a wide variety of NLP tasks. If you need one versatile model that can be fine-tuned for many different tasks, BERT is a great choice.

Example use cases: sentiment analysis, question-answering, text classification, named entity recognition (NER)

Example applications: customer feedback analysis, automated chatbot responses, location detection

You can find example code that leverages our BERT models for sentiment analysis here.
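A fine-tuned BERT sentiment classifier ultimately produces one logit per sentiment class; the predicted label is the class with the highest probability after a softmax. As a minimal sketch of that final step (the logits and three-class label set below are toy values, not output from a real model):

```python
import math

def softmax(logits):
    # Shift by the max for numerical stability, then normalize to probabilities.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy logits from a hypothetical 3-class sentiment head.
labels = ["negative", "neutral", "positive"]
logits = [0.5, 1.2, 3.1]

probs = softmax(logits)
prediction = labels[probs.index(max(probs))]  # class with the highest probability
```

In practice a library handles this step for you, but the mapping from logits to a label is the same.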

SBERT

SBERT (Sentence-BERT) modifies the BERT architecture to compare the meaning of sentences and longer texts, making it well suited for sentence-level tasks. If you need to find documents or passages with similar meanings in a large corpus, SBERT is a powerful tool.

Example use cases: sentence/document analysis, semantic similarity, information retrieval

Example applications: document clustering, contract analysis, recommendation systems, plagiarism detection

You can find example code that leverages our SBERT models for sentence similarity here.
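SBERT maps each sentence to a fixed-size embedding vector, and similarity between sentences is then scored by comparing those vectors, typically with cosine similarity. A minimal sketch of that comparison step, using toy 4-dimensional vectors in place of real SBERT embeddings (which are typically several hundred dimensions):

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors; 1.0 means same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy embeddings standing in for SBERT output for three sentences.
emb_cat = [0.9, 0.1, 0.3, 0.0]       # "A cat sat on the mat."
emb_kitten = [0.85, 0.15, 0.35, 0.05]  # "A kitten rested on the rug."
emb_invoice = [0.0, 0.2, 0.0, 0.95]  # "Please pay the attached invoice."

sim_related = cosine_similarity(emb_cat, emb_kitten)
sim_unrelated = cosine_similarity(emb_cat, emb_invoice)
# Semantically related sentences score higher than unrelated ones.
```

The same scoring extends to retrieval and clustering: embed every document once, then rank or group by cosine similarity.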