🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
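A minimal sketch of the library's pipeline API for quick inference; the default sentiment-analysis checkpoint is chosen by the library and is an assumption here, not part of the description above.

```python
# pip install transformers torch
from transformers import pipeline

# Omitting the model name lets the library pick its default
# sentiment-analysis checkpoint (downloaded on first use).
classifier = pipeline("sentiment-analysis")

print(classifier("BERT-based models make this task easy."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```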
AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) into pipelines or agents that can interact with your data. With advanced retrieval methods, it is best suited for building RAG, question answering, semantic search, or conversational agent chatbots.
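A minimal retrieval-pipeline sketch, assuming the Haystack 2.x component API; the in-memory store, documents, and query are illustrative placeholders.

```python
# pip install haystack-ai
from haystack import Pipeline, Document
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever

# Index a couple of toy documents in an in-memory store.
store = InMemoryDocumentStore()
store.write_documents([
    Document(content="BERT is a bidirectional transformer encoder."),
    Document(content="RoBERTa is a robustly optimized BERT variant."),
])

# Wire a single BM25 retriever component into a pipeline.
pipe = Pipeline()
pipe.add_component("retriever", InMemoryBM25Retriever(document_store=store))

result = pipe.run({"retriever": {"query": "What is BERT?"}})
print(result["retriever"]["documents"][0].content)
```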
🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
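A client-side sketch, assuming a CLIP server is already running locally on the default gRPC port; the address and inputs are placeholders.

```python
# pip install clip-client  (the server runs separately, e.g. `python -m clip_server`)
from clip_client import Client

# Address is an assumption; point it at your running clip_server instance.
c = Client("grpc://0.0.0.0:51000")

# Sentences and image URLs are embedded into the same CLIP vector space.
vectors = c.encode([
    "a photo of a surfer riding a wave",
    "https://example.com/surfer.jpg",  # placeholder URL
])
print(vectors.shape)
```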
Easy-to-use and powerful LLM and SLM library with an extensive model zoo.
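A one-line sketch using PaddleNLP's Taskflow convenience API; the sentiment-analysis task name follows the library's documented usage, but the exact default model it loads is an assumption.

```python
# pip install paddlepaddle paddlenlp
from paddlenlp import Taskflow

# Taskflow wraps a pre-trained model behind a single task interface.
senta = Taskflow("sentiment_analysis")
print(senta("这家餐厅的菜很好吃"))  # "The food at this restaurant is delicious."
```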
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series).
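These checkpoints are commonly consumed through 🤗 Transformers; a fill-mask sketch assuming the hfl/chinese-bert-wwm-ext checkpoint published on the Hugging Face Hub.

```python
# pip install transformers torch
from transformers import pipeline

# Checkpoint name is an assumption; the HFL org publishes several wwm variants.
fill = pipeline("fill-mask", model="hfl/chinese-bert-wwm-ext")
print(fill("哈尔滨是[MASK]龙江的省会。"))  # predict the masked character ("黑")
```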
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
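A notebook-oriented sketch of BertViz's head_view; it needs the model loaded with output_attentions=True, and the model name and sentence are illustrative.

```python
# pip install bertviz transformers torch  (run inside a Jupyter notebook)
from transformers import AutoTokenizer, AutoModel
from bertviz import head_view

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer("The cat sat on the mat", return_tensors="pt")
outputs = model(**inputs)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
head_view(outputs.attentions, tokens)  # renders an interactive attention view
```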
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
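A minimal BERTopic sketch in the style of the library's quick start; the 20 newsgroups corpus is a stand-in, since topic quality depends on having a reasonably large document collection.

```python
# pip install bertopic scikit-learn
from bertopic import BERTopic
from sklearn.datasets import fetch_20newsgroups

# Any list of strings works; a real corpus should have hundreds of documents.
docs = fetch_20newsgroups(subset="all", remove=("headers", "footers", "quotes"))["data"]

topic_model = BERTopic()
topics, probs = topic_model.fit_transform(docs)
print(topic_model.get_topic_info().head())  # one row per discovered topic
```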
PyTorch implementation of Google AI's 2018 BERT.
Official implementations for various pre-training models of ERNIE-family, covering topics of Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
TensorFlow solution for the NER task using a BiLSTM-CRF model with Google BERT fine-tuning, plus private server deployment.
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard.
ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations, with large-scale pre-trained Chinese ALBERT models.
Minimal keyword extraction with BERT
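A minimal KeyBERT sketch; the document text and parameter values are illustrative.

```python
# pip install keybert
from keybert import KeyBERT

doc = (
    "Supervised learning is the machine learning task of learning a function "
    "that maps an input to an output based on example input-output pairs."
)

kw_model = KeyBERT()  # defaults to a small sentence-transformers model
keywords = kw_model.extract_keywords(
    doc, keyphrase_ngram_range=(1, 2), stop_words="english", top_n=5
)
print(keywords)  # list of (phrase, similarity score) pairs
```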
Open-source pre-training model framework in PyTorch and pre-trained model zoo.
Chinese-LLaMA 1 & 2 and Chinese-Falcon base models; the ChatFlow Chinese dialogue model; a Chinese OpenLLaMA model; and NLP pre-training / instruction fine-tuning datasets.
Top2Vec learns jointly embedded topic, document and word vectors.
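A minimal Top2Vec sketch; training joint document, word, and topic vectors needs a sizeable corpus, so 20 newsgroups is used as a stand-in and the speed/worker settings are illustrative.

```python
# pip install top2vec scikit-learn
from sklearn.datasets import fetch_20newsgroups
from top2vec import Top2Vec

docs = fetch_20newsgroups(subset="all", remove=("headers", "footers", "quotes"))["data"]

# Default settings train doc2vec embeddings from scratch on the corpus.
model = Top2Vec(documents=docs, speed="learn", workers=4)

topic_words, word_scores, topic_nums = model.get_topics()
print(model.get_num_topics(), topic_words[0][:10])
```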
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
RoBERTa for Chinese: pre-trained Chinese RoBERTa models.