This repository contains a pipeline for fine-tuning a BERT-LLM model on a classification task and serving the model using the Union AI workflow and inference platform.
The quickest way to set up and run the tutorial notebook is often a hosted notebook, like Google Colab.
Alternatively, you can follow the steps below to set up the project locally.
Serverless is the easiest way to get started with Union. You can sign up for a free account with $30 of credit at Union Serverless. BYOC (Bring Your Own Cloud) is also available for advanced users who need more features. Schedule a demo to learn more about Union BYOC.
- Union Serverless Sign-up: https://www.union.ai/
- Union BYOC: https://docs.union.ai/byoc/user-guide/union-overview#union-byoc
Read more in the overview of Union Serverless and Union BYOC.
git clone https://github.com/unionai-oss/bert-llm-classification-pipeline
cd bert-llm-classification-pipeline
pip install -r requirements.txt
After you have signed up, authenticate to Union from the CLI.
If on Union Serverless
union create login --serverless --auth device-flow
If on Union BYOC (Bring Your Own Cloud)
union create login --host <union-host-url>
Now your environment is set up to run the project remotely on Union.
The tutorial notebook will guide you through the steps to fine-tune a BERT-LLM model on a classification task and serve the model using Union AI.
jupyter notebook tutorial.ipynb
Or you can run the training pipeline and serving steps from the CLI.
Train the model:
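A minimal sketch using `union run --remote`, assuming the training workflow is defined in a file such as `workflows/train.py` with a workflow named `bert_finetune_pipeline` (both names are placeholders; substitute the actual file and workflow name from this repository):

```bash
# Run the fine-tuning workflow remotely on Union
# (file path and workflow name below are placeholders)
union run --remote workflows/train.py bert_finetune_pipeline
```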
Serve the model:
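A hedged sketch, assuming the serving app is declared in a file such as `app.py` under an app name like `bert-classifier` (placeholders; use the definitions in this repository and check the Union docs for the exact serving command):

```bash
# Deploy the model-serving app to Union
# (file path and app name below are placeholders)
union deploy apps app.py bert-classifier
```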
Run Batch Inference:
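A sketch along the same lines, assuming a batch inference workflow defined in a file such as `workflows/batch_inference.py` named `batch_inference_pipeline` (placeholder names):

```bash
# Run the batch inference workflow remotely on Union
# (file path and workflow name below are placeholders)
union run --remote workflows/batch_inference.py batch_inference_pipeline
```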
Run Near Real-time Batch Inference with Actors:
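A sketch for the actor-based variant, assuming the actor workflow lives in a file such as `workflows/actor_inference.py` with a workflow named `actor_batch_inference` (placeholder names; actors keep a warm container so repeated inference calls avoid cold starts):

```bash
# Run the near real-time batch inference workflow that uses Union Actors
# (file path and workflow name below are placeholders)
union run --remote workflows/actor_inference.py actor_batch_inference
```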