Code Server Use Case
This guide will walk you through fine-tuning a pre-trained BERT model on the GLUE MRPC task using a GPU-enabled Code Server container.
Step 1: Create a GPU container using Code Server
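
The exact creation flow depends on your platform. If your environment lets you launch containers with Docker directly, a GPU-enabled Code Server container might be started along these lines (the image tag, port, and volume mount here are illustrative assumptions, not platform specifics):

```bash
# Illustrative only: start a GPU-enabled code-server container with Docker.
# Assumes the host has the NVIDIA Container Toolkit installed.
docker run -d --name code-server --gpus all \
  -p 8080:8080 \
  -v "$PWD:/home/coder/project" \
  codercom/code-server:latest
```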

Step 2: Install Python 3 and pip
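
Assuming the container uses a Debian/Ubuntu base image, a typical install looks like:

```bash
sudo apt-get update
sudo apt-get install -y python3 python3-pip python3-venv
```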
Step 3: Create a virtual environment in which to install the required Python packages and run the training code
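
For example (the ~/venv path is an illustrative choice):

```bash
python3 -m venv ~/venv
source ~/venv/bin/activate
```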
Step 4: Install the required Python packages
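
The original package list is not given here; for the GLUE fine-tuning script used below, a plausible set is (the exact packages are an assumption):

```bash
pip install --upgrade pip
# PyTorch plus the libraries the Transformers GLUE example depends on
pip install torch datasets evaluate scikit-learn accelerate
```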
Step 5: Clone the Hugging Face Transformers repository from GitHub
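
```bash
git clone https://github.com/huggingface/transformers.git
cd transformers
```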

Step 6: Install the Transformers package from the cloned repository
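
With the repository cloned, install Transformers from source, plus the example's requirements file (the path below matches current layouts of the repo but may differ across versions):

```bash
pip install .
pip install -r examples/pytorch/text-classification/requirements.txt
```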
Step 7: Fine-tune BERT on GLUE MRPC. Your output will be stored at /tmp/bert-finetuned
In this step, you will fine-tune the pre-trained BERT model on the Microsoft Research Paraphrase Corpus (MRPC) task from the GLUE benchmark. This means the model will learn to classify whether two sentences are paraphrases (have the same meaning) or not.
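
A minimal sketch of the training command, adapted from the Transformers text-classification example (the bert-base-cased checkpoint and the hyperparameters are illustrative defaults, not values from the original guide):

```bash
python examples/pytorch/text-classification/run_glue.py \
  --model_name_or_path bert-base-cased \
  --task_name mrpc \
  --do_train \
  --do_eval \
  --max_seq_length 128 \
  --per_device_train_batch_size 32 \
  --learning_rate 2e-5 \
  --num_train_epochs 3 \
  --output_dir /tmp/bert-finetuned
```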

Step 8: Create a test script called test.py and insert the code below.
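
The original script is not reproduced here; a minimal sketch that loads the fine-tuned model from /tmp/bert-finetuned and classifies one illustrative sentence pair might look like this:

```python
# test.py -- minimal sketch; the example sentence pair is illustrative
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_dir = "/tmp/bert-finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForSequenceClassification.from_pretrained(model_dir)
model.eval()

sentence1 = "The company reported record profits this quarter."
sentence2 = "Record profits were reported by the company this quarter."

# Encode the sentence pair the same way MRPC training does
inputs = tokenizer(sentence1, sentence2, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# For MRPC, label 1 means "paraphrase" and label 0 means "not a paraphrase"
prediction = logits.argmax(dim=-1).item()
print("paraphrase" if prediction == 1 else "not a paraphrase")
```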
Step 9: Run test.py to test the fine-tuned model
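
With the virtual environment still active:

```bash
python test.py
```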
