How to Use an AWS EC2 Instance as a Self-hosted Runner for Continuous Machine Learning

DagsHub + Actions + EC2 + CML

Main Concepts Covered

After working through this repository, you will understand how to:

  • Provision an AWS EC2 instance and run the training of a BERT model with CML.
  • Implement a GitHub Actions pipeline that uses the provisioned instance.
  • Automatically log your model metrics with MLflow.
  • Automatically save your training metadata with DVC for easy tracking.
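The pipeline described above can be sketched as a two-job GitHub Actions workflow: the first job launches the EC2 runner with CML, and the second runs the training on it. This is a minimal illustration, not the exact workflow of this repository; the script name `train.py`, the report file, the runner label, and the secret names are assumptions:

```yaml
name: train-model
on: [push]

jobs:
  launch-runner:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: iterative/setup-cml@v1
      - name: Launch EC2 runner via CML
        env:
          REPO_TOKEN: ${{ secrets.PERSONAL_ACCESS_TOKEN }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        run: |
          cml runner launch \
            --cloud=aws \
            --cloud-region=us-east-1 \
            --cloud-type=t2.micro \
            --labels=cml-ec2

  train:
    needs: launch-runner
    runs-on: [self-hosted, cml-ec2]  # picks up the runner launched above
    steps:
      - uses: actions/checkout@v3
      - name: Train and report
        env:
          REPO_TOKEN: ${{ secrets.PERSONAL_ACCESS_TOKEN }}
        run: |
          pip install -r requirements.txt
          python train.py
          cml comment create report.md
```

The `--cloud-type` and `--cloud-region` flags correspond to the EC2 instance properties listed in the prerequisites below.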

MLOps Workflow

Getting started

1. Prerequisites

Platforms

AWS EC2 instance properties

A free-tier instance is enough for this use case. Below are the properties of the EC2 instance used, expressed as the corresponding CML runner flags:

  • cloud type: t2.micro
  • cloud region: us-east-1a
Other platforms

Other resources

  • Python 3.9.1
  • DVC 2.11
  • CML
  • All additional dependencies are listed in the requirements.txt file
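Since DVC is used to version the training metadata, the training step can be wired into a DVC pipeline so that its dependencies and outputs are tracked automatically. A minimal `dvc.yaml` stage might look like the following sketch; the script, data, model, and metrics paths are hypothetical, not taken from this repository:

```yaml
stages:
  train:
    cmd: python train.py          # hypothetical training entry point
    deps:
      - train.py
      - data/                     # raw training data tracked by DVC
    outs:
      - models/model.pt           # trained BERT weights
    metrics:
      - metrics.json:             # F1, precision, recall written by train.py
          cache: false            # keep metrics in Git for easy diffing
```

With such a stage in place, `dvc repro` re-runs training only when a dependency changes, and `dvc push` uploads the resulting artifacts to the remote storage.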

Results On DagsHub

DagsHub provides the capability to use MLflow and DVC while letting you keep working on GitHub. The following results come from the experiments tracked on DagsHub, using MLflow to record the model's F1 score, precision, and recall.

MLflow metrics before and after the pull request.

The following graphs show the performance of the model for different numbers of epochs.

Before the pull request:

  • Epoch = 1
  • F1 score = 0.89
  • Precision = 0.88
  • Recall = 0.85

After the pull request:

⚠️ Beware that the wall time for one epoch of training is about 2 h 45 min, so expect roughly double that for two epochs.

  • Epoch = 2
  • F1 score = 0.92
  • Precision = 0.90
  • Recall = 0.91

Metrics before and after the pull request

Full Article On Medium

Read the full article on my Medium and follow me for more content.
