Zoumana Keita
How to improve your model training with DagsHub Direct Data Access on an EC2 instance

DagsHub + Actions + EC2 + CML

Main Concepts Covered

After going through this repository, you will be able to:

  • Provision an AWS EC2 instance and run the training of a BERT model with CML.
  • Apply DagsHub Direct Data Access to improve your training process.
  • Implement a GitHub Actions pipeline that uses the provisioned instance.
  • Automatically log your model's metrics with MLflow.
  • Compare model performance with MLflow Experiments.
  • Automatically save your training metadata with DVC for easy tracking.

Workflow Comparison: Streaming Vs. Regular Approach

Regular Approach

This approach can take a significant amount of time, because the pull instruction must download the entire dataset before training can begin.

Regular Workflow
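In code terms, the regular workflow is strictly sequential: nothing trains until the pull finishes. A toy Python sketch of that behavior (the `download_all` and doubling "training step" are illustrative stand-ins, not the repo's actual code):

```python
import time

def download_all(batches):
    # Stand-in for `dvc pull`: the whole dataset must land locally first.
    time.sleep(0.001 * len(batches))  # simulated transfer time
    return list(batches)

def train_regular(batches):
    data = download_all(batches)    # blocks until every file is downloaded
    return [b * 2 for b in data]    # only then does training start

print(train_regular([1, 2, 3]))  # → [2, 4, 6]
```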

Streaming Approach

Streaming reduces the cost of data collection by introducing a parallel computation approach: model training starts while the data is still being fetched from storage.

Streaming Workflow
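In the repo itself this overlap comes from DagsHub's streaming client (at the time of writing, `install_hooks()` from the `dagshub.streaming` module). The core idea can be sketched in plain Python: a background thread fetches batches while the trainer consumes whatever has already arrived (all names here are illustrative, not DagsHub's API):

```python
import queue
import threading

def fetch(batches, q):
    # Stand-in for the storage side: each batch is enqueued as soon as it arrives.
    for b in batches:
        q.put(b)
    q.put(None)  # sentinel: no more data

def train_streaming(batches):
    # Training consumes batches as they arrive instead of waiting for a full pull.
    q = queue.Queue()
    threading.Thread(target=fetch, args=(batches, q), daemon=True).start()
    results = []
    while (b := q.get()) is not None:
        results.append(b * 2)  # stand-in for one training step
    return results

print(train_streaming([1, 2, 3]))  # → [2, 4, 6]
```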

Getting started

1. Prerequisites

Platforms

AWS EC2 instance properties

A free-tier instance is enough for this use case. Below are the properties of the EC2 instance used:

  • cloud-type: t2.micro
  • cloud-region: us-east-1a
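These properties map onto CML's cloud-runner flags. A hedged GitHub Actions sketch of provisioning such a runner (the job name, label, and secret names are illustrative; `--cloud-region` takes the region, here us-east-1, while the zone listed above is us-east-1a; the exact `cml runner` subcommand may vary by CML version):

```yaml
deploy-runner:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v3
    - uses: iterative/setup-cml@v1
    - name: Provision the EC2 training runner
      env:
        REPO_TOKEN: ${{ secrets.PERSONAL_ACCESS_TOKEN }}
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      run: |
        cml runner \
          --cloud=aws \
          --cloud-region=us-east-1 \
          --cloud-type=t2.micro \
          --labels=cml-ec2-runner
```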
Other platforms

Other resources

  • Python 3.9.1
  • DVC 2.11
  • CML
  • You can find all the additional information in the requirements.txt file

Results On DagsHub

DagsHub provides the capability to use MLflow and DVC while still letting you work on GitHub. The following results are the experiments from DagsHub, using MLflow to track the model's F1-score, precision, and recall.

MLflow metrics for each epoch.
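As a reminder of what those tracked metrics measure, here is a minimal computation of precision, recall, and F1 from confusion counts. In the repo they are logged per epoch through MLflow; the usual pattern is `mlflow.log_metric(name, value, step=epoch)`, shown only as a comment since it needs a tracking server:

```python
def precision_recall_f1(tp, fp, fn):
    # Precision: of everything predicted positive, how much was right.
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    # Recall: of everything actually positive, how much was found.
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # F1: harmonic mean of precision and recall.
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Typical per-epoch logging against a tracking URI (e.g. the DagsHub one):
# mlflow.log_metric("f1", f1, step=epoch)

print(precision_recall_f1(8, 2, 2))
```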

The following graphic shows the performance of the models over 3 epochs.

Model Performance for 3 epochs

General workflow

Below is the general workflow of the pipeline.

General workflow of the pipeline

Full Article Coming Soon On Medium


About

Illustrates how to use the DagsHub Streaming Client to train a model on an AWS EC2 instance.
