
## Experiment Tracking

First, configure a DVC remote, either locally or remotely (e.g., on DagsHub).
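A minimal sketch of the remote setup, assuming a DagsHub-hosted DVC remote; the username, repository name, and token below are placeholders, not values from this project:

```shell
# Register an HTTP remote named "origin" (URL is a placeholder)
dvc remote add origin https://dagshub.com/<username>/<repo>.dvc

# Store credentials locally (kept out of version control via .dvc/config.local)
dvc remote modify origin --local auth basic
dvc remote modify origin --local user <username>
dvc remote modify origin --local password <token>
```

For a purely local remote, `dvc remote add -d localremote /path/to/storage` works instead.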

```
!dvc repro
```

Then run the command above to execute the data preprocessing and training experiment pipelines defined in this notebook.
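`dvc repro` executes the stages declared in the repository's `dvc.yaml`. As an illustration only (the actual stage names, scripts, and paths in this repo may differ), a two-stage preprocessing-and-training pipeline might look like:

```yaml
stages:
  preprocess:
    cmd: python src/preprocess.py
    deps:
      - src/preprocess.py
      - data/raw
    outs:
      - data/processed
  train:
    cmd: python src/train.py
    deps:
      - src/train.py
      - data/processed
    outs:
      - models/model.pkl
    metrics:
      - metrics.json:
          cache: false
```

DVC compares the checksums of each stage's `deps` and re-runs only the stages whose inputs changed.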

## Workflow Architecture

This is a high-level architecture diagram of the data preprocessing and training experiment workflow.



## About

This project uses open-source tools (DagsHub, MLflow, DVC) to demonstrate the "models/data management" workflow and process in the MLOps lifecycle.
