End-to-End-Chest-Cancer-Classification-using-MLflow-DVC

Workflows

  1. Update config.yaml
  2. Update secrets.yaml [Optional]
  3. Update params.yaml
  4. Update the entity
  5. Update the configuration manager in src config
  6. Update the components
  7. Update the pipeline
  8. Update main.py
  9. Update dvc.yaml
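Steps 4 and 5 above follow a common entity / configuration-manager pattern. A minimal stdlib-only sketch of that pattern (the class names, fields, and config keys below are hypothetical illustrations, not taken from the repo; a real implementation would parse config.yaml rather than use an inline dict):

```python
from dataclasses import dataclass
from pathlib import Path

# Hypothetical entity (step 4): a frozen, typed record of one stage's settings.
@dataclass(frozen=True)
class DataIngestionConfig:
    root_dir: Path
    source_url: str

# Hypothetical configuration manager (step 5): turns the raw dict that would
# normally be parsed from config.yaml into typed entity objects.
class ConfigurationManager:
    def __init__(self, config: dict):
        self.config = config

    def get_data_ingestion_config(self) -> DataIngestionConfig:
        section = self.config["data_ingestion"]
        return DataIngestionConfig(
            root_dir=Path(section["root_dir"]),
            source_url=section["source_url"],
        )

# Stand-in for the parsed contents of config.yaml.
raw_config = {
    "data_ingestion": {
        "root_dir": "artifacts/data_ingestion",
        "source_url": "https://example.com/data.zip",
    }
}
cfg = ConfigurationManager(raw_config).get_data_ingestion_config()
print(cfg.root_dir)
```

The components (step 6) then receive these typed entities instead of raw dicts, which keeps each pipeline stage's inputs explicit.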

MLflow

Command:

  • mlflow ui

DagsHub

Run your script with the DagsHub tracking credentials set inline:

MLFLOW_TRACKING_URI=https://dagshub.com/trehansalil/toxicity_detection.mlflow MLFLOW_TRACKING_USERNAME=trehansalil MLFLOW_TRACKING_PASSWORD=88855b61c077c3a7538eda58ac1a8a33eb4d1098 python script.py

Or export them as environment variables first:

export MLFLOW_TRACKING_URI=https://dagshub.com/trehansalil/toxicity_detection.mlflow
export MLFLOW_TRACKING_USERNAME=trehansalil
export MLFLOW_TRACKING_PASSWORD=88855b61c077c3a7538eda58ac1a8a33eb4d1098
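The MLflow client reads these values from the process environment, so no code change is needed to switch tracking servers. A stdlib-only sketch of what that lookup amounts to (the `resolve_tracking_uri` helper is hypothetical, written only to illustrate the environment-variable precedence):

```python
import os

# Set the same variable the export command above would set
# (username/password omitted here; use your own credentials).
os.environ["MLFLOW_TRACKING_URI"] = (
    "https://dagshub.com/trehansalil/toxicity_detection.mlflow"
)

# Hypothetical helper mirroring the resolution order: the environment
# variable wins over any local default such as a file-based store.
def resolve_tracking_uri(default: str = "file:./mlruns") -> str:
    return os.environ.get("MLFLOW_TRACKING_URI", default)

print(resolve_tracking_uri())
```

With the variable unset, the same helper would fall back to the local default, which is handy for offline experiments.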

DVC commands

  1. dvc init
  2. dvc repro
  3. dvc dag
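`dvc repro` executes the stage graph defined in dvc.yaml, and `dvc dag` visualises it. A sketch of what one stage definition might look like (the stage name, script path, and artifact paths below are illustrative, not copied from the repo):

```yaml
# dvc.yaml -- illustrative sketch of a single stage
stages:
  data_ingestion:
    cmd: python src/pipeline/stage_01_data_ingestion.py
    deps:
      - src/pipeline/stage_01_data_ingestion.py
      - config/config.yaml
    outs:
      - artifacts/data_ingestion
```

Because DVC hashes the `deps` and `outs`, `dvc repro` reruns a stage only when one of its inputs actually changes.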

About MLflow & DVC

MLflow

  • It is production grade
  • Tracks all of your experiments
  • Logs and tags your models

DVC

  • It is very lightweight, suited for POCs
  • A lightweight experiment tracker
  • It can perform orchestration (creating pipelines)

AWS CI/CD Deployment with GitHub Actions

1. Log in to the AWS console.

2. Create an IAM user for deployment

With specific access to:

  • EC2: the virtual machine service
  • ECR: Elastic Container Registry, used to store your Docker images in AWS


Description of the deployment:

1. Build the Docker image of the source code

2. Push the Docker image to ECR

3. Launch your EC2 instance

4. Pull the image from ECR onto EC2

5. Launch the Docker container on EC2

Required policies:

  • AmazonEC2ContainerRegistryFullAccess
  • AmazonEC2FullAccess

3. Create an ECR repository to store the Docker image

- Save the URI: 566373416292.dkr.ecr.us-east-1.amazonaws.com/chicken

4. Create an EC2 machine (Ubuntu)

5. Open EC2 and install Docker on the EC2 machine:

Optional:

sudo apt-get update -y

sudo apt-get upgrade

Required:

curl -fsSL https://get.docker.com -o get-docker.sh

sudo sh get-docker.sh

sudo usermod -aG docker ubuntu

newgrp docker

6. Configure EC2 as a self-hosted runner:

Settings > Actions > Runners > New self-hosted runner > choose OS > run the displayed commands one by one

7. Set up GitHub secrets:

AWS_ACCESS_KEY_ID=

AWS_SECRET_ACCESS_KEY=

AWS_REGION = us-east-1

AWS_ECR_LOGIN_URI = 566373416292.dkr.ecr.ap-south-1.amazonaws.com (demo value)

ECR_REPOSITORY_NAME = simple-app
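A workflow running on the self-hosted runner can then consume these secrets to build and push the image. A minimal sketch, assuming a single build-and-push job (the file path, job name, and step layout are illustrative, not taken from the repo's actual workflow):

```yaml
# .github/workflows/deploy.yml -- illustrative sketch only
name: deploy
on:
  push:
    branches: [main]

jobs:
  build-and-push:
    runs-on: self-hosted
    env:
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      AWS_DEFAULT_REGION: ${{ secrets.AWS_REGION }}
    steps:
      - uses: actions/checkout@v4

      - name: Log in to ECR
        run: |
          aws ecr get-login-password --region "$AWS_DEFAULT_REGION" \
            | docker login --username AWS --password-stdin "${{ secrets.AWS_ECR_LOGIN_URI }}"

      - name: Build and push the image
        run: |
          docker build -t "${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest" .
          docker push "${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest"
```

Keeping the account-specific values in secrets means the same workflow file works unchanged across regions and repositories.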

Project Workflows

  • constants
  • config_entity
  • artifact_entity
  • components
  • pipeline
  • app.py

Setup Methodology for the Project

  • Environment setup:
python3 -m venv env
  • Environment activation:
source env/bin/activate
  • Install the requirements file:
pip install -r requirements.txt

About

This project uses a Kaggle data source and is meant for educational purposes only!
