- Update config.yaml
- Update secrets.yaml [Optional]
- Update params.yaml
- Update the entity
- Update the configuration manager in `src/config`
- Update the components
- Update the pipeline
- Update main.py
- Update dvc.yaml
- Update app.py
Clone the repository:

```bash
git clone https://github.com/ishumann/Kidney-Disease-Classification
```
Create and activate a conda environment:

```bash
conda create -n kidney python=3.8 -y
conda activate kidney
```

(Alternatively, create the environment inside the project folder with `conda create -p kidney python=3.8 -y`.)

Install the requirements:

```bash
pip install -r requirements.txt
```

Launch the MLflow UI:

```bash
mlflow ui
```
- Add your GitHub repo to [DagsHub](https://dagshub.com/) and get the following credentials
```bash
export MLFLOW_TRACKING_URI=https://dagshub.com/ishumann/Kidney-Disease-Classification.mlflow
export MLFLOW_TRACKING_USERNAME=ishumann
export MLFLOW_TRACKING_PASSWORD=15495e98cd344df629c4d470b95d7bac88585699
```
Note: these credentials are not real. Create your own account and get your credentials from DagsHub.
1. `dvc init`
2. `dvc repro`
3. `dvc dag`
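`dvc repro` runs the stages declared in `dvc.yaml`, re-executing only the ones whose dependencies changed. A minimal hypothetical stage definition (the script and artifact paths here are illustrative, not necessarily this repo's exact layout) looks like:

```yaml
stages:
  data_ingestion:
    cmd: python src/pipeline/stage_01_data_ingestion.py
    deps:
      - src/pipeline/stage_01_data_ingestion.py
      - config/config.yaml
    outs:
      - artifacts/data_ingestion
  training:
    cmd: python src/pipeline/stage_02_training.py
    deps:
      - artifacts/data_ingestion
      - params.yaml
    outs:
      - artifacts/model
```

`dvc dag` then renders these stage dependencies as a graph.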
MLflow
- It is production grade
- Tracks all of your experiments
- Logs and tags your models
DVC
- It is very lightweight, for POC only
- Lightweight experiment tracker
- It can perform orchestration (creating pipelines)
# with specific access
1. EC2 access: EC2 is a virtual machine
2. ECR: Elastic Container Registry, to save your Docker images in AWS
# Description: about the deployment
1. Build the Docker image of the source code
2. Push your Docker image to ECR
3. Launch your EC2 instance
4. Pull your image from ECR onto EC2
5. Launch your Docker image in EC2
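The five steps above can be sketched as a command sequence. This is illustrative only: the ECR URI matches the example saved below, while the `kidney` local image tag and port `8080` are assumptions; steps 4-5 run on the EC2 instance, not on your machine.

```bash
# Illustrative only: replace the URI, tag, and port with your own values.
ECR_LOGIN_URI=050407812497.dkr.ecr.us-east-1.amazonaws.com
REPO=kidney
REGION=us-east-1

# Steps 1-2, on your machine or CI runner: build, log in to ECR, push
docker build -t kidney .
aws ecr get-login-password --region $REGION | \
  docker login --username AWS --password-stdin $ECR_LOGIN_URI
docker tag kidney:latest $ECR_LOGIN_URI/$REPO:latest
docker push $ECR_LOGIN_URI/$REPO:latest

# Steps 4-5, on the EC2 instance: pull and run the image
docker pull $ECR_LOGIN_URI/$REPO:latest
docker run -d -p 8080:8080 $ECR_LOGIN_URI/$REPO:latest
```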
# Policy:
1. AmazonEC2ContainerRegistryFullAccess
2. AmazonEC2FullAccess
- Save the URI: 050407812497.dkr.ecr.us-east-1.amazonaws.com/kidney
```bash
# optional
sudo apt-get update -y
sudo apt-get upgrade

# required
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker ubuntu
newgrp docker
```
Settings > Actions > Runners > New self-hosted runner > choose your OS, then run the displayed commands one by one.
```
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=us-east-1
AWS_ECR_LOGIN_URI=
ECR_REPOSITORY_NAME=
```
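These secrets are read by the CI workflow on the self-hosted runner. A hypothetical fragment of `.github/workflows/main.yaml` (the job name, step layout, and action versions are illustrative, not this repo's exact file) could consume them like this:

```yaml
jobs:
  build-and-push:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v3

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_REGION }}

      - name: Log in to Amazon ECR
        uses: aws-actions/amazon-ecr-login@v1

      - name: Build, tag, and push the image
        run: |
          IMAGE="${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest"
          docker build -t "$IMAGE" .
          docker push "$IMAGE"
```

The secret names in `${{ secrets.… }}` must match the names configured above exactly.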