Through their own integrations with MLflow, Giskard and DagsHub can be combined to offer a rich environment for visualising, tracking, and collaborating on ML projects. For more details on the integration, check this page.
This repository includes two notebooks which you can run in Colab to see how Giskard detects vulnerabilities in ML and LLM models.
Giskard is an open-source testing framework dedicated to ML models, covering any Python model, from tabular models to LLMs.
Testing machine learning applications can be tedious: because ML models depend on data, test scenarios depend on domain specifics and are effectively infinite.
As discussed on this page, the Giskard–MLflow integration via the evaluation API aims to provide automated vulnerability detection for tabular and NLP models as well as LLMs.
In conjunction, DagsHub provides a free hosted MLflow server with team-based access control for every repository. You can log experiments with MLflow to it, view the information under the experiment tab, and manage your trained models from the full-fledged MLflow UI built into your DagsHub project. See this page for more details.
Check out an LLM example here: https://dagshub.com/Dean/Giskard-Integration-Demo.mlflow/#/experiments/0/runs/abef3a4e5e2540afb6fd7d0157af7ae5