To familiarize yourself with the AI for Continuous Integration (AI4CI) project, see the Get Started guide.
To quantify and evaluate the current state of the CI workflow, we have begun establishing and collecting the relevant metrics and key performance indicators (KPIs). We maintain a list of the metrics we currently collect, and we encourage contributions that develop additional KPIs and metrics.
- To propose an additional KPI, open a KPI Request issue. Specify the KPI you wish to collect with the prefix
- To add a notebook that fulfills one of the existing open KPI Request issues, use the KPI template notebook. The template notebook provides helper functions and examples to make contributing new metrics as simple and uniform as possible.
- When defining the file prefix for your metrics stored in the shared Ceph instance, please be sure to use the following format:
- Submit a Pull Request to the project repo with your KPI analysis notebook.
- To add the notebook to the automated Kubeflow workflow, follow the instructions in the guide.
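As an illustration of the kind of prefix convention the steps above call for, the helper below builds an object key for a metric file. The `metrics/<name>/<date>.<ext>` layout is a hypothetical sketch, not the project's mandated format; follow the exact prefix specified in the project docs or your KPI Request issue.

```python
from datetime import date

def metric_object_key(metric_name: str, run_date: date, ext: str = "parquet") -> str:
    """Build an object key for a metric file in the shared Ceph (S3-compatible) bucket.

    NOTE: the "metrics/<name>/<date>.<ext>" layout here is a hypothetical
    illustration, not the project's mandated prefix format. Namespacing by
    metric name and run date keeps separate runs from overwriting each other.
    """
    return f"metrics/{metric_name}/{run_date.isoformat()}.{ext}"

# Example with a hypothetical metric name:
key = metric_object_key("time-to-merge", date(2021, 6, 1))
# key == "metrics/time-to-merge/2021-06-01.parquet"
```

Keeping the key construction in one helper means every notebook that writes metrics produces consistently named files.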
With the necessary KPIs available to quantify and evaluate the CI workflow, we can start to apply AI and machine learning techniques to help improve it. We encourage you to contribute to this work by developing additional machine learning analyses or by adding features to the existing ones.
- To propose an additional ML analysis, open an ML Request issue. Specify the machine learning application you would like included with the prefix
- When uploading your model to the shared Ceph storage instance, please use the following prefix format to ensure no files get overwritten:
- Submit a Pull Request to the project repo with your ML analysis notebook.
- In order to add the notebook to the automated Kubeflow workflow, follow instructions in the guide.
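The model-upload step above can be sketched the same way. The `models/<analysis>/<version>/<filename>` layout below is a hypothetical example, not the project's mandated prefix format; the point is that including a version component (a timestamp or git SHA) keeps a new upload from overwriting an earlier model.

```python
from datetime import datetime, timezone

def model_object_key(analysis_name: str, version: str, filename: str = "model.joblib") -> str:
    """Build an object key for a trained model in the shared Ceph bucket.

    NOTE: the "models/<analysis>/<version>/<filename>" layout here is a
    hypothetical illustration, not the project's mandated prefix format.
    A unique version segment ensures uploads never overwrite older models.
    """
    return f"models/{analysis_name}/{version}/{filename}"

# Example: version the upload with a UTC timestamp (analysis name is hypothetical)
version = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
key = model_object_key("build-failure-prediction", version)
```

The resulting key can then be passed to whatever S3-compatible client (e.g. boto3) the notebook uses to write to the Ceph bucket.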