dashboard: document caveats #33

dashboard.md: 16 changes (12 additions, 4 deletions)

This dashboard shows the status of our CI jobs across our four primary repositories.

| Repository | Branch | Job |
|------------|--------|-----|
| Core | `main` | [![E2E (NVIDIA L4 x1)](https://github.com/instructlab/instructlab/actions/workflows/e2e-nvidia-l4-x1.yml/badge.svg?branch=main)](https://github.com/instructlab/instructlab/actions/workflows/e2e-nvidia-l4-x1.yml) |

| Repository | Branch | Job |
|------------|--------|-----|
| Training | `main` | [![E2E (NVIDIA L40S x4)](https://github.com/instructlab/training/actions/workflows/e2e-nvidia-l40s-x4.yml/badge.svg?branch=main)](https://github.com/instructlab/training/actions/workflows/e2e-nvidia-l40s-x4.yml) |
| | `release-v0.11` | [![E2E (NVIDIA L40S x4)](https://github.com/instructlab/training/actions/workflows/e2e-nvidia-l40s-x4.yml/badge.svg?branch=release-v0.11)](https://github.com/instructlab/training/actions/workflows/e2e-nvidia-l40s-x4.yml)|
| | `release-v0.10` | [![E2E (NVIDIA L40S x4)](https://github.com/instructlab/training/actions/workflows/e2e-nvidia-l40s-x4.yml/badge.svg?branch=release-v0.10)](https://github.com/instructlab/training/actions/workflows/e2e-nvidia-l40s-x4.yml)|

## 💡 Notes

GitHub only schedules cron jobs for our `main` branches; the `release-*` branches' jobs do not run on a cron schedule (see the [docs](https://docs.github.com/en/actions/writing-workflows/choosing-when-your-workflow-runs/events-that-trigger-workflows#schedule)).
* As a workaround, we have begun writing secondary jobs that run against `main` but execute with the contents of a release branch. To do this, we hard-code a default `pr_or_branch` input value (e.g. `release-v0.26`) on the `workflow_dispatch` trigger, and the scheduled job then runs against that branch's contents ([code](https://github.com/instructlab/instructlab/pull/3435)); a sketch of this pattern follows below.
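
For illustration, here is a minimal sketch of this pattern. It is an assumption-laden example, not the exact workflow from the linked PR: the workflow name, cron expression, job steps, and the `./scripts/e2e.sh` entrypoint are placeholders, and the `release-v0.26` fallback mirrors the hard-coded default described above.

```yaml
# Hypothetical sketch of the workaround described above -- not the exact
# workflow from the linked PR. Names and steps are illustrative only.
name: E2E (release branch, scheduled from main)

on:
  # Cron only fires for the default branch, so this file lives on `main`.
  schedule:
    - cron: "0 6 * * *"
  workflow_dispatch:
    inputs:
      pr_or_branch:
        description: "PR number or branch to test"
        required: false
        default: "release-v0.26"   # hard-coded release branch

jobs:
  e2e:
    runs-on: ubuntu-latest
    steps:
      # Scheduled runs carry no workflow_dispatch inputs, so the expression
      # falls back to the same hard-coded release branch.
      - name: Checkout release branch
        uses: actions/checkout@v4
        with:
          ref: ${{ inputs.pr_or_branch || 'release-v0.26' }}
      - name: Run e2e suite
        run: ./scripts/e2e.sh   # placeholder for the real test entrypoint
```

Because scheduled runs carry no `workflow_dispatch` inputs, the expression falls back to the hard-coded branch, which is what lets a cron trigger on `main` exercise a release branch's contents.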

Sometimes the badges in the dashboard will be red because they represent *all runs*, not just the scheduled ones. In particular, a badge may reflect a work-in-progress change that a developer ran manually with `workflow_dispatch`, and those runs can fail while the developer experiments. Click through each badge link for details on the run history.

These badges reflect only the most recent run for a branch, not its status over time. You must click through the links to see whether a test is flaking over time.

This dashboard tracks only e2e runs, not unit tests or lint jobs.