1,091 changes: 589 additions & 502 deletions .github/workflows/openvmm-pr.yaml


1 change: 1 addition & 0 deletions Cargo.lock
@@ -2061,6 +2061,7 @@ dependencies = [
"serde",
"serde_json",
"target-lexicon",
"toml_edit",
"vmm_test_images",
"which 8.0.0",
]
136 changes: 136 additions & 0 deletions Guide/src/dev_guide/dev_tools/flowey/pipelines.md
@@ -69,3 +69,139 @@ Unstable parameters are for **internal use** and experimentation:
- Experimental features or debugging flags
- Internal pipeline configuration that may change frequently
- Parameters for development/testing that shouldn't be used in production

## Cross-Job Conditions

Sometimes a lightweight job needs to compute a value that other jobs use to
decide whether to run. Flowey provides two complementary tools for this:

### `Pipeline::gh_job_id_of` / `Pipeline::ado_job_id_of`

These methods return the stable CI job identifier for a given
[`PipelineJobHandle`], allowing you to reference the job in condition
expressions without hard-coding an ID:

```rust,ignore
let classify_handle = classify_job.finish();

// GitHub condition that references the classify job's output:
let gh_cond = format!(
"needs.{}.outputs.my_output != 'true'",
pipeline.gh_job_id_of(&classify_handle)
);

// ADO condition:
let ado_cond = format!(
"and(succeeded(), ne(dependencies.{}.outputs['step.my_output'], 'true'))",
pipeline.ado_job_id_of(&classify_handle)
);
```

For GitHub, `gh_job_id_of` returns the auto-generated `job{N}` ID. For ADO,
`ado_job_id_of` returns the ID set with `ado_override_job_id` if one was
configured, and the auto-generated `job{N}` otherwise.
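
Concretely, on GitHub the condition string above ends up in the generated
workflow roughly like this (a hand-written sketch assuming the classify job
received the auto-generated ID `job1`; not flowey's literal output):

```yaml
jobs:
  job1:            # classify job; ID returned by gh_job_id_of
    # ...
  job2:            # dependent job gated on the classify job's output
    needs: job1
    if: needs.job1.outputs.my_output != 'true'
    # ...
```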

### `PipelineJob::gh_set_job_output_from_env_var`

Declares a GitHub Actions job-level output backed by a `$GITHUB_ENV` variable
that a Rust step writes at runtime. Dependent jobs read it via
`needs.<job>.outputs.<name>`:

```rust,ignore
let classify_job = pipeline
    .new_job(platform, arch, "classify PR changes")
    .gh_set_job_output_from_env_var(
        "is_non_product",                        // output name
        check_pr_changes::GH_ENV_IS_NON_PRODUCT, // env var name constant
    )
    .dep_on(|ctx| check_pr_changes::Request {
        done: ctx.new_done_handle(),
    })
    .finish();
```
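
At runtime, the Rust step publishes the value through the standard
`$GITHUB_ENV` mechanism: it appends a `NAME=value` line to the file that
variable points at. A minimal sketch of that write (the helper name is
hypothetical, not flowey API):

```rust
use std::io::Write;

// Hypothetical helper: append a `NAME=value` line in the format the Actions
// runner parses from the $GITHUB_ENV file. In a real step, `sink` would be
// that file opened in append mode.
fn append_gh_env_var(
    sink: &mut impl Write,
    name: &str,
    value: &str,
) -> std::io::Result<()> {
    writeln!(sink, "{name}={value}")
}
```

The job-output declaration above then re-exports the variable to dependent
jobs as `needs.<job>.outputs.is_non_product`.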

### `PipelineJob::ado_dangerous_override_if`

Replaces the default `condition:` for an ADO job. Use this when a job should
only run if a cross-job output variable has a particular value:

```rust,ignore
vmm_tests_job.ado_dangerous_override_if(
    "and(succeeded(), not(canceled()), \
     ne(dependencies.JOB.outputs['STEP.is_non_product'], 'true'))",
);
```

## PR Change Classification (`check_pr_changes`)

`flowey_lib_hvlite::check_pr_changes` is a thin Flowey node that classifies
the PR's changed files and communicates the result to downstream jobs via
backend-native mechanisms — no GitHub API call, no external scripts.

### How it works

The same Rust classification code runs on all three backends. The only
backend-specific part is *where* the result is written so that downstream
jobs can read it:

| Backend | Classification | Cross-job result |
|---------|---------------|-----------------|
| GitHub | `git diff origin/$GITHUB_BASE_REF...HEAD` | Written to `$GITHUB_ENV` as `FLOWEY_IS_NON_PRODUCT`; declared as a job output |
| ADO | `git diff origin/$SYSTEM_PULLREQUEST_TARGETBRANCH...HEAD` | Published via `##vso[task.setvariable;isOutput=true]` |
| Local | Always "product" (conservative) | N/A |
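
For the ADO row, the step prints the documented `task.setvariable` logging
command with `isOutput=true`; dependent jobs then read the value as
`dependencies.<job>.outputs['<step>.<name>']`. A sketch of the string being
emitted (the helper name is hypothetical):

```rust
// Hypothetical helper building the ADO logging command a step prints to
// stdout; the agent parses the `##vso[...]` prefix and publishes the
// variable as a cross-job output.
fn ado_set_output_variable(name: &str, value: &str) -> String {
    format!("##vso[task.setvariable variable={name};isOutput=true]{value}")
}
```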

### Non-product buckets

Bucket patterns are defined in
`flowey/flowey_lib_hvlite/src/non_product_config.toml`. A PR is
classified as non-product only when **every** changed file matches at
least one bucket pattern. Any unmatched file — or any classification
error — conservatively marks the PR as a product change.

The current buckets are:

| Pattern | Rationale |
|---------|-----------|
| `Guide/**` | Docs tree; validated by the separate docs pipeline |
| `repo_support/**/*.py` | Repo automation scripts; no effect on product behavior |
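
The classification rule above can be sketched as a small predicate over these
patterns. This is a simplified matcher supporting only the two glob shapes in
the table, not the real config-driven implementation (edge cases such as an
empty diff are ignored here):

```rust
// Simplified sketch of the bucket rule: non-product only if *every* changed
// file matches at least one pattern; any unmatched file means "product".
fn is_non_product(changed: &[&str], buckets: &[&str]) -> bool {
    changed
        .iter()
        .all(|&file| buckets.iter().any(|&pat| glob_matches(pat, file)))
}

fn glob_matches(pattern: &str, file: &str) -> bool {
    if let Some((prefix, rest)) = pattern.split_once("/**") {
        // The file must live strictly under `prefix/`.
        let under = match file.strip_prefix(prefix).and_then(|s| s.strip_prefix('/')) {
            Some(s) => s,
            None => return false,
        };
        match rest.strip_prefix("/*") {
            // "dir/**": any file under the directory matches.
            None | Some("") => true,
            // "dir/**/*.py": additionally require the extension.
            Some(ext) => under.ends_with(ext),
        }
    } else {
        pattern == file
    }
}
```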

To add a new non-product bucket, **edit `non_product_config.toml`**.
All backends read from the same parsed config.
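
The file itself is the source of truth for the schema; purely as an
illustration, an entry might pair a glob with its rationale along these lines
(hypothetical field names, not the actual schema):

```toml
# Hypothetical shape only; consult non_product_config.toml for the real schema.
[[bucket]]
pattern = "Guide/**"
rationale = "Docs tree; validated by the separate docs pipeline"
```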

### Pipeline usage

```rust,ignore
// 1. Create the classify job
// 1. Create the classify job
let classify_job = pipeline
    .new_job(FlowPlatform::Linux(...), FlowArch::X86_64, "classify PR changes")
    .gh_set_pool(gh_hosted_x64_linux())
    .gh_set_job_output_from_env_var("is_non_product", GH_ENV_IS_NON_PRODUCT)
    .dep_on(|ctx| check_pr_changes::Request {
        done: ctx.new_done_handle(),
    })
    .finish();

// 2. Pre-compute condition strings (before mutable borrows from new_job())
let gh_skip_cond = format!(
    "needs.{}.outputs.is_non_product != 'true' && github.event.pull_request.draft == false",
    pipeline.gh_job_id_of(&classify_job)
);
let ado_skip_cond =
    check_pr_changes::ado_condition(&pipeline.ado_job_id_of(&classify_job));

// 3. Gate expensive jobs on the classification
let vmm_tests = pipeline
    .new_job(...)
    .gh_dangerous_override_if(&gh_skip_cond)
    .ado_dangerous_override_if(&ado_skip_cond)
    .dep_on(...)
    .finish();

pipeline.non_artifact_dep(&vmm_tests, &classify_job);
```

```admonish note
`gh_dangerous_override_if` / `ado_dangerous_override_if` **replace** the
default job condition entirely. Always include the full guard (e.g. the
draft-PR check for GitHub, and `succeeded(), not(canceled())` for ADO)
alongside the classification guard.
```