
Conversation

@bkchr
Contributor

@bkchr bkchr commented Dec 22, 2025

It is better to use the already existing infrastructure instead of adding another way to install dependencies.

This pull request also ensures we are forwarding return values in `/cmd` spawned processes to e.g. fail on errors.

  • Does not require a CHANGELOG entry


@github-actions github-actions bot left a comment


Found CI reliability risks from moving solc/resolc installation into install-deps (path assumptions + loss of caching), plus a few correctness/safety issues in the cmd wrapper changes.


Suggestions that couldn't be attached to a specific line

.github/install-deps.sh:3,22-45

`set -eu` at [Line 3] makes the script sensitive to previously-optional environment variables later in the script (e.g., `$AGENT_TOOLSDIRECTORY` is used further down). If this script is ever run outside GitHub-hosted runners (or if that variable changes), it will hard-fail. Consider guarding destructive deletes with defaults/checks, e.g. `if [[ -n "${AGENT_TOOLSDIRECTORY:-}" ]]; then rm -rf "$AGENT_TOOLSDIRECTORY"; fi`, or avoid `-u` here if portability is desired.


.github/install-deps.sh:22-23,33-34,44-45

solc/resolc are installed into relative paths (`mkdir -p solc`, `mkdir -p resolc`) and PATH is updated via `$PWD` ([Line 22-23], [Line 33-34], [Line 44-45]). This is brittle if the workflow runs this script with a different working directory (or via `working-directory:`). Install into a stable location (e.g. `${GITHUB_WORKSPACE}` or `${RUNNER_TEMP}`) and append that absolute path to `GITHUB_PATH`.


.github/install-deps.sh:26-29,37-40

Downloads of solc/resolc are not integrity-checked ([Line 26-29], [Line 37-40]). This is a supply-chain risk, since the job executes the downloaded binaries. Consider validating an expected SHA256 (pinned per version) before `chmod +x`, or downloading from a provenance-verified source if available.
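The pinned-digest idea can be sketched as follows (Python for illustration only; in the actual shell script this would more naturally be `sha256sum -c` against a hard-coded digest per release). The blob and digest here are hypothetical stand-ins:

```python
import hashlib

def verify_sha256(blob: bytes, expected_hex: str) -> bool:
    """Return True only when the downloaded bytes match the pinned digest."""
    return hashlib.sha256(blob).hexdigest() == expected_hex

# Hypothetical: in install-deps.sh the pinned digest would be hard-coded
# per solc/resolc release and checked before chmod +x.
blob = b"downloaded solc binary contents"
pinned = hashlib.sha256(blob).hexdigest()
print(verify_sha256(blob, pinned))         # digest matches
print(verify_sha256(b"tampered", pinned))  # digest mismatch
```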


.github/actions/get-solc/action.yml:15-21 removed

The deleted composite action previously cached solc and resolc ([Line 15-21 removed]). With the move to `.github/install-deps.sh`, the caching behavior appears to have been removed, which can significantly increase CI time and adds a new failure mode (network flakiness) to every job. If the intent is parity with the prior approach, add an equivalent cache step where `install-deps.sh` is used, or incorporate caching into the shared infra you're consolidating on.


.github/workflows/clippy.yml:40-41 removed

The Install Solc step was removed ([Line 40-41 removed]) without any visible replacement in this diff. If clippy builds (directly or via build scripts) require solc/resolc, this job will start failing. Ensure this workflow invokes `.github/install-deps.sh` (or otherwise provisions solc/resolc) before running `cargo clippy`.


.github/workflows/test.yml:100-101 removed

The Install Solc step was removed ([Line 100-101 removed]) without any visible replacement in this diff. If any tests/build steps rely on solc/resolc, this will cause runtime failures. Ensure `install-deps.sh` is executed in this job before tests that require these tools.

Comment on lines 99 to 102
status = os.system(f"cargo build -p {runtime['package']} --profile {profile} -q --features {features}")
if status != 0:
    print(f"Failed to build {runtime['name']}")
    sys.exit(1)


`os.system(...)` return value handling is not quite correct: it returns a wait status, not the raw exit code. Checking `status != 0` works for failure detection, but you lose the real exit code and can misreport signal terminations. Prefer `subprocess.run(..., check=True)` (already imported) or decode via `os.WEXITSTATUS(status)` when printing/debugging ([Line 99-102]).
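A minimal sketch of the difference, using a `python -c` one-liner as a stand-in for the cargo invocation (`os.waitstatus_to_exitcode` requires Python 3.9+; on non-POSIX platforms `os.system` already returns the exit code directly):

```python
import os
import subprocess
import sys

# On POSIX, os.system() returns a wait status (exit code 3 -> status 768),
# so the raw value must be decoded before it is reported anywhere.
status = os.system(f'"{sys.executable}" -c "import sys; sys.exit(3)"')
exit_code = os.waitstatus_to_exitcode(status) if os.name == "posix" else status
print(exit_code)  # 3, whereas the raw POSIX status is 3 << 8 == 768

# Preferred: subprocess.run raises on failure and carries the real exit code.
try:
    subprocess.run([sys.executable, "-c", "import sys; sys.exit(3)"], check=True)
    rc = 0
except subprocess.CalledProcessError as err:
    rc = err.returncode
print(rc)  # 3
```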

Comment on lines 105 to 107
result = subprocess.run(
    f"frame-omni-bencher v1 benchmark pallet --no-csv-header --all --list --runtime={wasm_file}",
    shell=True, capture_output=True, text=True)


`subprocess.run(..., shell=True, ...)` at [Line 105-107] is unnecessary and increases injection risk. Even if inputs come from a checked-in JSON, this still expands shell metacharacters and can break on spaces. Prefer passing argv as a list with `shell=False`, e.g. `subprocess.run(["frame-omni-bencher", "v1", "benchmark", "pallet", "--no-csv-header", "--all", "--list", f"--runtime={wasm_file}"])`.
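The difference can be demonstrated with a stand-in command (a `python -c` one-liner instead of `frame-omni-bencher`, which is assumed not to be installed here): with an argv list, a runtime path containing spaces or shell metacharacters arrives as one intact argument instead of being re-split or re-interpreted by a shell.

```python
import subprocess
import sys

# A hostile-looking path: with shell=True this would be split on the space
# and the "; echo injected" part would run as a separate shell command.
wasm_file = "kusama runtime.wasm; echo injected"

# Argv-list form: each element is passed verbatim, no shell involved.
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1])", f"--runtime={wasm_file}"],
    capture_output=True, text=True)
print(result.stdout.strip())  # --runtime=kusama runtime.wasm; echo injected
```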

    print('❌ Failed to format code')
    sys.exit(1)
command = "taplo format --config .config/taplo.toml"
print(f'Formatting toml files with with `{command}`')


Typo in user-facing output: `Formatting toml files with with ...` ([Line 206]). Fix to a single `with` to avoid noisy CI logs.

Member

@xlc xlc left a comment


I still don't fully understand why solc is needed for runtimes?

@sigurpol
Contributor

I still don't fully understand why solc is needed for runtimes?

AFAIK (~0), it is only needed for benchmarks/tests, not for production builds. This PR aims to fix a couple of issues found while running benchmarks; see https://github.com/polkadot-fellows/runtimes/actions/runs/20395561412/job/58610645713

  1. compilation fails due to missing solc compiler in the logs above
  2. ...but despite the epic fail, we return success 😓

@pandres95
Member

I still don't fully understand why solc is needed for runtimes?

Because of some revive dependencies that stubbornly depend upon it.

@athei do you think there might be a way to work around those dependencies, such that end users don't really need to know about this? Or is it something that we might see on stable2512?

@athei
Contributor

athei commented Dec 24, 2025

It is required to compile test and benchmarking fixtures. Hence the dependency is required if you either run tests or enable `feature = runtime-benchmarks`. I wouldn't call this stubborn. We minimized the impact as much as possible. What is your suggestion? Checking in binaries?

Tests are not really relevant for downstream users. So only the benchmarking fixtures are problematic here. I believe it is only a handful of them. So maybe we should actually check in the binaries?

@bkchr bkchr merged commit fb8fc02 into main Dec 24, 2025
53 of 69 checks passed
@bkchr bkchr deleted the bkchr-use-install-deps branch December 24, 2025 08:54
@pandres95
Member

pandres95 commented Jan 3, 2026

It is required to compile test and benchmarking fixtures. Hence the dependency is required if you either supply test or feature = runtime-benchmarks. I wouldn't call this stubborn. We minimized the impact as much as possible. What is your suggestion? Checking in binaries?

Yeah! I was thinking of how things were a few months ago, because yes… there were several internal dependencies on revive fixtures that you needed just to build. Glad to know things have changed quite a bit. ❤️

Tests are not really relevant for downstream users. So only the benchmarking fixtures is what is problematic here. I believe it is only a handful of them. So maybe we should actually check in the binaries?

That might be a solution, considering it's only a handful of benchmarking tests. Or maybe allowing these binaries to be set up via the benchmarking setup. 🤔 But it's definitely just an idea, because it would probably be too intricate just to remove a dependency that is not used in any workflow other than benchmarking.
