[feat] Integration e2e test tools #1694
base: develop
Conversation
Walkthrough

A suite of new files is introduced to support integration and end-to-end testing for a blockchain rollup system. This includes Go utilities for importing and organizing rollup data into a database, a command-line tool with configuration management, and supporting scripts, configuration files, and Docker Compose setup for local database management and data preparation. Additionally, minor updates were made to the zkvm-prover build and release scripts.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant CLI as CLI Tool (main.go)
    participant Config as Config Loader
    participant Import as Import Logic (imports.go)
    participant DB as Database
    User->>CLI: Run with args & config
    CLI->>Config: Load config.json
    Config-->>CLI: DB config
    CLI->>Import: importData(beginBlk, endBlk, ...)
    Import->>DB: Connect and partition blocks
    Import->>DB: Import chunks, batches, bundles
    Import-->>CLI: importRecord (hashes)
    CLI->>User: Write output JSON
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~18 minutes
Actionable comments posted: 3
🧹 Nitpick comments (4)
tests/prover-e2e/config.json (1)

Lines 1-8: Consider environment variable usage for database credentials.

The hardcoded credentials are acceptable for local testing, but consider using environment variables for better flexibility and security practices, even in test environments.
```diff
 {
   "db_config": {
     "driver_name": "postgres",
-    "dsn": "postgres://dev:dev@localhost:5432/scroll?sslmode=disable",
+    "dsn": "postgres://${DB_USER:-dev}:${DB_PASSWORD:-dev}@${DB_HOST:-localhost}:${DB_PORT:-5432}/${DB_NAME:-scroll}?sslmode=disable",
     "maxOpenNum": 5,
     "maxIdleNum": 1
   }
 }
```
crates/libzkp_c/src/lib.rs (2)

Lines 74-88: Consider safety and portability improvements for debug dumping.

While the debug dumping functionality is useful, there are some concerns:

- Unsafe C string conversion: line 82 uses unsafe code to convert the raw pointer without validation.
- Hardcoded paths: `/tmp/` is not portable across all platforms (Windows doesn't have `/tmp/`).
- Potential race conditions: multiple processes could generate the same timestamp.
Consider these improvements:

```diff
-    let filename = format!("/tmp/proof_{}.json", timestamp);
+    let temp_dir = std::env::temp_dir();
+    let filename = temp_dir.join(format!("proof_{}.json", timestamp));
     let cstr = unsafe { std::ffi::CStr::from_ptr(proof_str) };
     if let Err(e) = std::fs::write(&filename, cstr.to_bytes()) {
-        eprintln!("Failed to write proof to file {}: {}", filename, e);
+        eprintln!("Failed to write proof to file {}: {}", filename.display(), e);
     } else {
-        println!("Dumped failed proof to {}", filename);
+        println!("Dumped failed proof to {}", filename.display());
     }
```
Lines 198-212: Apply consistent portability improvements for task dumping.

Same portability and safety concerns as the proof dumping functionality. Apply similar improvements for consistent cross-platform compatibility:

```diff
-    let filename = format!("/tmp/task_{}_{}.json", c_str.to_str().unwrap(), timestamp);
+    let temp_dir = std::env::temp_dir();
+    let filename = temp_dir.join(format!("task_{}_{}.json", c_str.to_str().unwrap_or("unknown"), timestamp));
     if let Err(e) = std::fs::write(&filename, task_json.as_bytes()) {
-        eprintln!("Failed to write task to file {}: {}", filename, e);
+        eprintln!("Failed to write task to file {}: {}", filename.display(), e);
     } else {
-        println!("Dumped failed task to {}", filename);
+        println!("Dumped failed task to {}", filename.display());
     }
```

tests/prover-e2e/docker-compose.yml (1)
Lines 1-22: Fix minor formatting issue.

The Docker Compose configuration is well-structured for local development with appropriate security considerations (localhost-only binding). However, the file is missing a newline at its end. Add one after the last line:

```diff
     restart: unless-stopped
+
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)

- `Cargo.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (16)

- `Cargo.toml` (1 hunks)
- `coordinator/Makefile` (1 hunks)
- `coordinator/internal/logic/provertask/batch_prover_task.go` (1 hunks)
- `coordinator/internal/logic/provertask/bundle_prover_task.go` (3 hunks)
- `coordinator/internal/logic/provertask/chunk_prover_task.go` (2 hunks)
- `crates/l2geth/src/rpc_client.rs` (1 hunks)
- `crates/libzkp/src/lib.rs` (3 hunks)
- `crates/libzkp_c/src/lib.rs` (3 hunks)
- `crates/prover-bin/src/main.rs` (3 hunks)
- `rollup/tests/integration_tool/imports.go` (1 hunks)
- `rollup/tests/integration_tool/main.go` (1 hunks)
- `tests/prover-e2e/.env` (1 hunks)
- `tests/prover-e2e/.gitignore` (1 hunks)
- `tests/prover-e2e/Makefile` (1 hunks)
- `tests/prover-e2e/config.json` (1 hunks)
- `tests/prover-e2e/docker-compose.yml` (1 hunks)
🧰 Additional context used
🧠 Learnings (8)

Learnt from: colinlyguo
PR: scroll-tech/scroll#1530
File: tests/integration-test/go.mod:21-21
Timestamp: 2024-10-20T15:30:18.084Z
Learning: In the `tests/integration-test` folder, updates to import statements or dependencies are not necessary since it's only used for testing purposes.
(Applied to: tests/prover-e2e/.gitignore, tests/prover-e2e/.env, tests/prover-e2e/docker-compose.yml, tests/prover-e2e/config.json, tests/prover-e2e/Makefile, rollup/tests/integration_tool/imports.go)

Learnt from: colinlyguo
PR: scroll-tech/scroll#1645
File: rollup/proposer-tool-config.json:34-40
Timestamp: 2025-04-15T08:52:44.176Z
Learning: In configuration files like `rollup/proposer-tool-config.json`, placeholders such as `<mainnet read db config>` are intentionally left as-is to be replaced by users with their own configuration values when deploying the tool.
(Applied to: tests/prover-e2e/.env, tests/prover-e2e/config.json)

Learnt from: colinlyguo
PR: scroll-tech/scroll#1530
File: rollup/internal/controller/watcher/batch_proposer.go:291-294
Timestamp: 2024-10-20T16:13:20.397Z
Learning: In `batch_proposer.go`, it's acceptable to call `utils.CalculateBatchMetrics` multiple times within the loop because the batch's chunks are increasing in the loop, and each calculation reflects the updated batch state.
(Applied to: coordinator/internal/logic/provertask/batch_prover_task.go, coordinator/internal/logic/provertask/bundle_prover_task.go)
🧬 Code Graph Analysis (4)

crates/libzkp_c/src/lib.rs (2)
- crates/libzkp_c/src/utils.rs: `c_char_to_vec` (8-11)
- crates/libzkp/src/lib.rs: `verify_proof` (119-124)

coordinator/internal/logic/provertask/batch_prover_task.go (1)
- coordinator/internal/logic/provertask/prover_task.go: `ErrCoordinatorInternalFailure` (26-26)

coordinator/internal/logic/provertask/chunk_prover_task.go (1)
- coordinator/internal/logic/provertask/prover_task.go: `ErrCoordinatorInternalFailure` (26-26)

rollup/tests/integration_tool/main.go (4)
- database/config.go: `DBConfig` (12-19)
- common/version/version.go: `Version` (31-31)
- common/utils/flags.go: `CommonFlags` (9-20), `ConfigFileFlag` (30-34)
- common/utils/logger.go: `LogSetup` (15-42)
🪛 checkmake (0.2.2)
coordinator/Makefile
[warning] 1-1: Missing required phony target "all"
(minphony)
[warning] 1-1: Missing required phony target "test"
(minphony)
tests/prover-e2e/Makefile
[warning] 1-1: Missing required phony target "all"
(minphony)
[warning] 1-1: Missing required phony target "test"
(minphony)
[warning] 7-7: Target "all" should be declared PHONY.
(phonydeclared)
🪛 Checkov (3.2.334)
tests/prover-e2e/docker-compose.yml
[MEDIUM] 6-7: Basic Auth Credentials
(CKV_SECRET_4)
tests/prover-e2e/config.json
[MEDIUM] 4-5: Basic Auth Credentials
(CKV_SECRET_4)
🪛 YAMLlint (1.37.1)
tests/prover-e2e/docker-compose.yml
[error] 22-22: no new line character at the end of file
(new-line-at-end-of-file)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: tests
- GitHub Check: tests
- GitHub Check: check
- GitHub Check: tests
- GitHub Check: check
🔇 Additional comments (20)
crates/l2geth/src/rpc_client.rs (1)
Line 111: Enhanced error message improves debugging capabilities.

The inclusion of the block hash in the error message provides better traceability when block fetching fails, which is particularly valuable for the e2e testing infrastructure being introduced.
Cargo.toml (1)

Lines 20-22: Git revision hashes verified.

The commit `0f62338` was confirmed to exist in the scroll-tech/zkvm-prover repository (refs/heads/master), so the updated dependency revisions are valid. Approving the Cargo.toml changes.

coordinator/Makefile (1)
Line 1: Correctly adds libzkp as a phony target.

The addition of `libzkp` to the `.PHONY` target list is appropriate as it ensures the target is always executed regardless of file existence. This aligns with the build orchestration needs for the e2e testing infrastructure.

tests/prover-e2e/.env (1)
Lines 1-4: Environment configuration is consistent with test setup.

The `.env` file provides appropriate configuration for the Goose migration tool in the e2e testing environment. The settings are consistent with the `config.json` file and suitable for local testing.

coordinator/internal/logic/provertask/batch_prover_task.go (1)
Lines 87-95: LGTM! Consistent task assignment pattern implemented.

The direct task retrieval by TaskID is well-implemented with proper error handling. The logic correctly distinguishes between database errors (returning `ErrCoordinatorInternalFailure`) and missing tasks (returning a descriptive error message).

crates/libzkp_c/src/lib.rs (1)
Lines 12-20: LGTM! Environment-based debug control implemented correctly.

The `enable_dump()` function properly uses `OnceLock` for thread-safe lazy initialization and checks appropriate environment variables.

coordinator/internal/logic/provertask/chunk_prover_task.go (2)
Lines 83-91: LGTM! Consistent task assignment pattern maintained.

The direct task retrieval logic matches the implementation in `BatchProverTask`, ensuring consistency across different prover task types.

Line 234: Error formatting change noted.

The change from `%w` to `%v` removes error wrapping for `dbErr`. This is likely intentional to simplify the error chain, but ensure this aligns with your error handling strategy.

crates/prover-bin/src/main.rs (3)
Lines 42-46: LGTM! Clear command structure defined.

The new `Handle` command is well-defined with appropriate documentation.

Lines 48-53: LGTM! Simple and effective data structure.

The `HandleSet` struct provides a clean interface for deserializing task collections from JSON.

Lines 77-110: LGTM! Well-structured sequential task processing.

The implementation properly:
- Handles file I/O with appropriate error propagation
- Uses `Arc` for safe prover sharing across async operations
- Processes task sets sequentially with clear progress indicators
- Uses assertions to validate each step's success

coordinator/internal/logic/provertask/bundle_prover_task.go (2)
Lines 85-93: LGTM! Good addition for specific task retrieval.

The implementation correctly handles fetching a bundle by its specific hash when a TaskID is provided, with appropriate error handling for both database errors and missing tasks.

Lines 246-254: Good optimization for genesis batch handling.

The conditional check prevents unnecessary database queries for parent batches when dealing with the first batch (index 0 or 1), correctly leaving `prevStateRoot` as zero hash.

rollup/tests/integration_tool/main.go (2)
Lines 49-85: Well-implemented validation function.

The `parseThreeIntegers` function correctly validates the input format, ensures positive values, and enforces descending order as required.
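The three-part validation praised here can be sketched as follows. This is an illustrative reimplementation, not the PR's code: the comma separator and the strictness of the descending check are assumptions.

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseThreeIntegers is a hypothetical sketch of the validation described
// in the review: split the argument into exactly three fields, require each
// to be a positive integer, and require strictly descending order. The
// comma separator is an assumption; the PR may use a different delimiter.
func parseThreeIntegers(s string) ([3]int, error) {
	parts := strings.Split(s, ",")
	if len(parts) != 3 {
		return [3]int{}, fmt.Errorf("expected 3 integers, got %d fields", len(parts))
	}
	var out [3]int
	for i, p := range parts {
		n, err := strconv.Atoi(strings.TrimSpace(p))
		if err != nil || n <= 0 {
			return [3]int{}, fmt.Errorf("field %d (%q) is not a positive integer", i, p)
		}
		out[i] = n
	}
	if !(out[0] > out[1] && out[1] > out[2]) {
		return [3]int{}, fmt.Errorf("values must be strictly descending: %v", out)
	}
	return out, nil
}

func main() {
	v, err := parseThreeIntegers("30,10,3")
	fmt.Println(v, err)
}
```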
Lines 131-190: Well-structured main action implementation.

The action function properly validates arguments, handles codec version selection, generates random seeds when needed, and correctly marshals the output to JSON with proper error handling throughout.
tests/prover-e2e/Makefile (1)
Lines 12-29: Robust PostgreSQL setup with proper timeout handling.

The database setup correctly waits for PostgreSQL to be ready with a reasonable timeout, and applies migrations in the correct order.
crates/libzkp/src/lib.rs (1)
Lines 51-72: Good improvement for fork name handling.

The normalization of fork names to lowercase and conversion to the `ForkName` enum ensures consistent handling across all task types. The assertion provides a safety check that the normalized name matches the expected input.

rollup/tests/integration_tool/imports.go (3)
Lines 26-40: Correct implementation of random selection algorithm.

The `randomPickKfromN` function correctly implements random selection using a Fisher-Yates shuffle, returning k-1 sorted indices from the range [1, n-1].
Lines 42-117: Well-structured data import implementation.

The import logic correctly:
- Uses seeded randomization for reproducible test data generation
- Maintains proper hierarchical relationships between chunks, batches, and bundles
- Updates cross-references in the database after each level of import
- Handles errors appropriately at each stage
Lines 119-213: Solid implementation of import helper functions.

Each import function correctly:
- Handles database operations with proper error checking
- Maintains referential integrity between entities
- Updates related records after insertion
- Provides informative logging for debugging
Codecov Report

❌ Patch coverage is

Additional details and impacted files:

```
@@            Coverage Diff             @@
##           develop    #1694    +/-   ##
===========================================
- Coverage    39.59%   39.03%   -0.57%
===========================================
  Files          237      239       +2
  Lines        19062    19336     +274
===========================================
  Hits          7547     7547
- Misses       10771    11045     +274
  Partials       744      744
```

Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
Actionable comments posted: 2
🧹 Nitpick comments (3)
crates/libzkp/src/tasks/batch.rs (1)
Lines 135-162: Good addition of V8 support and fix for V7 padding.

The implementation correctly:
- Adds fork name validation using `assert_eq!`
- Fixes the missing padding logic for V7
- Implements V8 support following the same pattern

Consider extracting the duplicated padding logic into a helper function:
```diff
+fn pad_blob_bytes(blob_bytes: &[u8]) -> Vec<u8> {
+    let mut padded_blob_bytes = blob_bytes.to_vec();
+    padded_blob_bytes.resize(N_BLOB_BYTES, 0);
+    padded_blob_bytes
+}

 BatchHeaderV::V7(_) => {
     assert_eq!(
         fork_name, ForkName::EuclidV2,
         "hardfork mismatch for da-codec@v7 header: found={fork_name:?}, expected={:?}",
         ForkName::EuclidV2,
     );
-    let padded_blob_bytes = {
-        let mut padded_blob_bytes = self.blob_bytes.to_vec();
-        padded_blob_bytes.resize(N_BLOB_BYTES, 0);
-        padded_blob_bytes
-    };
+    let padded_blob_bytes = pad_blob_bytes(&self.blob_bytes);
     <EnvelopeV7 as Envelope>::from_slice(padded_blob_bytes.as_slice())
         .challenge_digest(versioned_hash)
 }
 BatchHeaderV::V8(_) => {
     assert_eq!(
         fork_name, ForkName::Feynman,
         "hardfork mismatch for da-codec@v8 header: found={fork_name:?}, expected={:?}",
         ForkName::Feynman,
     );
-    let padded_blob_bytes = {
-        let mut padded_blob_bytes = self.blob_bytes.to_vec();
-        padded_blob_bytes.resize(N_BLOB_BYTES, 0);
-        padded_blob_bytes
-    };
+    let padded_blob_bytes = pad_blob_bytes(&self.blob_bytes);
     <EnvelopeV8 as Envelope>::from_slice(padded_blob_bytes.as_slice())
         .challenge_digest(versioned_hash)
 }
```

tests/prover-e2e/Makefile (2)
Line 1: Add `import_data` to the `.PHONY` list.

The target never produces a file named `import_data`, so it should be marked PHONY to avoid accidental caching.

```diff
-.PHONY: clean setup_db test_tool all
+.PHONY: clean setup_db test_tool import_data all
```
Lines 15-27: Consider replacing the handcrafted `nc` polling loop with Docker health-checks (or `docker compose wait`).

Manually looping with `nc` works, but:
- Requires `nc` to be present on the host.
- Adds 60 s of fixed latency even when Postgres is up quickly.
- Duplicates functionality already available via container health-checks.

A more robust pattern:

```yaml
# docker-compose.yml (excerpt)
services:
  db:
    image: postgres:16
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U $$POSTGRES_USER"]
      interval: 2s
      retries: 30
```

Then in the Makefile:

```sh
docker compose up --detach
docker compose wait db
```

This removes the dependency on `nc` and leverages Docker's native retry logic.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- `crates/libzkp/src/lib.rs` (2 hunks)
- `crates/libzkp/src/tasks.rs` (2 hunks)
- `crates/libzkp/src/tasks/batch.rs` (4 hunks)
- `tests/prover-e2e/Makefile` (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- crates/libzkp/src/lib.rs
🧰 Additional context used
🧬 Code Graph Analysis (1)
crates/libzkp/src/tasks.rs (1)
- crates/libzkp/src/utils.rs: `panic_catch` (43-53)
🪛 checkmake (0.2.2)
tests/prover-e2e/Makefile
[warning] 1-1: Missing required phony target "test"
(minphony)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: tests
- GitHub Check: check
- GitHub Check: tests
- GitHub Check: tests
- GitHub Check: check
🔇 Additional comments (4)
crates/libzkp/src/tasks.rs (2)
Lines 12-15: LGTM: Clean import refactoring.

The addition of the `utils::panic_catch` import supports the standardization of panic handling across the codebase, improving maintainability.

Lines 26-33: LGTM: Improved panic handling with centralized utility.

The refactoring from manual `std::panic::catch_unwind` usage to the `panic_catch` utility function is a good improvement that:
- Eliminates boilerplate panic payload extraction code
- Standardizes panic handling across the codebase
- Maintains the same functionality while improving readability

The error mapping from `String` to `eyre::Error` is correct and preserves the original behavior.

crates/libzkp/src/tasks/batch.rs (1)
Lines 25-29: Verify the ordering requirement for V8 variant in untagged enum.

The existing comment mentions that V6 must be placed above V7 due to untagged enum deserialization order. Please confirm if there are any ordering constraints for the V8 variant relative to V6 and V7 to ensure correct deserialization.
tests/prover-e2e/Makefile (1)
Lines 28-29: Double-running migrations looks backwards – verify the intent.

`goose up` already applies all pending migrations; immediately issuing `GOOSE_MIGRATION_DIR=./ ${GOOSE_CMD} up-to 100` afterwards is either a no-op (if head ≥ 100) or can silently downgrade expectations if new migrations > 100 appear later.

Confirm the real goal:
- If you only need up to version 100, drop the first `goose up`.
- If you want the latest schema, drop the `up-to` line.
Actionable comments posted: 2
♻️ Duplicate comments (2)
tests/prover-e2e/Makefile (1)
Line 38: Build step still lacks pre-created directory – see previous review.

The earlier comment about ensuring `build/bin` exists before `go build` has not been addressed.

crates/libzkp/src/tasks/batch.rs (1)
Lines 38-57: Fix misleading error messages in header accessor methods.

The panic messages in the header accessor methods are incorrect and misleading. They should indicate what header type was expected, not a generic "try to pick other header type" message.
🧹 Nitpick comments (3)
zkvm-prover/Makefile (1)
Lines 44-45: Path coupling to tests directory is brittle.

`E2E_HANDLE_SET = ../tests/prover-e2e/testset.json` tightly couples the prover Makefile to a sibling tests folder. Moving either directory will silently break the rule.

Expose the handle set via a variable that callers can override:

```diff
-E2E_HANDLE_SET = ../tests/prover-e2e/testset.json
+E2E_HANDLE_SET ?= ../tests/prover-e2e/testset.json
```

This keeps the default while allowing `make test_e2e_run E2E_HANDLE_SET=/tmp/handles.json`. Also applies to: lines 75-76.
tests/prover-e2e/Makefile (1)

Lines 1-2: Declare all frequently-invoked targets as PHONY.

`import_data` and `import_data_euclid` are missing from the `.PHONY` list, and the static checker also expects a `test` phony target.

```diff
-.PHONY: clean setup_db test_tool all check_vars
+.PHONY: all clean setup_db test_tool import_data import_data_euclid check_vars test
```

crates/libzkp/src/tasks/batch.rs (1)
Lines 169-173: Consider refactoring duplicate fork name matching logic.

The fork name matching pattern appears twice in this method (lines 129-143 and 169-173). Consider extracting this into a helper method to reduce duplication and ensure consistency.

Example helper method:

```diff
+impl BatchHeaderV {
+    pub fn to_reference_header(&self, fork_name: ForkName) -> ReferenceHeader {
+        match fork_name {
+            ForkName::EuclidV1 => ReferenceHeader::V6(*self.must_v6_header()),
+            ForkName::EuclidV2 => ReferenceHeader::V7(*self.must_v7_header()),
+            ForkName::Feynman => ReferenceHeader::V8(*self.must_v8_header()),
+        }
+    }
+}
```

Then simplify the code:

```diff
-let reference_header = match fork_name {
-    ForkName::EuclidV1 => ReferenceHeader::V6(*self.batch_header.must_v6_header()),
-    ForkName::EuclidV2 => ReferenceHeader::V7(*self.batch_header.must_v7_header()),
-    ForkName::Feynman => ReferenceHeader::V8(*self.batch_header.must_v8_header()),
-};
+let reference_header = self.batch_header.to_reference_header(fork_name);
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- `crates/libzkp/src/tasks/batch.rs` (4 hunks)
- `tests/prover-e2e/Makefile` (1 hunks)
- `zkvm-prover/Makefile` (3 hunks)
🧰 Additional context used
🪛 checkmake (0.2.2)
tests/prover-e2e/Makefile
[warning] 1-1: Missing required phony target "test"
(minphony)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: tests
- GitHub Check: check
- GitHub Check: tests
- GitHub Check: check
- GitHub Check: tests
🔇 Additional comments (5)
zkvm-prover/Makefile (1)
Line 6: Confirm testset.json presence and circuit coverage.

I couldn't locate `tests/prover-e2e/testset.json` in the repo (the Makefile's `E2E_HANDLE_SET` still points to it), so please:
- Verify the correct path or filename for your end-to-end handle set (it may have been renamed to `config.json` or moved).
- Once the test set is found, ensure its `"type"` values are limited to `chunk`.
- If it includes `batch` or `bundle`, update `CIRCUIT_STUFF` in `zkvm-prover/Makefile` (line 6) to also list the corresponding `.vmexe` files.

tests/prover-e2e/Makefile (1)
Lines 18-34: `nc` dependency may not exist in minimal CI images.

`nc -z localhost 5432` is great locally but fails on Alpine-based Docker or busybox images without `netcat`. Consider switching to `pg_isready` (ships with Postgres) or add an explicit check/install step.

crates/libzkp/src/tasks/batch.rs (3)
Lines 129-143: Well-structured fork handling with clear error messages.

The fork-specific envelope handling is well-organized with explicit matches and informative error messages for unexpected fork names.

Lines 123-127: Correct blob padding for V7/V8 envelopes.

The padding logic appropriately resizes blob bytes to `N_BLOB_BYTES` only for V7/V8 envelopes, maintaining backward compatibility with V6.

Lines 25-28: Unable to locate `BatchHeaderV7` / `BatchHeaderV8` definitions.

I ran exhaustive searches for both structs and any alias in the repo but didn't find them. Please manually verify the intended design of your enum variant:
- Confirm where (and whether) `BatchHeaderV7` and `BatchHeaderV8` are defined.
- If they are distinct types, split your enum into separate `V7` and `V8` variants.
- If they're structurally identical, consider using a single struct with a version tag or a type alias for clarity.
The PR looks good. But I prefer splitting it into 2 PRs: one is the "core" changes for Feynman, the other is the testing tools.
Actionable comments posted: 2
🧹 Nitpick comments (5)
crates/libzkp/Cargo.toml (1)
Line 12: Disable default features to avoid unintended deps.

Unless you explicitly rely on the `sbv-primitives` default feature set, it's safer to turn it off when adding a custom feature list; otherwise you may silently pull in heavy or conflicting dependencies (e.g. `native-keccak`, which the comment on line 11 tries to avoid).

```diff
-sbv-primitives = {workspace = true, features = ["scroll-compress-ratio", "scroll"]}
+sbv-primitives = {workspace = true, default-features = false, features = ["scroll-compress-ratio", "scroll"]}
```

Please double-check that no other crate in the workspace depends on those default features before applying.
crates/prover-bin/src/prover.rs (1)
Lines 206-208: Good safety check, but consider improving error handling and cache management.

The validation to ensure the `vks` cache is empty before dumping verifier assets is a sound safety measure that prevents potential issues with stale or incorrect cached verification keys. However, consider these improvements:

- More actionable error message: the current message suggests "clean vks cache first" but doesn't provide guidance on how to do this.
- Cache management inconsistency: the method later calls `get_vk_and_cache()`, which may populate the cache, but there's no mechanism to clear it beforehand.

Consider this enhanced approach:

```diff
 if !config.vks.is_empty() {
-    eyre::bail!("clean vks cache first or we will have wrong dumped vk");
+    eyre::bail!(
+        "VKs cache is not empty. Clear the cache in the configuration before dumping verifier assets to ensure accuracy. Found cached keys: {:?}",
+        config.vks.keys().collect::<Vec<_>>()
+    );
 }
```

Additionally, consider providing a method to clear the cache or documenting how users should handle this scenario.
tests/prover-e2e/prepare/dump_block_records.sql (1)
Line 25: Consider making the block range configurable.

The hardcoded block range (16523677 to 16523700) limits the script's reusability for different test scenarios. Consider parameterizing these values or documenting why this specific range was chosen.

For better reusability, consider adding variables at the top:

```diff
+-- Configuration
+\set start_block 16523677
+\set end_block 16523700
+
 -- Create a file with INSERT statements for the specific records
```

Then update the WHERE clause:

```diff
-WHERE number >= 16523677 and number <= 16523700
+WHERE number >= :start_block and number <= :end_block
```

rollup/tests/integration_tool/imports.go (2)
Lines 27-41: Add documentation for the random partitioning logic.

The `randomPickKfromN` function implements a specific partitioning algorithm but lacks documentation. The logic generates k-1 separation points to divide n items into k groups, which is correct but not immediately obvious. Add a comment to clarify the function's purpose:

```diff
+// randomPickKfromN generates k-1 separation points to divide n items into k groups.
+// It returns the indices where to split, excluding 0 and n.
 func randomPickKfromN(n, k int, rng *rand.Rand) []int {
```
Lines 52-53: Remove nolint directive or specify the exact linter.

The `//nolint:all` directive is overly broad and suppresses all linter warnings. If there's a specific linter warning you want to suppress, specify it explicitly. Either remove the directive if it's not needed, or specify the exact linter:

```diff
-//nolint:all
+//nolint:gosec // G404: Use of weak random generator is acceptable for test data
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
Cargo.lock
is excluded by!**/*.lock
📒 Files selected for processing (6)
Cargo.toml
(2 hunks)crates/libzkp/Cargo.toml
(1 hunks)crates/prover-bin/src/prover.rs
(1 hunks)rollup/tests/integration_tool/imports.go
(1 hunks)tests/prover-e2e/prepare/dump_block_records.sql
(1 hunks)zkvm-prover/download-release.sh
(1 hunks)
✅ Files skipped from review due to trivial changes (1)
- zkvm-prover/download-release.sh
🚧 Files skipped from review as they are similar to previous changes (1)
- Cargo.toml
🧰 Additional context used
🧠 Learnings (1)
rollup/tests/integration_tool/imports.go (1)
Learnt from: colinlyguo
PR: scroll-tech/scroll#1530
File: tests/integration-test/go.mod:21-21
Timestamp: 2024-10-20T15:30:18.084Z
Learning: In the `tests/integration-test` folder, updates to import statements or dependencies are not necessary since it's only used for testing purposes.
🔇 Additional comments (5)
rollup/tests/integration_tool/imports.go (5)
121-153
: LGTM! Well-structured chunk import logic. The `importChunk` function properly handles database operations, error checking, and logging. The logic for fetching blocks, computing message queue hashes, and updating chunk metadata is correct.
155-199
: LGTM! Robust batch import implementation. The `importBatch` function correctly handles edge cases like missing previous batches, aggregates chunks appropriately, and maintains proper linkage between batches. The error handling and logging are well implemented.
201-215
: LGTM! Clean bundle import implementation. The `importBundle` function is straightforward and correctly handles the bundle creation and database updates with proper error handling.
45-48
: Global `cfg` is properly declared and initialized
The `cfg` variable is declared at line 22 in `rollup/tests/integration_tool/main.go` and initialized at line 122, so it's available for use in `imports.go`. No further action needed.
143-146
: Resolved: `codecCfg` initialization confirmed
The global variable `codecCfg` is declared and initialized in `rollup/tests/integration_tool/main.go` (line 23) as `encoding.CodecV8`. No further action is required.
last, err = batchOrm.GetLatestBatch(ctx)
if err != nil && !strings.Contains(err.Error(), "record not found") {
	return nil, err
} else if last != nil {
	log.Info("start from last batch", "index", last.Index)
}
}
🛠️ Refactor suggestion
Improve error message specificity.
The error handling uses strings.Contains(err.Error(), "record not found")
which is fragile and depends on the exact error message format. Consider using error type checking or a more robust error handling approach.
Consider using GORM's error checking instead:
- if err != nil && !strings.Contains(err.Error(), "record not found") {
+ if err != nil && !errors.Is(err, gorm.ErrRecordNotFound) {
Don't forget to import the errors package:
import (
"context"
+ "errors"
"math/rand"
🤖 Prompt for AI Agents
In rollup/tests/integration_tool/imports.go around lines 160 to 166, the error
handling checks for "record not found" by inspecting the error message string,
which is fragile. Replace this with a robust error type check using GORM's
errors package, specifically by importing "errors" and using errors.Is(err,
gorm.ErrRecordNotFound) to detect the not found error. Adjust the condition to
handle errors accordingly without relying on string matching.
Actionable comments posted: 1
🧹 Nitpick comments (1)
coordinator/internal/logic/verifier/verifier.go (1)
153-161
: Validation logic is correct but could be optimized. The validation logic correctly prevents loading blocked verification keys. However, the current implementation has a minor inefficiency:
Consider optimizing the validation by:
- Pre-splitting the blocked_vks string into a slice or map for faster lookups
- Using a more structured approach for the blocked VKs
+var blockedVKs = map[string]bool{
+	"rSJNNBpsxBdKlstbIIU/aYc7bHau98Qb2yjZMc5PmDhmGOolp5kYRbvF/VcWcO5HN5ujGs6S00W8pZcCoNQRLQ==": true,
+	"2Lo7Cebm6SFtcsYXipkcMxIBmVY7UpoMXik/Msm7t2nyvi9EaNGsSnDnaCurscYEF+IcdjPUtVtY9EcD7IKwWg==": true,
+	"D6YFHwTLZF/U2zpYJPQ3LwJZRm85yA5Vq2iFBqd3Mk4iwOUpS8sbOp3vg2+NDxhhKphgYpuUlykpdsoRhEt+cw==": true,
+}
-if strings.Contains(blocked_vks, dump.Chunk) {
+if blockedVKs[dump.Chunk] {
This would provide O(1) lookup time instead of O(n) string search, though the current approach is acceptable for the small number of blocked VKs.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
coordinator/internal/logic/verifier/verifier.go
(3 hunks)rollup/tests/integration_tool/imports.go
(1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- rollup/tests/integration_tool/imports.go
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: tests
- GitHub Check: tests
- GitHub Check: check
- GitHub Check: check
- GitHub Check: tests
🔇 Additional comments (1)
coordinator/internal/logic/verifier/verifier.go (1)
8-8
: Import additions look good. The new imports for `fmt` and `strings` are necessary to support the validation logic added below. Also applies to: 13-13
Actionable comments posted: 2
♻️ Duplicate comments (1)
zkvm-prover/Makefile (1)
59-61
: Auto-download regression – identical to previously flagged issue
The rule still hard-fails and forces manual artifact download, breaking
automation pipelines. See prior review for a drop-in fix that keeps the hard
failure only when the scripted download fails.
🧹 Nitpick comments (1)
zkvm-prover/print_plonky3gpu_version.sh (1)
7-11
: Quote the variable in the `-z` test & silence `pushd` noise
`[ -z $plonky3_gpu_path ]` breaks when the path contains spaces and triggers unbound-var errors under `set -u`.
Additionally, relying on `pushd`'s stdout makes downstream parsing (`sed -n '2p'`) fragile.
-if [ -z $plonky3_gpu_path ]; then
+if [ -z "$plonky3_gpu_path" ]; then
...
-    pushd $plonky3_gpu_path
+    pushd "$plonky3_gpu_path" > /dev/null
...
-    popd
+    popd > /dev/null
With `pushd`/`popd` silenced the script prints one line, allowing `print_plonky3gpu_version.sh` callers to drop the brittle `sed -n '2p'`.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (2)
Cargo.lock
is excluded by!**/*.lock
crates/gpu_override/Cargo.lock
is excluded by!**/*.lock
📒 Files selected for processing (3)
zkvm-prover/Makefile
(4 hunks)zkvm-prover/print_plonky3gpu_version.sh
(1 hunks)zkvm-prover/release-verifier-stuff.sh
(1 hunks)
🧰 Additional context used
🪛 Shellcheck (0.10.0)
zkvm-prover/release-verifier-stuff.sh
[warning] 4-4: Quote this to prevent word splitting.
(SC2046)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: tests
- GitHub Check: check
- GitHub Check: check
- GitHub Check: tests
- GitHub Check: tests
🔇 Additional comments (1)
zkvm-prover/Makefile (1)
25-25
: `sed -n '2p'` brittle after script cleanup
Once `pushd`/`popd` output is suppressed (see previous comment), the commit hash will be on line 1, breaking this extraction.
-PLONKY3_GPU_VERSION=$(shell ./print_plonky3gpu_version.sh | sed -n '2p')
+PLONKY3_GPU_VERSION=$(shell ./print_plonky3gpu_version.sh)
Please adjust after updating the helper script.
SCROLL_ZKVM_STUFFDIR ?= `realpath .work`
SCROLL_ZKVM_VERSION ?= 0.5.0rc1
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
`?=` is not valid Bash syntax – script will abort immediately
The `?=` operator belongs to Makefiles, not POSIX shells. When this file is executed the two lines are parsed as commands named `SCROLL_ZKVM_STUFFDIR` / `SCROLL_ZKVM_VERSION`, producing a `command not found` error and stopping the upload workflow.
-SCROLL_ZKVM_STUFFDIR ?= `realpath .work`
-SCROLL_ZKVM_VERSION ?= 0.5.0rc1
+# default vars – callers can still override in env
+: "${SCROLL_ZKVM_STUFFDIR:=$(realpath .work)}"
+: "${SCROLL_ZKVM_VERSION:=0.5.0rc1}"
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
SCROLL_ZKVM_STUFFDIR ?= `realpath .work`
SCROLL_ZKVM_VERSION ?= 0.5.0rc1
# default vars – callers can still override in env
: "${SCROLL_ZKVM_STUFFDIR:=$(realpath .work)}"
: "${SCROLL_ZKVM_VERSION:=0.5.0rc1}"
🧰 Tools
🪛 Shellcheck (0.10.0)
[warning] 4-4: Quote this to prevent word splitting.
(SC2046)
🤖 Prompt for AI Agents
In zkvm-prover/release-verifier-stuff.sh at lines 4-5, the use of '?=' for
variable assignment is invalid in Bash and causes the script to fail. Replace
'?=' with a proper Bash conditional assignment pattern, such as using parameter
expansion with default values or an if statement to assign the variables only if
they are not already set. Also, replace backticks with $(...) for command
substitution to improve readability and compatibility.
for stuff_file in "${STUFF_FILES[@]}"; do
    SRC="${SCROLL_ZKVM_STUFFDIR}/${stuff_file}"
    TARGET="${DIR_OUTPUT}/${stuff_file}"
    aws --profile default s3 cp $SRC s3://circuit-release/scroll-zkvm/$TARGET
done
🛠️ Refactor suggestion
Missing safety flags / quoting may lead to silent data-loss
Unquoted variables and lack of set -euo pipefail
can cause:
- word-splitting/globbing if
$SRC
or$TARGET
ever contains spaces - silent continuation when
aws cp
fails (e.g. missing file, wrong creds)
-#!/bin/bash
+#!/bin/bash
+set -euo pipefail
...
- aws --profile default s3 cp $SRC s3://circuit-release/scroll-zkvm/$TARGET
+ if [[ ! -f "$SRC" ]]; then
+ echo "❌ missing artifact: $SRC" >&2
+ exit 1
+ fi
+ aws --profile default s3 cp "$SRC" "s3://circuit-release/scroll-zkvm/$TARGET"
This also resolves shellcheck SC2046.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
for stuff_file in "${STUFF_FILES[@]}"; do
    SRC="${SCROLL_ZKVM_STUFFDIR}/${stuff_file}"
    TARGET="${DIR_OUTPUT}/${stuff_file}"
    aws --profile default s3 cp $SRC s3://circuit-release/scroll-zkvm/$TARGET
done
#!/bin/bash
set -euo pipefail
for stuff_file in "${STUFF_FILES[@]}"; do
    SRC="${SCROLL_ZKVM_STUFFDIR}/${stuff_file}"
    TARGET="${DIR_OUTPUT}/${stuff_file}"
    if [[ ! -f "$SRC" ]]; then
        echo "❌ missing artifact: $SRC" >&2
        exit 1
    fi
    aws --profile default s3 cp "$SRC" "s3://circuit-release/scroll-zkvm/$TARGET"
done
🤖 Prompt for AI Agents
In zkvm-prover/release-verifier-stuff.sh around lines 10 to 14, the script lacks
safety flags and proper quoting, which can cause word splitting or globbing
issues and silent failures during the aws copy command. Add `set -euo pipefail`
at the top of the script to stop on errors and undefined variables, and quote
all variable expansions like "$SRC" and "$TARGET" in the aws command to prevent
word splitting and globbing problems.
Actionable comments posted: 1
🔭 Outside diff range comments (1)
crates/gpu_override/print_plonky3gpu_version.sh (1)
3-12
: Harden the script – missing safety flags, no config-file check, unquoted vars, noisy `pushd`
The current form will:
grep
a non-existent.cargo/config.toml
and silently fall back to an empty string;- evaluate
[ -z $plonky3_gpu_path ]
without quoting – word-splitting/globbing risk;- leak
pushd/popd
directory traces to stdout, forcing downstream callers tosed
them away;- slice a
%h
commit hash that is already 7 chars – redundant.-#!/bin/bash +#!/usr/bin/env bash +set -euo pipefail + +config_file=".cargo/config.toml" +# Fail fast when config file is missing +[[ -f "$config_file" ]] || { echo "❌ $config_file not found" >&2; exit 1; } -config_file=.cargo/config.toml -plonky3_gpu_path=$(grep 'path.*plonky3-gpu' "$config_file" | cut -d'"' -f2 | head -n 1) -plonky3_gpu_path=$(dirname "$plonky3_gpu_path") +plonky3_gpu_path="$(grep -m1 'path.*plonky3-gpu' "$config_file" | cut -d'"' -f2 | xargs dirname)" -if [ -z $plonky3_gpu_path ]; then +if [[ -z "$plonky3_gpu_path" ]]; then exit 0 else - pushd $plonky3_gpu_path - commit_hash=$(git log --pretty=format:%h -n 1) - echo "${commit_hash:0:7}" - popd + pushd "$plonky3_gpu_path" > /dev/null + git log -n1 --pretty=format:%h + popd > /dev/null fiThis removes the need for downstream
sed -n '2p'
, adds safety, and produces a clean one-line hash.
♻️ Duplicate comments (1)
zkvm-prover/Makefile (1)
56-58
: Unconditionalexit 1
still blocks automated builds – previous feedback unresolved
The rule keeps forcing manual downloads, breaking CI reproducibility exactly as noted in earlier reviews.
🧹 Nitpick comments (1)
crates/gpu_override/Makefile (1)
1-1
: Declare standard phony targets to silencecheckmake
warnings
all
andtest
are required by the house style; add them to.PHONY
to avoid the linter noise.-.PHONY: build update clean +.PHONY: all build update clean test + +all: build +test: + @echo "no tests for gpu_override yet"
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
crates/gpu_override/Makefile
(1 hunks)crates/gpu_override/print_plonky3gpu_version.sh
(1 hunks)zkvm-prover/Makefile
(4 hunks)
🧰 Additional context used
🪛 checkmake (0.2.2)
crates/gpu_override/Makefile
[warning] 1-1: Missing required phony target "all"
(minphony)
[warning] 1-1: Missing required phony target "test"
(minphony)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: tests
- GitHub Check: check
- GitHub Check: tests
- GitHub Check: tests
- GitHub Check: check
🔇 Additional comments (1)
zkvm-prover/Makefile (1)
63-65
: Verify `E2E_HANDLE_SET` path validity before running the heavy prover task
`../tests/prover-e2e/testset.json` is assumed to exist; a missing file will trigger a long-running build just to fail at runtime. Consider an early guard:
test_e2e_run: $(CIRCUIT_STUFF)
	@[ -f $(E2E_HANDLE_SET) ] || { echo "❌ $(E2E_HANDLE_SET) not found"; exit 1; }
	GO_TAG=...
crates/gpu_override/Makefile (outdated)
PLONKY3_GPU_VERSION=$(shell ./print_plonky3gpu_version.sh | sed -n '2p')
$(info PLONKY3_GPU_VERSION is ${PLONKY3_GPU_VERSION})
🛠️ Refactor suggestion
Stop depending on `sed -n '2p'` – make the helper emit only the hash
After the script refactor above, `print_plonky3gpu_version.sh` will already output a single line. Simplify and avoid a useless subshell:
-PLONKY3_GPU_VERSION=$(shell ./print_plonky3gpu_version.sh | sed -n '2p')
+PLONKY3_GPU_VERSION := $(shell ./print_plonky3gpu_version.sh)
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
PLONKY3_GPU_VERSION=$(shell ./print_plonky3gpu_version.sh | sed -n '2p')
$(info PLONKY3_GPU_VERSION is ${PLONKY3_GPU_VERSION})
PLONKY3_GPU_VERSION := $(shell ./print_plonky3gpu_version.sh)
$(info PLONKY3_GPU_VERSION is ${PLONKY3_GPU_VERSION})
🤖 Prompt for AI Agents
In crates/gpu_override/Makefile around lines 4 to 6, the Makefile currently uses
a subshell with `sed -n '2p'` to extract the second line from the output of
`print_plonky3gpu_version.sh`. Since the script now outputs only the needed hash
line, remove the `sed` command and subshell, assigning the output of
`print_plonky3gpu_version.sh` directly to PLONKY3_GPU_VERSION to simplify the
code.
Force-pushed from fffc315 to 41caf08 (Compare)
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 2
♻️ Duplicate comments (1)
tests/prover-e2e/Makefile (1)
37-39
: `go build` will fail if the output directory does not exist
Create `build/bin` first and drop the `$(PWD)` prefix so `go` treats the path as module-relative.
-test_tool:
-	go build -o $(PWD)/build/bin/e2e_tool ../../rollup/tests/integration_tool
+test_tool:
+	mkdir -p build/bin
+	go build -o build/bin/e2e_tool ../../rollup/tests/integration_tool
🧹 Nitpick comments (4)
tests/prover-e2e/docker-compose.yml (2)
6-7
: Access-URL comment is inconsistent with actual DB name
The comment advertises `…/devdb` while `POSTGRES_DB` is set to `scroll`, which will confuse anyone copy-pasting the URI.
-# The access url is postgresql://dev:dev@localhost:5432/devdb
+# The access url is postgresql://dev:dev@localhost:5432/scroll
20-22
: Missing trailing newline + commented-out volume
YAML-lint flags the missing newline at EOF; add it to keep linters green. Also consider uncommenting the volume so test data survives container restarts.
tests/prover-e2e/.env (1)
1-3
: Key order nit & duplicated credentials
dotenv-linter complains about key order; re-ordering costs nothing and keeps CI quiet. Same credential caveat as in `config.json`.
1-1
: Declare every phony target
import_data
andimport_data_euclid
are missing from the.PHONY
list.-.PHONY: clean setup_db test_tool all check_vars +.PHONY: clean setup_db test_tool all check_vars import_data import_data_euclid
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)
rollup/tests/integration_tool/imports.go
(1 hunks)rollup/tests/integration_tool/main.go
(1 hunks)tests/prover-e2e/.env
(1 hunks)tests/prover-e2e/.gitignore
(1 hunks)tests/prover-e2e/Makefile
(1 hunks)tests/prover-e2e/config.json
(1 hunks)tests/prover-e2e/docker-compose.yml
(1 hunks)tests/prover-e2e/prepare/dump_block_records.sql
(1 hunks)
🚧 Files skipped from review as they are similar to previous changes (4)
- tests/prover-e2e/.gitignore
- tests/prover-e2e/prepare/dump_block_records.sql
- rollup/tests/integration_tool/main.go
- rollup/tests/integration_tool/imports.go
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: colinlyguo
PR: scroll-tech/scroll#1645
File: rollup/proposer-tool-config.json:34-40
Timestamp: 2025-04-15T08:52:44.176Z
Learning: In configuration files like `rollup/proposer-tool-config.json`, placeholders such as `<mainnet read db config>` are intentionally left as-is to be replaced by users with their own configuration values when deploying the tool.
Learnt from: colinlyguo
PR: scroll-tech/scroll#1530
File: tests/integration-test/go.mod:21-21
Timestamp: 2024-10-20T15:30:18.084Z
Learning: In the `tests/integration-test` folder, updates to import statements or dependencies are not necessary since it's only used for testing purposes.
📚 Learning: in the `tests/integration-test` folder, updates to import statements or dependencies are not necessa...
Learnt from: colinlyguo
PR: scroll-tech/scroll#1530
File: tests/integration-test/go.mod:21-21
Timestamp: 2024-10-20T15:30:18.084Z
Learning: In the `tests/integration-test` folder, updates to import statements or dependencies are not necessary since it's only used for testing purposes.
Applied to files:
tests/prover-e2e/.env
tests/prover-e2e/config.json
tests/prover-e2e/docker-compose.yml
📚 Learning: in configuration files like `rollup/proposer-tool-config.json`, placeholders such as `
Learnt from: colinlyguo
PR: scroll-tech/scroll#1645
File: rollup/proposer-tool-config.json:34-40
Timestamp: 2025-04-15T08:52:44.176Z
Learning: In configuration files like `rollup/proposer-tool-config.json`, placeholders such as `<mainnet read db config>` are intentionally left as-is to be replaced by users with their own configuration values when deploying the tool.
Applied to files:
tests/prover-e2e/.env
tests/prover-e2e/config.json
Learnt from: colinlyguo
PR: scroll-tech/scroll#1645
File: rollup/proposer-tool-config.json:34-40
Timestamp: 2025-04-15T08:52:44.176Z
Learning: In configuration files like `rollup/proposer-tool-config.json`, placeholders such as `<mainnet read db config>` are intentionally left as-is to be replaced by users with their own configuration values when deploying the tool.
tests/prover-e2e/.env
tests/prover-e2e/config.json
🪛 dotenv-linter (3.3.0)
tests/prover-e2e/.env
[warning] 2-2: [UnorderedKey] The GOOSE_DBSTRING key should go before the GOOSE_DRIVER key
🪛 checkmake (0.2.2)
tests/prover-e2e/Makefile
[warning] 1-1: Missing required phony target "test"
(minphony)
🪛 Checkov (3.2.334)
tests/prover-e2e/config.json
[MEDIUM] 4-5: Basic Auth Credentials
(CKV_SECRET_4)
tests/prover-e2e/docker-compose.yml
[MEDIUM] 6-7: Basic Auth Credentials
(CKV_SECRET_4)
🪛 YAMLlint (1.37.1)
tests/prover-e2e/docker-compose.yml
[error] 22-22: no new line character at the end of file
(new-line-at-end-of-file)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: tests
- GitHub Check: tests
- GitHub Check: check
🔇 Additional comments (1)
tests/prover-e2e/config.json (1)
1-8
: Credentials are committed to VCS – ensure they stay dev-only
`dev:dev` is fine for local tests, but make sure this file never gets used in CI or production. A safer pattern is to read the DSN from the `.env` file or env-vars and keep this config template credential-free.
services:
  postgres:
    image: postgres
🛠️ Refactor suggestion
Pin the Postgres image to a deterministic version
Leaving `image: postgres` unpinned causes test runs to break when the upstream tag moves. Use an explicit, lightweight tag.
- image: postgres
+ image: postgres:16-alpine
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
image: postgres
image: postgres:16-alpine
🤖 Prompt for AI Agents
In tests/prover-e2e/docker-compose.yml at line 12, the Postgres image is not
pinned to a specific version, which can cause test instability when the upstream
image updates. Replace the generic 'postgres' image tag with a specific,
lightweight version tag such as 'postgres:14-alpine' to ensure consistent and
reproducible test environments.
Actionable comments posted: 0
♻️ Duplicate comments (1)
tests/prover-e2e/prepare/dump_block_records.sql (1)
2-3
: Comment still misleading; update or change the `transactions` handling
The comment states that the `transactions` column will be replaced with an empty record, but the query keeps `quote_literal(transactions)` unchanged. The discrepancy surfaced in earlier reviews and is still unresolved. Also applies to: 22-22
🧹 Nitpick comments (1)
tests/prover-e2e/prepare/dump_block_records.sql (1)
7-23
: Consider `COPY … TO` / `FROM` for faster, type-safe dumps
Building one `INSERT` statement per row via string concatenation is slow for large ranges and loses type fidelity (everything becomes text). A `COPY (SELECT …) TO 'file'` or `pg_dump --data-only --table=l2_block --column-inserts` would be more efficient and keeps native formats.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
tests/prover-e2e/00100_import_blocks.sql
(1 hunks)tests/prover-e2e/prepare/dump_block_records.sql
(1 hunks)zkvm-prover/Makefile
(1 hunks)zkvm-prover/download-release.sh
(2 hunks)
✅ Files skipped from review due to trivial changes (3)
- zkvm-prover/Makefile
- zkvm-prover/download-release.sh
- tests/prover-e2e/00100_import_blocks.sql
🧰 Additional context used
🧠 Learnings (1)
📓 Common learnings
Learnt from: colinlyguo
PR: scroll-tech/scroll#1645
File: rollup/proposer-tool-config.json:34-40
Timestamp: 2025-04-15T08:52:44.176Z
Learning: In configuration files like `rollup/proposer-tool-config.json`, placeholders such as `<mainnet read db config>` are intentionally left as-is to be replaced by users with their own configuration values when deploying the tool.
Learnt from: colinlyguo
PR: scroll-tech/scroll#1530
File: tests/integration-test/go.mod:21-21
Timestamp: 2024-10-20T15:30:18.084Z
Learning: In the `tests/integration-test` folder, updates to import statements or dependencies are not necessary since it's only used for testing purposes.
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: check
- GitHub Check: tests
- GitHub Check: tests
Actionable comments posted: 0
🧹 Nitpick comments (3)
tests/prover-e2e/README.md (3)
1-1
: Trim trailing punctuation & tighten heading wording
A Markdown linter (MD026) flags the full stop; additionally, "setup" is a noun. Switching to an imperative title makes the intent clearer.
-## A new e2e test tool to setup a local environment for testing coordinator and prover. +## End-to-end test tool for setting up a local coordinator + prover environment
3-3
: Fix grammar and clarify purpose sentence
Multiple singular/plural mismatches and missing articles make the intro hard to parse.
-It contain data from some blocks in scroll sepolia, and help to generate a series of chunks/batches/bundle from this blocks, filling the db for coordinator, so a e2e test (from chunk to bundle) can be run completely local +It contains data from several Scroll Sepolia blocks and helps generate a series of chunks, batches, and bundles from these blocks, pre-populating the coordinator database so an end-to-end test (from chunk to bundle) can run entirely locally.
5-12
: Normalize list style, capitalisation, and product names
Consider a quick polish pass: start each step with a capital verb, use correct product casing (PostgreSQL, Docker), and end sentences with a period for consistency.
-1. run `make all` under `tests/prover-e2e`, it would launch a postgreSql db in local docker container, which is ready to be used by coordinator (include some chunks/batches/bundle waiting to be proven)
+1. Run `make all` in `tests/prover-e2e`. This launches a PostgreSQL database in a local Docker container and pre-loads it with chunks, batches, and bundles ready for the coordinator to consume.
-2. download circuit assets with `download-release.sh` script in `zkvm-prover`
+2. Download circuit assets with the `download-release.sh` script in `zkvm-prover`.
-3. generate the verifier stuff corresponding to the downloaded assets by `make gen_verifier_stuff` in `zkvm-prover`
+3. Generate the verifier artifacts for those assets via `make gen_verifier_stuff` in `zkvm-prover`.
-4. setup `config.json` and `genesis.json` for coordinator, copy the generated verifier stuff in step 3 to the directory which coordinator would load them
+4. Set up `config.json` and `genesis.json` for the coordinator, then copy the verifier artifacts from step 3 into the directory the coordinator reads from.
-5. build and launch `coordinator_api` service locally
+5. Build and launch the `coordinator_api` service locally.
-6. setup the `config.json` for zkvm prover to connect with the locally launched coordinator api
+6. Configure `zkvm-prover/config.json` so the prover connects to the local coordinator API.
-7. in `zkvm-prover`, launch `make test_e2e_run`, which would specific prover run locally, connect to the local coordinator api service according to the `config.json`, and prove all tasks being injected to db in step 1.
+7. In `zkvm-prover`, run `make test_e2e_run`. The prover will start locally, connect to the coordinator API as configured, and prove all tasks injected into the database in step 1.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
tests/prover-e2e/README.md
(1 hunks)
🧰 Additional context used
🧠 Learnings (2)
📓 Common learnings
Learnt from: colinlyguo
PR: scroll-tech/scroll#1645
File: rollup/proposer-tool-config.json:34-40
Timestamp: 2025-04-15T08:52:44.176Z
Learning: In configuration files like `rollup/proposer-tool-config.json`, placeholders such as `<mainnet read db config>` are intentionally left as-is to be replaced by users with their own configuration values when deploying the tool.
Learnt from: colinlyguo
PR: scroll-tech/scroll#1530
File: tests/integration-test/go.mod:21-21
Timestamp: 2024-10-20T15:30:18.084Z
Learning: In the `tests/integration-test` folder, updates to import statements or dependencies are not necessary since it's only used for testing purposes.
📚 Learning: in the `tests/integration-test` folder, updates to import statements or dependencies are not necessa...
Learnt from: colinlyguo
PR: scroll-tech/scroll#1530
File: tests/integration-test/go.mod:21-21
Timestamp: 2024-10-20T15:30:18.084Z
Learning: In the `tests/integration-test` folder, updates to import statements or dependencies are not necessary since it's only used for testing purposes.
Applied to files:
tests/prover-e2e/README.md
🪛 markdownlint-cli2 (0.17.2)
tests/prover-e2e/README.md
1-1: Trailing punctuation in heading
Punctuation: '.'
(MD026, no-trailing-punctuation)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: tests
- GitHub Check: check
- GitHub Check: tests
This PR introduces a new e2e test tool to set up a local environment for testing the coordinator and prover.
It contains data from some blocks on Scroll Sepolia and helps generate a series of chunks/batches/bundles from these blocks, filling the database for the coordinator, so an e2e test (from chunk to bundle) can run completely locally.
Steps:
1. Run `make all` under `tests/prover-e2e`; it launches a PostgreSQL db in a local Docker container, ready to be used by the coordinator (including some chunks/batches/bundles waiting to be proven).
2. Download circuit assets with the `download-release.sh` script in `zkvm-prover`.
3. Generate the verifier stuff corresponding to the downloaded assets with `make gen_verifier_stuff` in `zkvm-prover`.
4. Set up `config.json` and `genesis.json` for the coordinator, and copy the verifier stuff generated in step 3 to the directory the coordinator loads it from.
5. Build and launch the `coordinator_api` service locally.
6. Set up the `config.json` for the zkvm prover to connect to the locally launched coordinator API.
7. In `zkvm-prover`, launch `make test_e2e_run`, which runs the specified prover locally, connects to the local coordinator API service according to the `config.json`, and proves all tasks injected into the db in step 1.
Summary by CodeRabbit
New Features
Chores
`.gitignore` to exclude build outputs and test data from version control.
Bug Fixes