
fix: contractsgen script to compile machine-independent bytecode #154

Merged
beer-1 merged 3 commits into main from fix/metadata-none on Feb 3, 2025

Conversation

@djm07073 (Contributor) commented on Feb 3, 2025

Metadata settings can make the contract bytecode machine-dependent; this change compiles the contracts with --metadata-hash none so the generated bytecode is identical across machines.

Summary by CodeRabbit

  • Bug Fixes

    • Enhanced error handling during indexing to continue seamlessly when blockchain data is missing.
    • Simplified processing for initial block retrieval.
  • Tests

    • Improved test workflows to reliably trigger and validate indexing operations.
  • New Features

    • Refreshed deployment artifacts for multiple smart contracts, including updates that add new token management capabilities.
  • Chores

    • Optimized contract compilation settings to standardize metadata output.

@djm07073 djm07073 requested a review from a team as a code owner February 3, 2025 07:54

coderabbitai bot commented Feb 3, 2025

Warning

There were issues while running some tools. Please review the errors and either fix the tool’s configuration or disable the tool if it’s a critical failure.

🔧 golangci-lint (1.62.2)

level=warning msg="[runner] Can't run linter goanalysis_metalinter: buildir: failed to load package injective: could not load export data: no export data for "github.com/cosmos/relayer/v2/relayer/codecs/injective""
level=error msg="Running error: can't run linter goanalysis_metalinter\nbuildir: failed to load package injective: could not load export data: no export data for "github.com/cosmos/relayer/v2/relayer/codecs/injective""

Walkthrough

This pull request improves the robustness of the bloom indexing process by adding error handling when block headers are missing. The changes include checking for a not-found error and generating a dummy header to allow indexing to continue rather than failing. Test suites have been enhanced by creating new blocks and waiting for the indexing process to complete. Additionally, the Solidity compilation script now uses an updated command option, and several smart contract Go bindings have been refreshed with new binary strings and updated ABIs where applicable. The removal of a special genesis block case further streamlines the indexing logic.
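A rough sketch of that fallback, with go-ethereum header types and a lookup callback standing in for the indexer's header store (the names below are illustrative assumptions, not the PR's actual diff):

```go
package indexer

import (
	"errors"
	"math/big"

	"cosmossdk.io/collections"
	coretypes "github.com/ethereum/go-ethereum/core/types"
)

// headerOrDummy returns the stored header, or a placeholder when the block has
// been pruned, so bloom indexing can keep advancing instead of failing.
// Only the collections.ErrNotFound check, the current-height dummy, and the
// empty bloom come from this PR's summary; everything else is assumed.
func headerOrDummy(height uint64, lookup func(uint64) (*coretypes.Header, error)) (*coretypes.Header, error) {
	header, err := lookup(height)
	if err == nil {
		return header, nil
	}
	if errors.Is(err, collections.ErrNotFound) {
		// Pruned or missing block: substitute a dummy header that carries only
		// the current height and an empty bloom filter.
		return &coretypes.Header{
			Number: new(big.Int).SetUint64(height),
			Bloom:  coretypes.Bloom{},
		}, nil
	}
	return nil, err
}
```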

Changes

| File(s) | Summary |
| --- | --- |
| indexer/bloom.go, indexer/reader.go | Added error handling for missing block headers; dummy header creation for collections.ErrNotFound and removal of the genesis block special case. |
| indexer/bloom_test.go, indexer/prune_test.go, jsonrpc/backend/filters_test.go | Enhanced tests: trigger new block creation and added loops to wait until bloom indexing completes before proceeding. |
| scripts/contractsgen.sh | Updated the Solidity compiler command to include the --metadata-hash none option. |
| x/evm/contracts/... (Counter.go, ERC20.go, ERC20ACL.go, ERC20Factory.go, ERC20Registry.go, ERC20Wrapper.go, ICS721ERC721.go, InitiaERC20.go, Ownable.go, Strings.go, utils/ERC721Utils.go) | Updated smart contract metadata with new binary strings and updated ABIs; deprecated separate ABI/Bin variables in favor of metadata references. |
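On the last row's note about deprecated ABI/Bin variables: abigen's current layout exposes a single MetaData value that callers read instead of the old package-level variables. A hedged sketch of that usage, where the module import path and the ERC20MetaData identifier are assumptions inferred from the file list rather than verified against the repository:

```go
package main

import (
	"fmt"

	"github.com/initia-labs/minievm/x/evm/contracts/erc20" // assumed module path
)

func main() {
	// Preferred: read the ABI and creation bytecode through the MetaData reference.
	fmt.Println("abi length:", len(erc20.ERC20MetaData.ABI))
	fmt.Println("bin hex length:", len(erc20.ERC20MetaData.Bin))
	// The old package-level erc20.ERC20ABI / erc20.ERC20Bin variables remain,
	// but the regenerated bindings mark them deprecated.
}
```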

Sequence Diagram(s)

sequenceDiagram
    participant Client
    participant Indexer
    participant HeaderStore

    Client->>Indexer: Request Block Header by Number
    Indexer->>HeaderStore: Query BlockHeader
    HeaderStore-->>Indexer: ErrNotFound
    Indexer->>Indexer: Create dummy header with current height and empty bloom
    Indexer-->>Client: Proceed with indexing using dummy header
sequenceDiagram
    participant TestSuite
    participant App
    participant BloomIndexer

    TestSuite->>App: Increase Block Height (trigger new block)
    App->>BloomIndexer: Start bloom indexing
    loop Polling status
        TestSuite->>BloomIndexer: Is indexing running?
    end
    BloomIndexer-->>TestSuite: Indexing complete

Poem

Hop, hop, a change so fine,
I’m a rabbit with a rhyme!
Headers missing? No fear at all,
Dummy blocks will catch the fall!
New contracts and tests, a joyful spree 🐰✨
In fields of code, we bound so free!


@beer-1 (Member) left a comment

LGTM


codecov bot commented Feb 3, 2025

Codecov Report

Attention: Patch coverage is 88.88889% with 1 line in your changes missing coverage. Please review.

Project coverage is 57.28%. Comparing base (bae30d1) to head (5a1bd3f).
Report is 1 commit behind head on main.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| indexer/bloom.go | 88.88% | 0 Missing and 1 partial ⚠️ |
Additional details and impacted files

Impacted file tree graph

@@            Coverage Diff             @@
##             main     #154      +/-   ##
==========================================
- Coverage   57.34%   57.28%   -0.06%     
==========================================
  Files         113      113              
  Lines       10629    10629              
==========================================
- Hits         6095     6089       -6     
- Misses       3649     3653       +4     
- Partials      885      887       +2     
| Files with missing lines | Coverage Δ |
| --- | --- |
| indexer/reader.go | 74.46% <ø> (-3.31%) ⬇️ |
| indexer/bloom.go | 59.09% <88.88%> (-5.32%) ⬇️ |

@coderabbitai bot left a comment

Actionable comments posted: 0

🧹 Nitpick comments (3)
indexer/bloom_test.go (1)

54-65: Consider limiting the wait loop.

At the moment, this loop continuously checks if the bloom indexing is running without a maximum timeout. If the bloom process hangs, the test might run indefinitely. Consider implementing a maximum wait or test deadline to handle unexpected indexing failures.
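One way to bound that wait, assuming the suite already uses testify's require (the 10-second budget and the helper name below are arbitrary choices, not values from the PR):

```go
package indexer_test

import (
	"testing"
	"time"

	"github.com/stretchr/testify/require"
)

// requireBloomIndexingDone waits for bloom indexing to finish, failing the test
// if it does not complete within the deadline. Sketch only: it accepts any
// indexer exposing IsBloomIndexingRunning and polls at the same 100 ms interval
// the existing tests use.
func requireBloomIndexingDone(t *testing.T, idx interface{ IsBloomIndexingRunning() bool }) {
	t.Helper()
	require.Eventually(t, func() bool {
		return !idx.IsBloomIndexingRunning()
	}, 10*time.Second, 100*time.Millisecond, "bloom indexing did not finish in time")
}
```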

indexer/bloom.go (1)

50-58: Log or distinguish dummy headers for debugging.

The approach to handle pruned blocks with a dummy header is appropriate. However, consider adding a log message whenever a dummy header is created, to aid debugging pruned scenarios.
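For example, right before the dummy header is constructed (a sketch; the logger handle and message wording are assumptions, not code from the PR):

```go
// Make the pruned-block fallback visible in the logs.
i.logger.Info("block header not found, using empty dummy header for bloom indexing",
	"height", height)
```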

jsonrpc/backend/filters_test.go (1)

81-91: LGTM! Consider extracting the wait logic into a helper function.

The added code improves test coverage by ensuring bloom indexing is properly triggered and completed.

Consider extracting the repeated wait logic into a helper function to improve code reusability and readability:

+func waitForBloomIndexing(t *testing.T, indexer *evmindexer.EVMIndexerImpl) {
+  t.Helper()
+  for {
+    if indexer.IsBloomIndexingRunning() {
+      time.Sleep(100 * time.Millisecond)
+    } else {
+      break
+    }
+  }
+}
+
 // in test function
-for {
-  if indexer.IsBloomIndexingRunning() {
-    time.Sleep(100 * time.Millisecond)
-  } else {
-    break
-  }
-}
+waitForBloomIndexing(t, indexer)
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between bae30d1 and 5a1bd3f.

📒 Files selected for processing (17)
  • indexer/bloom.go (2 hunks)
  • indexer/bloom_test.go (1 hunks)
  • indexer/prune_test.go (1 hunks)
  • indexer/reader.go (0 hunks)
  • jsonrpc/backend/filters_test.go (1 hunks)
  • scripts/contractsgen.sh (1 hunks)
  • x/evm/contracts/counter/Counter.go (1 hunks)
  • x/evm/contracts/erc20/ERC20.go (1 hunks)
  • x/evm/contracts/erc20_acl/ERC20ACL.go (1 hunks)
  • x/evm/contracts/erc20_factory/ERC20Factory.go (1 hunks)
  • x/evm/contracts/erc20_registry/ERC20Registry.go (1 hunks)
  • x/evm/contracts/erc20_wrapper/ERC20Wrapper.go (1 hunks)
  • x/evm/contracts/ics721_erc721/ICS721ERC721.go (1 hunks)
  • x/evm/contracts/initia_erc20/InitiaERC20.go (1 hunks)
  • x/evm/contracts/ownable/Ownable.go (1 hunks)
  • x/evm/contracts/strings/Strings.go (1 hunks)
  • x/evm/contracts/utils/ERC721Utils.go (1 hunks)
💤 Files with no reviewable changes (1)
  • indexer/reader.go
✅ Files skipped from review due to trivial changes (2)
  • x/evm/contracts/ownable/Ownable.go
  • x/evm/contracts/strings/Strings.go
👮 Files not reviewed due to content moderation or server errors (3)
  • x/evm/contracts/erc20_registry/ERC20Registry.go
  • x/evm/contracts/erc20_factory/ERC20Factory.go
  • x/evm/contracts/counter/Counter.go
⏰ Context from checks skipped due to timeout of 90000ms (1)
  • GitHub Check: Run test and upload codecov
🔇 Additional comments (9)
x/evm/contracts/erc20/ERC20.go (1)

35-35: Verify that the updated contract bytecode is fully machine-independent.
This new bytecode ("Bin") should not contain any embedded paths, timestamps, or compiler-specific metadata that can cause reproducibility issues.

Use the following script to search the codebase for common metadata artifacts or absolute paths, ensuring the bytecode doesn’t leak any machine-dependent references:

✅ Verification successful

Bytecode is machine-independent.

  • The search for absolute paths (e.g., /Users, /home, C:) found no relevant matches in the ERC20 contract bytecode.
  • The metadata patterns (such as BUILD_ID, TIMESTAMP, HOSTNAME, LOGNAME) appeared only in unrelated files (like swagger-ui.js) and not in the ERC20 bytecode.

Based on these findings, the updated ERC20 contract bytecode does not embed any machine-dependent references.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Inspect the repository for possible machine-dependent paths in the ERC20 bytecode.

# 1. Confirm no absolute paths that might indicate metadata embedding.
rg --case-sensitive "/(Users|home|C:)" -A 4 -B 4

# 2. Check for possible timestamps or environment variable patterns in the ERC20 bytecode lines.
rg --case-sensitive "(BUILD_ID|TIMESTAMP|HOSTNAME|LOGNAME)" -A 4 -B 4

Length of output: 69634

x/evm/contracts/ics721_erc721/ICS721ERC721.go (1)

35-35: Verify that metadata hash is properly excluded from contract bytecode.

The updated bytecode might still contain ephemeral metadata. Please verify that --metadata-hash none was indeed respected to achieve deterministic results across different machines.
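A small standalone check along these lines can be pointed at any regenerated Bin string; the hex markers follow the verification scripts later in this review, and the helper itself is hypothetical, not part of the PR:

```go
package main

import (
	"fmt"
	"strings"
)

// hasSlimMetadata reports whether a creation-bytecode hex string carries the
// reduced CBOR trailer emitted with --metadata-hash none (the a164736f6c6343
// "solc version only" marker) and none of the "ipfs" hash entry (69706673)
// that makes bytecode vary between machines.
func hasSlimMetadata(bin string) bool {
	return strings.Contains(bin, "a164736f6c6343") && !strings.Contains(bin, "69706673")
}

func main() {
	// Trailer excerpt borrowed from the verification output below.
	fmt.Println(hasSlimMetadata("fdfea164736f6c6343000819000a")) // true
}
```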

indexer/bloom.go (1)

5-6: No concerns on newly added imports.

Adding "errors" and "math/big" is consistent with standard library usage and helps keep the code self-contained.

indexer/prune_test.go (1)

133-143: LGTM! Consider using the same helper function suggested for filters_test.go.

The added code improves test coverage by ensuring bloom indexing is properly triggered and completed.

The wait logic is identical to filters_test.go. Consider extracting it into a shared helper function as suggested in the previous comment.

x/evm/contracts/utils/ERC721Utils.go (1)

35-35: Verify that the bytecode was generated with machine-independent settings.

The bytecode has been updated as part of fixing the machine-dependent compilation issue.

Run the following script to verify the Solidity compiler version and settings used to generate this bytecode:

✅ Verification successful

Machine Independent Bytecode Verification

The full bytecode (including the section preceding the extracted Solidity version) contains the marker "a164736f6c6343", confirming that the bytecode was generated with machine-independent settings.

  • The complete bytecode includes the "a164" marker (as seen in the segment "fdfea164736f6c6343").
  • This confirms that the updated bytecode reflects the new, machine-independent metadata format.
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Extract and verify the Solidity version from the bytecode

# The bytecode contains the Solidity version at the end in the format "43000819" (0.8.19)
echo "Extracting Solidity version from bytecode..."
VERSION_HEX=$(echo "0x602c604b600b8282823980515f1a607314603f577f4e487b71000000000000000000000000000000000000000000000000000000005f525f60045260245ffd5b305f52607381538281f3fe730000000000000000000000000000000000000000301460806040525f80fdfea164736f6c6343000819000a" | grep -o "736f6c6343.*$")
echo "Solidity version hex: $VERSION_HEX"

# Check for metadata hash in the bytecode
echo "Checking for metadata hash in bytecode..."
if echo "$VERSION_HEX" | grep -q "a164"; then
  echo "Found metadata hash marker 'a164' - this is the new format without IPFS hash"
else
  echo "Warning: Unexpected metadata format"
fi

Length of output: 721

x/evm/contracts/erc20_acl/ERC20ACL.go (1)

35-35: Verify that the bytecode was generated with machine-independent settings.

The bytecode has been updated as part of fixing the machine-dependent compilation issue.

Run the following script to verify the Solidity compiler version and settings used to generate this bytecode:

✅ Verification successful

Bytecode confirmed to use machine-independent settings.

The full bytecode (in file x/evm/contracts/erc20_acl/ERC20ACL.go) contains the updated metadata marker (i.e. “fea164” includes the expected “a164” sequence) even though the extraction script’s pattern only caught a later segment. This confirms that the bytecode was generated with machine‐independent settings.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Extract and verify the Solidity version from the bytecode

# The bytecode contains the Solidity version at the end in the format "43000819" (0.8.19)
echo "Extracting Solidity version from bytecode..."
VERSION_HEX=$(echo "0x6080604052348015600e575f80fd5b50601580601a5f395ff3fe60806040525f80fdfea164736f6c6343000819000a" | grep -o "736f6c6343.*$")
echo "Solidity version hex: $VERSION_HEX"

# Check for metadata hash in the bytecode
echo "Checking for metadata hash in bytecode..."
if echo "$VERSION_HEX" | grep -q "a164"; then
  echo "Found metadata hash marker 'a164' - this is the new format without IPFS hash"
else
  echo "Warning: Unexpected metadata format"
fi

Length of output: 577

scripts/contractsgen.sh (1)

17-17: LGTM! The metadata hash flag will ensure deterministic bytecode.

The addition of --metadata-hash none to the solc command is the correct approach to generate machine-independent bytecode, as it prevents local filesystem paths from being included in the contract metadata hash.

Let's verify the impact of this change by comparing bytecode from different machines:

✅ Verification successful

Let's check if any compiled bytecode files exist in the build folder—this will help us determine if the solc command (with the new flag) successfully generated the output. Please run the following script:


Let's inspect the actual content in scripts/contractsgen.sh around the modified line to verify that the --metadata-hash none flag is indeed present. Run the script below to display lines 15–20:


Verification Complete: --metadata-hash none Flag Confirmed in scripts/contractsgen.sh

  • The snippet between lines 15-20 confirms the correct placement of the --metadata-hash none flag in the solc command.
  • Although the bytecode comparison script couldn't run due to a missing build directory, the key change for deterministic bytecode is successfully implemented.
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Compare bytecode generated on different machines to verify independence
# Note: This script assumes the repository is cloned and the contracts are compiled

# Get the bytecode from the current build
echo "Current machine bytecode:"
cat build/InitiaERC20.bin

# Compare with a reference bytecode from another machine (if available)
if [ -f "reference_bytecode.bin" ]; then
  echo "Comparing with reference bytecode:"
  diff build/InitiaERC20.bin reference_bytecode.bin
fi

Length of output: 178


Script:

#!/bin/bash
if [ -d build ]; then
  echo "Listing binary files in the build directory:"
  find build -type f -name '*.bin'
else
  echo "Build directory not found. Please compile the contracts first."
fi

Length of output: 153


Script:

#!/bin/bash
echo "Displaying lines 15-20 of scripts/contractsgen.sh:"
sed -n '15,20p' scripts/contractsgen.sh

Length of output: 540

x/evm/contracts/erc20_wrapper/ERC20Wrapper.go (1)

35-35: LGTM! Updated bytecode reflects new compilation settings.

The bytecode has been regenerated with the --metadata-hash none flag, making it machine-independent. This is an auto-generated change and should not be modified manually.

x/evm/contracts/initia_erc20/InitiaERC20.go (1)

35-35: LGTM! Updated bytecode reflects new compilation settings.

The bytecode has been regenerated with the --metadata-hash none flag, making it machine-independent. This is an auto-generated change and should not be modified manually.

@beer-1 beer-1 merged commit 5e98660 into main Feb 3, 2025
10 checks passed
@beer-1 beer-1 deleted the fix/metadata-none branch February 3, 2025 09:54