optimism: Solidity compiler downloaded by Hardhat does not have `kall`, causing compilation errors

I’m experiencing this issue on PR 579 / Commit 35cb48d.

What I’m doing:

Locally:

  1. I check out my branch.
  2. It has a new dependency (just an OZ contract), so I run `yarn && yarn build` in the root. It works.

In CI:

I see two errors in the build step.

First:

lerna ERR! yarn run build stderr:
Error HH503: Couldn't download compiler version 0.7.6. Checksum verification failed. Please check your connection.
For more info go to https://hardhat.org/HH503 or run Hardhat with --show-stack-traces

Second:

contracts/optimistic-ethereum/libraries/wrappers/Lib_ExecutionManagerWrapper.sol:143:13: DeclarationError: Function not found.
            kall(add(_calldata, 0x20), mload(_calldata), 0x0, 0x0)
            ^--^

About this issue

  • Original URL
  • State: closed
  • Created 3 years ago
  • Comments: 40 (26 by maintainers)

Most upvoted comments

@smartcontracts and I spent a bunch more time narrowing in on the root of this issue and finally found it. It turns out it is related to how Hardhat tasks behave in Docker on M1: a different task is being run than the one the hardhat-ovm plugin is hooked into. So what's really happening is that the hardhat-ovm compiler plugin just isn't being activated on M1s; it is really just plain Hardhat running the vanilla EVM compiler against the OVM contracts, which it does not support.

We only have a partial fix so far, but I can now run the system on my M1 if I serve the predeploys locally! Here's a branch with the fix and the required env change. If I run `yarn build:dump && yarn serve` in the contracts package, I can get things running with that.

Here is the diff that gets builds partially working. I exec'd into the deployer container and it looks like some ovm-artifacts are missing, which is backed up by this CI run. Getting close!

I managed to reproduce it by checking out master and removing a few commits.

What's happening is that when you use an M1 Mac your Docker images are arm-based, and there are no arm solc builds, so only solcjs is used.

To validate this I exec'd into the container and printed `uname -a` from packages/contracts/hardhat.config.ts, which yields: Linux buildkitsandbox 5.10.25-linuxkit #1 SMP PREEMPT Tue Mar 23 09:24:45 UTC 2021 aarch64 GNU/Linux

EDIT: scratch that, we got it working! @bscarano Would you be able to share the full logs here? 😃

Great news! And better yet, I was just able to replicate this on my own M1, so it should be much easier to fix now that I can see it in front of me!

Confirmed that applying the changes from @ben-chain’s branch resolves the issue with docker-compose build and the OVM compiler. 🙏

With the workaround of serving the contracts locally, I'm seeing a mix of success and failure. I see many errors connecting from the containers to port 8081:

dtl_1                | curl: (7) Failed to connect to deployer port 8081: Connection refused
batch_submitter_1    | curl: (7) Failed to connect to deployer port 8081: Connection refused
l2geth_1             | curl: (7) Failed to connect to deployer port 8081: Connection refused
eth_chainId (3)      | curl: (7) Failed to connect to deployer port 8081: Connection refused

But I also see the state-dump fetch succeed with a 200 from the local server:

l2geth_1             | INFO [05-04|11:32:06.734] Fetching state dump path=http://host.docker.internal:8081/state-dump.latest.json

General FYI to anyone working on this issue, our compiler caching logic within @eth-optimism/hardhat-ovm happens here: https://github.com/ethereum-optimism/optimism/blob/master/packages/hardhat-ovm/src/index.ts#L38-L105.

@bscarano if you get a chance, would you mind sharing if you are on an M1, and if --no-cache solves this for you? 😃

Yes I have an M1! But you already replicated the issue I see which is great.

Is this an M1 issue?

Hmmm, the compiler is definitely updated by now, so this must be a caching issue where Docker is holding on to the old version. @TransmissionsDev have you tried `docker-compose build --no-cache` since #723 was merged?

Yep, I even did a full `docker system prune -a`, nuked the repo, and recloned 😦

I have been looking into this for the past day and it’s really stumping me, really sorry for the interruption to service @TransmissionsDev ! To confirm – are you still getting this in the monorepo? If it’s only on your local dev environment, can you try putting this version into your hardhat.config.ts and tell me if it goes away?

Hm. Re-opening and renaming.

Latest master has the fix I believe. Let’s merge the versioning PR to get new packages out @snario