This repository was archived by the owner on Dec 5, 2021. It is now read-only.

[pull] develop from ethereum-optimism:develop (#43)
* chore: reduce hardhat timeout to 20 seconds (ethereum-optimism#968)

* fix: force LF line endings for scripts to avoid docker problems on Windows (ethereum-optimism#974)

* fix: use correct line endings for windows

* chore: add changeset

* feat: add hardhat deploy instructions to readme (ethereum-optimism#965)

* feat: add deployment instructions to readme

* Add changeset

* fix style

* Update README.md

* feat: fees v2 (ethereum-optimism#976)

* l2 geth: new fee logic

* l2 geth: migrate to fees package

* core-utils: new fee scheme

* chore: add changeset

* l2geth: delete dead code

* integration-tests: fix typo

* integration-tests: fixes

* fees: use fee scalar

* lint: fix

* rollup: correct gas payment comparison

* fix(integration-tests): do not hardcode gas price

* core-utils: update with new scheme

* l2geth: refactor rollup oracle

* l2geth: clean up DoEstimateGas

* l2geth: implement latest scheme

* tests: fix up

* lint: fix

* l2geth: better sync service test

* optimism: rename to TxGasLimit

* fee: fix docstring

* tests: fix

* variables: rename

* l2geth: prevent users from sending txs with too high of a fee

* integration-tests: fix import

* integration-tests: fix type

* integration-tests: fix gas limits

* lint: fix

* l2geth: log error

Co-authored-by: Georgios Konstantopoulos <[email protected]>

* Add static analysis action (ethereum-optimism#848)

* Add static analysis github action
setup python and install slither

* Add nvmrc file for setting node to v14.17

* Update slither command run to link missing contract packages from monorepo root

* Add steps for installing dependencies

* Add yarn build step to github action

* Enable colour in github action for static analysis

* Disable certain detectors

* Ensure slither does not fail build

* Add instructions on running static analysis to monorepo readme

* build(deps): bump ws from 7.4.4 to 7.4.6 in /ops/docker/hardhat (ethereum-optimism#987)

Bumps [ws](https://github.com/websockets/ws) from 7.4.4 to 7.4.6.
- [Release notes](https://github.com/websockets/ws/releases)
- [Commits](websockets/ws@7.4.4...7.4.6)

Signed-off-by: dependabot[bot] <[email protected]>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* feat[message-relayer]: relay tx generator (ethereum-optimism#952)

* feat[message-relayer]: relay tx generator

* whoops, I burned our infura key

* fix minor bug

* add comments

* add more comments and clean stuff up

* add empty test descriptions

* add tests

* move smock to dev deps

* chore: add changeset

* minor cleanup to merkle tree proof function

* use bignumber math to avoid nested await

* use a better interface

* minor fixes and simplifications

* backwards compatible dtl syncing (ethereum-optimism#986)

* kovan: fix attempt

* kovan: db fix

* kovan: types are strings from db

* l2geth: parse things as strings

* chore: add changeset

* dtl: also stringify the range query

* geth: dereference

* geth: assign err

* dtl: handle null

* dtl: fix unit tests

* fix[smock]: fix broken call assertions for overloaded functions  (ethereum-optimism#996)

* fix[smock]: fix broken call assertions for overloaded functions

* chore: add changeset

* minor correction and add a test

* add a test for non-overloaded functions

* fix[message-relayer]: remove spreadsheet mode (ethereum-optimism#998)

* fix[message-relayer]: remove spreadsheet mode

* chore: add changeset

* Lower local rollup timestamp refresh (ethereum-optimism#985)

* update rollup timestamp refresh

* increase refresh time to 5s

* feat: fees v3 (ethereum-optimism#999)

* core-utils: fee impl v3

* l2geth: fees v3 impl

* integration-tests: update for fees v3

* chore: add changeset

* fix: typo

* integration-tests: fix and generalize

* fees: update fee scalar

* l2geth: check gas in the mempool behind usingovm

* tests: fix up

* l2geth: remove dead var

* truffle: fix config

* fix: remove dead coders (ethereum-optimism#1001)

* chore: delete dead coders

* chore: add changeset

* dtl: remove dead imports

* core-utils: delete dead tests

* batch-submitter: remove txtype

* chore: add changeset

* docs[message-relayer]: add a README and improve the interface for generating proofs (ethereum-optimism#1002)

* docs[message-relayer]: add basic docs and clean up an interface

* chore: add changeset

* dtl: log error stack for failed http request (ethereum-optimism#995)

* dtl: log error stack for failed http request

* chore: add changeset

* Add rpc-proxy service for whitelisting JSON RPC methods to the sequencer. (ethereum-optimism#945)

* Add healthcheck endpoint for rpc-proxy
Added ethereum-nginx-proxy source
updated README and docker image build

* Check ETH_CALLS_ALLOWED is set, clean up comments, remove old Dockerfile

* feat: deployment config for fee oracle contract (ethereum-optimism#936)

* feat[contracts]: add GasPriceOracle w/o predeploy

Based on ethereum-optimism#912

* feat[contracts]: congestion price oracle

* chore: add changeset

* contracts: gas price oracle (ethereum-optimism#917)

* contracts: gas price oracle

* tests: update

* fees: fix tests

* contracts: simplify gas price oracle

* lint: fix

* test: execution price is at the 1st storage slot

* chore: rename predeploy to GasPriceOracle

* chore: rename gas price oracle test name

Co-authored-by: Mark Tyneway <[email protected]>
Co-authored-by: Georgios Konstantopoulos <[email protected]>

* Add an L2 deploy script for gas oracle contract

* Add a kovan deployment artifact

* Add deployment to readme

* Add extra validation & initial execution price

* Update README.md

* Fix execution price logic

* Perform new deployment with final contract

* contracts: better require in ovm gas price oracle

* Deploy L2GasPriceOracle

* Update contract to use new fee logic & rename to gas

* Deploy updated contract

* Fix lint

* gas price oracle: do not restrict gas price

* gas price oracle: new deployment

* tests: delete dead test

Co-authored-by: smartcontracts <[email protected]>
Co-authored-by: Mark Tyneway <[email protected]>
Co-authored-by: Georgios Konstantopoulos <[email protected]>

* ops: expose debug namespace (ethereum-optimism#1007)

* fix(sync-service): prevent underflows (ethereum-optimism#1015)

* fix(sync-service): prevent underflows

* chore: add changeset

* chore: remove dead confirmation depth

* chore: remove eth1conf depth from rollup config

* test: remove duplicate value in array (ethereum-optimism#1014)

* ci: tag docker image for canary with abbreviated GITHUB_SHA (ethereum-optimism#1006)

* ci: tag docker image for canary with abbreviated GITHUB_SHA

* ci: update from 6 bytes to 8 bytes of abbreviation

* refactor: improve logging for transactions being submitted to chain with gasPrice (ethereum-optimism#1016)

* refactor: improve logging for transactions being submitted to chain with gasPrice

* lint: apply lint autofixes

* ci: upload logs for failed integration tests (ethereum-optimism#1020)

* fix(dtl): improve slow blocking JSON parsing that occurs during l2 sync (ethereum-optimism#1019)

The eth_getBlockRange call returns a large response which is very
slow to parse in ethers.js, and can block the event loop for upwards
of multiple seconds.

When this happens, incoming http requests will likely timeout and fail.

Instead, we will parse the incoming http stream directly with the bfj
package, which yields the event loop periodically so that we don't
fail to serve requests.
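The technique described above — yielding the event loop periodically while chewing through a large payload — can be sketched in isolation. The following is an illustrative TypeScript sketch of the general idea, not the DTL's actual bfj-based implementation; `processInSlices` and its slice size are hypothetical names that mirror bfj's `yieldRate` option.

```typescript
// Illustrative sketch (not the actual DTL code): process a large result in
// slices, yielding to the event loop between slices so pending I/O callbacks
// (e.g. incoming HTTP requests) are not starved. bfj applies the same idea
// while parsing the JSON text itself.
const yieldToEventLoop = (): Promise<void> =>
  new Promise((resolve) => setImmediate(resolve))

export const processInSlices = async <T, R>(
  items: T[],
  fn: (item: T) => R,
  sliceSize = 4096 // hypothetical batch size, mirroring bfj's yieldRate
): Promise<R[]> => {
  const out: R[] = []
  for (let i = 0; i < items.length; i += sliceSize) {
    for (const item of items.slice(i, i + sliceSize)) {
      out.push(fn(item))
    }
    await yieldToEventLoop() // let other callbacks run between slices
  }
  return out
}
```

A synchronous loop over the same data would hold the event loop for the full duration; slicing trades a little throughput for responsiveness.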

* fix: lint errors in dtl (ethereum-optimism#1025)

* fix[dtl]: fix dtl bug breaking verifiers (ethereum-optimism#1011)

* fix[dtl]: fix dtl bug breaking verifiers

* tweaks so tests pass

* chore: add changeset

Co-authored-by: Maurelian <[email protected]>
Co-authored-by: smartcontracts <[email protected]>
Co-authored-by: Karl Floersch <[email protected]>
Co-authored-by: Mark Tyneway <[email protected]>
Co-authored-by: Georgios Konstantopoulos <[email protected]>
Co-authored-by: Elena Gesheva <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Kevin Ho <[email protected]>
Co-authored-by: Ben Wilson <[email protected]>
Co-authored-by: Liam Horne <[email protected]>
Co-authored-by: Tim Myers <[email protected]>
12 people authored Jun 7, 2021
1 parent 060ef21 commit 29cc1fd
Showing 23 changed files with 243 additions and 44 deletions.
5 changes: 5 additions & 0 deletions .changeset/kind-houses-rush.md
@@ -0,0 +1,5 @@
---
'@eth-optimism/l2geth': patch
---

fix potential underflow when launching the chain when the last verified index is 0
5 changes: 5 additions & 0 deletions .changeset/quick-pandas-laugh.md
@@ -0,0 +1,5 @@
---
'@eth-optimism/data-transport-layer': patch
---

Fixes a bug that prevented verifiers from syncing properly with the DTL
5 changes: 5 additions & 0 deletions .changeset/wet-falcons-talk.md
@@ -0,0 +1,5 @@
---
'@eth-optimism/data-transport-layer': patch
---

improve slow blocking JSON parsing that occurs during l2 sync
18 changes: 18 additions & 0 deletions .github/workflows/integration.yml
@@ -93,3 +93,21 @@ jobs:
yarn compile
yarn compile:ovm
yarn test:integration:ovm
- name: Collect docker logs on failure
if: failure()
uses: jwalton/gh-docker-logs@v1
with:
images: 'ethereumoptimism/builder,ethereumoptimism/hardhat,ethereumoptimism/deployer,ethereumoptimism/data-transport-layer,ethereumoptimism/l2geth,ethereumoptimism/message-relayer,ethereumoptimism/batch-submitter,ethereumoptimism/l2geth,ethereumoptimism/integration-tests'
dest: './logs'

- name: Tar logs
if: failure()
run: tar cvzf ./logs.tgz ./logs

- name: Upload logs to GitHub
if: failure()
uses: actions/upload-artifact@master
with:
name: logs.tgz
path: ./logs.tgz
10 changes: 5 additions & 5 deletions .github/workflows/publish-canary.yml
@@ -155,7 +155,7 @@ jobs:
context: .
file: ./ops/docker/Dockerfile.message-relayer
push: true
tags: ethereumoptimism/message-relayer:${{ needs.builder.outputs.message-relayer }}
tags: ethereumoptimism/message-relayer:${{ GITHUB_SHA::8 }}

batch-submitter:
name: Publish Batch Submitter Version ${{ needs.builder.outputs.batch-submitter }}
@@ -181,7 +181,7 @@ jobs:
context: .
file: ./ops/docker/Dockerfile.batch-submitter
push: true
tags: ethereumoptimism/batch-submitter:${{ needs.builder.outputs.batch-submitter }}
tags: ethereumoptimism/batch-submitter:${{ GITHUB_SHA::8 }}

data-transport-layer:
name: Publish Data Transport Layer Version ${{ needs.builder.outputs.data-transport-layer }}
@@ -207,7 +207,7 @@ jobs:
context: .
file: ./ops/docker/Dockerfile.data-transport-layer
push: true
tags: ethereumoptimism/data-transport-layer:${{ needs.builder.outputs.data-transport-layer }}
tags: ethereumoptimism/data-transport-layer:${{ GITHUB_SHA::8 }}

contracts:
name: Publish Deployer Version ${{ needs.builder.outputs.contracts }}
@@ -233,7 +233,7 @@ jobs:
context: .
file: ./ops/docker/Dockerfile.deployer
push: true
tags: ethereumoptimism/deployer:${{ needs.builder.outputs.contracts }}
tags: ethereumoptimism/deployer:${{ GITHUB_SHA::8 }}

integration_tests:
name: Publish Integration tests ${{ needs.builder.outputs.integration-tests }}
@@ -259,4 +259,4 @@ jobs:
context: .
file: ./ops/docker/Dockerfile.integration-tests
push: true
tags: ethereumoptimism/integration-tests:${{ needs.builder.outputs.integration-tests }}
tags: ethereumoptimism/integration-tests:${{ GITHUB_SHA::8 }}
2 changes: 0 additions & 2 deletions l2geth/rollup/config.go
@@ -10,8 +10,6 @@ import (
type Config struct {
// Maximum calldata size for a Queue Origin Sequencer Tx
MaxCallDataSize int
// Number of confs before applying a L1 to L2 tx
Eth1ConfirmationDepth uint64
// Verifier mode
IsVerifier bool
// Enable the sync service
13 changes: 8 additions & 5 deletions l2geth/rollup/sync_service.go
@@ -53,7 +53,6 @@ type SyncService struct {
syncing atomic.Value
chainHeadSub event.Subscription
OVMContext OVMContext
confirmationDepth uint64
pollInterval time.Duration
timestampRefreshThreshold time.Duration
chainHeadCh chan core.ChainHeadEvent
@@ -103,7 +102,6 @@ func NewSyncService(ctx context.Context, cfg Config, txpool *core.TxPool, bc *co
cancel: cancel,
verifier: cfg.IsVerifier,
enable: cfg.Eth1SyncServiceEnable,
confirmationDepth: cfg.Eth1ConfirmationDepth,
syncing: atomic.Value{},
bc: bc,
txpool: txpool,
@@ -244,8 +242,12 @@ func (s *SyncService) initializeLatestL1(ctcDeployHeight *big.Int) error {
s.SetLatestL1Timestamp(context.Timestamp)
s.SetLatestL1BlockNumber(context.BlockNumber)
} else {
// Prevent underflows
if *index != 0 {
*index = *index - 1
}
log.Info("Found latest index", "index", *index)
block := s.bc.GetBlockByNumber(*index - 1)
block := s.bc.GetBlockByNumber(*index)
if block == nil {
block = s.bc.CurrentBlock()
idx := block.Number().Uint64()
@@ -254,11 +256,12 @@
return fmt.Errorf("Current block height greater than index")
}
s.SetLatestIndex(&idx)
log.Info("Block not found, resetting index", "new", idx, "old", *index-1)
log.Info("Block not found, resetting index", "new", idx, "old", *index)
}
txs := block.Transactions()
if len(txs) != 1 {
log.Error("Unexpected number of transactions in block: %d", len(txs))
log.Error("Unexpected number of transactions in block", "count", len(txs))
panic("Cannot recover OVM Context")
}
tx := txs[0]
s.SetLatestL1Timestamp(tx.L1Timestamp())
@@ -178,15 +178,18 @@ export class StateBatchSubmitter extends BatchSubmitter {

const nonce = await this.signer.getTransactionCount()
const contractFunction = async (gasPrice): Promise<TransactionReceipt> => {
this.logger.info('Submitting appendStateBatch transaction', {
gasPrice,
nonce,
contractAddr: this.chainContract.address,
})
const contractTx = await this.chainContract.appendStateBatch(
batch,
offsetStartsAtIndex,
{ nonce, gasPrice }
)
this.logger.info('Submitted appendStateBatch transaction', {
nonce,
txHash: contractTx.hash,
contractAddr: this.chainContract.address,
from: contractTx.from,
})
this.logger.debug('appendStateBatch transaction data', {
14 changes: 10 additions & 4 deletions packages/batch-submitter/src/batch-submitter/tx-batch-submitter.ts
@@ -143,14 +143,17 @@ export class TransactionBatchSubmitter extends BatchSubmitter {
const contractFunction = async (
gasPrice
): Promise<TransactionReceipt> => {
this.logger.info('Submitting appendQueueBatch transaction', {
gasPrice,
nonce,
contractAddr: this.chainContract.address,
})
const tx = await this.chainContract.appendQueueBatch(99999999, {
nonce,
gasPrice,
})
this.logger.info('Submitted appendQueueBatch transaction', {
nonce,
txHash: tx.hash,
contractAddr: this.chainContract.address,
from: tx.from,
})
this.logger.debug('appendQueueBatch transaction data', {
@@ -250,14 +253,17 @@ export class TransactionBatchSubmitter extends BatchSubmitter {

const nonce = await this.signer.getTransactionCount()
const contractFunction = async (gasPrice): Promise<TransactionReceipt> => {
this.logger.info('Submitting appendSequencerBatch transaction', {
gasPrice,
nonce,
contractAddr: this.chainContract.address,
})
const tx = await this.chainContract.appendSequencerBatch(batchParams, {
nonce,
gasPrice,
})
this.logger.info('Submitted appendSequencerBatch transaction', {
nonce,
txHash: tx.hash,
contractAddr: this.chainContract.address,
from: tx.from,
})
this.logger.debug('appendSequencerBatch transaction data', {
1 change: 0 additions & 1 deletion packages/contracts/test/helpers/test-runner/test.types.ts
@@ -213,7 +213,6 @@ export const isTestStep_Context = (
'ovmCALLER',
'ovmNUMBER',
'ovmADDRESS',
'ovmNUMBER',
'ovmL1TXORIGIN',
'ovmTIMESTAMP',
'ovmGASLIMIT',
3 changes: 3 additions & 0 deletions packages/data-transport-layer/package.json
@@ -29,7 +29,9 @@
"@sentry/node": "^6.3.1",
"@sentry/tracing": "^6.3.1",
"@types/express": "^4.17.11",
"axios": "^0.21.1",
"bcfg": "^0.1.6",
"bfj": "^7.0.2",
"browser-or-node": "^1.3.0",
"cors": "^2.8.5",
"dotenv": "^8.2.0",
@@ -49,6 +51,7 @@
"@types/levelup": "^4.3.0",
"@types/mocha": "^8.2.2",
"@types/node-fetch": "^2.5.8",
"@types/workerpool": "^6.0.0",
"chai": "^4.3.4",
"chai-as-promised": "^7.1.1",
"hardhat": "^2.2.1",
@@ -20,6 +20,7 @@ import {
import {
SEQUENCER_ENTRYPOINT_ADDRESS,
SEQUENCER_GAS_LIMIT,
parseSignatureVParam,
} from '../../../utils'

export const handleEventsSequencerBatchAppended: EventHandlerSet<
@@ -76,7 +77,7 @@ export const handleEventsSequencerBatchAppended: EventHandlerSet<
batchExtraData: batchSubmissionEvent.args._extraData,
}
},
parseEvent: (event, extraData) => {
parseEvent: (event, extraData, l2ChainId) => {
const transactionEntries: TransactionEntry[] = []

// It's easier to deal with this data if it's a Buffer.
@@ -103,7 +104,8 @@
)

const decoded = maybeDecodeSequencerBatchTransaction(
sequencerTransaction
sequencerTransaction,
l2ChainId
)

transactionEntries.push({
@@ -234,7 +236,8 @@ const parseSequencerBatchTransaction = (
}

const maybeDecodeSequencerBatchTransaction = (
transaction: Buffer
transaction: Buffer,
l2ChainId: number
): DecodedSequencerBatchTransaction | null => {
try {
const decodedTx = ethers.utils.parseTransaction(transaction)
@@ -247,7 +250,7 @@ const maybeDecodeSequencerBatchTransaction = (
target: toHexString(decodedTx.to), // Maybe null this out for creations?
data: toHexString(decodedTx.data),
sig: {
v: BigNumber.from(decodedTx.v).toNumber(),
v: parseSignatureVParam(decodedTx.v, l2ChainId),
r: toHexString(decodedTx.r),
s: toHexString(decodedTx.s),
},
13 changes: 11 additions & 2 deletions packages/data-transport-layer/src/services/l1-ingestion/service.ts
@@ -3,6 +3,7 @@ import { fromHexString, EventArgsAddressSet } from '@eth-optimism/core-utils'
import { BaseService } from '@eth-optimism/common-ts'
import { JsonRpcProvider } from '@ethersproject/providers'
import { LevelUp } from 'levelup'
import { ethers, constants } from 'ethers'

/* Imports: Internal */
import { TransportDB } from '../../db/transport-db'
@@ -18,7 +19,6 @@ import { handleEventsTransactionEnqueued } from './handlers/transaction-enqueued
import { handleEventsSequencerBatchAppended } from './handlers/sequencer-batch-appended'
import { handleEventsStateBatchAppended } from './handlers/state-batch-appended'
import { L1DataTransportServiceOptions } from '../main/service'
import { constants } from 'ethers'

export interface L1IngestionServiceOptions
extends L1DataTransportServiceOptions {
@@ -65,6 +65,7 @@ export class L1IngestionService extends BaseService<L1IngestionServiceOptions> {
contracts: OptimismContracts
l1RpcProvider: JsonRpcProvider
startingL1BlockNumber: number
l2ChainId: number
} = {} as any

protected async _init(): Promise<void> {
@@ -114,6 +115,10 @@ export class L1IngestionService extends BaseService<L1IngestionServiceOptions> {
this.options.addressManager
)

this.state.l2ChainId = ethers.BigNumber.from(
await this.state.contracts.OVM_ExecutionManager.ovmCHAINID()
).toNumber()

const startingL1BlockNumber = await this.state.db.getStartingL1Block()
if (startingL1BlockNumber) {
this.state.startingL1BlockNumber = startingL1BlockNumber
@@ -295,7 +300,11 @@ export class L1IngestionService extends BaseService<L1IngestionServiceOptions> {
event,
this.state.l1RpcProvider
)
const parsedEvent = await handlers.parseEvent(event, extraData)
const parsedEvent = await handlers.parseEvent(
event,
extraData,
this.state.l2ChainId
)
await handlers.storeEvent(parsedEvent, this.state.db)
}

@@ -13,6 +13,7 @@ import {
padHexString,
SEQUENCER_ENTRYPOINT_ADDRESS,
SEQUENCER_GAS_LIMIT,
parseSignatureVParam,
} from '../../../utils'

export const handleSequencerBlock = {
@@ -43,7 +44,7 @@ export const handleSequencerBlock = {
if (transaction.queueOrigin === 'sequencer') {
const decodedTransaction: DecodedSequencerBatchTransaction = {
sig: {
v: BigNumber.from(transaction.v).toNumber() - 2 * chainId - 35,
v: parseSignatureVParam(transaction.v, chainId),
r: padHexString(transaction.r, 32),
s: padHexString(transaction.s, 32),
},
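The `parseSignatureVParam` helper adopted in the hunk above replaces the inline `v - 2 * chainId - 35` arithmetic shown in the removed line. A hedged sketch of what such a helper plausibly does, based only on the arithmetic visible in this diff — the real implementation lives in `@eth-optimism/core-utils` and may handle more cases:

```typescript
// Hedged sketch, not the @eth-optimism/core-utils implementation.
// Per EIP-155, a signed transaction's v value is yParity + 35 + 2 * chainId,
// so the 0/1 parity bit is recovered by inverting that encoding.
export const parseSignatureVParam = (
  v: number | string | bigint,
  l2ChainId: number
): number => Number(v) - 2 * l2ChainId - 35
```

Factoring this out means the chain-ID-dependent decoding lives in one place instead of being repeated in each event handler.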
32 changes: 27 additions & 5 deletions packages/data-transport-layer/src/services/l2-ingestion/service.ts
@@ -3,6 +3,8 @@ import { BaseService } from '@eth-optimism/common-ts'
import { JsonRpcProvider } from '@ethersproject/providers'
import { BigNumber } from 'ethers'
import { LevelUp } from 'levelup'
import axios from 'axios'
import bfj from 'bfj'

/* Imports: Internal */
import { TransportDB } from '../../db/transport-db'
@@ -168,11 +170,31 @@ export class L2IngestionService extends BaseService<L2IngestionServiceOptions> {
)
})
} else {
blocks = await this.state.l2RpcProvider.send('eth_getBlockRange', [
toRpcHexString(startBlockNumber),
toRpcHexString(endBlockNumber),
true,
])
// This request returns a large response. Parsing it into JSON inside the ethers library is
// quite slow, and can block the event loop for upwards of multiple seconds. When this happens,
// incoming http requests will likely timeout and fail.
// Instead, we will parse the incoming http stream directly with the bfj package, which yields
// the event loop periodically so that we don't fail to serve requests.
const req = {
jsonrpc: '2.0',
method: 'eth_getBlockRange',
params: [
toRpcHexString(startBlockNumber),
toRpcHexString(endBlockNumber),
true,
],
id: '1',
}

const resp = await axios.post(
this.state.l2RpcProvider.connection.url,
req,
{ responseType: 'stream' }
)
const respJson = await bfj.parse(resp.data, {
yieldRate: 4096, // this yields a bit more often than the default of 16384
})
blocks = respJson.data
}

for (const block of blocks) {
@@ -20,7 +20,8 @@

export type ParseEventHandler<TEventArgs, TExtraData, TParsedEvent> = (
event: TypedEthersEvent<TEventArgs>,
extraData: TExtraData
extraData: TExtraData,
l2ChainId: number
) => TParsedEvent

export type StoreEventHandler<TParsedEvent> = (
