Hello,

I am trying to index contracts on arbitrum-goerli that are added to the subgraph dynamically using templates. Once more than 1000 contracts have been added, I get the following warning on the hosted service and the subgraph halts indexing:

Trying again after eth_getLogs RPC call for block range: [25628130..25628130] failed (attempt #10) with result Err(Rpc(Error { code: InvalidParams, message: "1001 addresses specified in query, but only 1000 are allowed", data: None })), component: BlockStream
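For context, graph-node appears to put the addresses of all dynamically created data sources into a single eth_getLogs filter per block range, which is the request the node rejects once the list exceeds 1000 entries. Below is a minimal sketch of what such a JSON-RPC call looks like; the RPC endpoint URL and the address list are placeholders, not values from my subgraph:

```typescript
// Sketch of the eth_getLogs JSON-RPC request shape.
// The node rejects it once the `address` array exceeds 1000 entries.
const addresses: string[] = [/* one entry per dynamically created data source */];

const payload = {
  jsonrpc: "2.0",
  id: 1,
  method: "eth_getLogs",
  params: [
    {
      fromBlock: "0x1870de2", // 25628130, the failing block from the log above
      toBlock: "0x1870de2",
      address: addresses,     // 1001 addresses -> "only 1000 are allowed"
    },
  ],
};

// Send the request to the chain's RPC endpoint (placeholder URL):
fetch("https://arbitrum-goerli.example-rpc.io", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload),
})
  .then((res) => res.json())
  .then(console.log);
```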
See my deployed subgraph here. Also, every time a new contract is added via a template, a single PotentialNonBlockingLzApp entity is created. To query them, use:
```graphql
{
  potentialNonBlockingLzApps(skip: 1000, first: 1000) {
    id
  }
}
```
The result confirms that my subgraph has indeed indexed 1001 of these entities.
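For completeness, this is roughly how the dynamic data sources get created in the mapping. It is a simplified sketch rather than my exact handler: the template name NonBlockingLzApp, the source name Registry, the event AppRegistered, and its app parameter are placeholders; only the PotentialNonBlockingLzApp entity name matches the schema above.

```typescript
// AssemblyScript mapping sketch (TypeScript-like syntax).
import { NonBlockingLzApp } from "../generated/templates";
import { PotentialNonBlockingLzApp } from "../generated/schema";
import { AppRegistered } from "../generated/Registry/Registry";

export function handleAppRegistered(event: AppRegistered): void {
  // Start indexing the newly discovered contract via the template.
  NonBlockingLzApp.create(event.params.app);

  // Record one tracking entity per dynamically added contract.
  let entity = new PotentialNonBlockingLzApp(event.params.app.toHexString());
  entity.save();
}
```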
I assume the arbitrum-goerli node used under the hood does not allow eth_getLogs calls with more than 1000 addresses at once. I have deployed the same subgraph on fantom-testnet, fuji, and mumbai, and it syncs without any issues.
As I don't see this issue on other chains, I assume it is a bug.