Deduplicate triggers #4055
Changes from all commits: b761633, 7c2b800, 7147caf, 4c43d45, 3525da2, 187ef86
```diff
@@ -173,9 +173,31 @@ where
 }

 impl<C: Blockchain> BlockWithTriggers<C> {
-    pub fn new(block: C::Block, mut trigger_data: Vec<C::TriggerData>) -> Self {
+    /// Creates a BlockWithTriggers structure, which holds
+    /// the trigger data ordered and without any duplicates.
+    pub fn new(block: C::Block, mut trigger_data: Vec<C::TriggerData>, logger: &Logger) -> Self {
+        // This is where triggers get sorted.
+        trigger_data.sort();
+
+        let old_len = trigger_data.len();
+
+        // This is removing the duplicate triggers in the case of multiple
+        // data sources fetching the same event/call/etc.
+        trigger_data.dedup();
+
+        let new_len = trigger_data.len();
+
+        if new_len != old_len {
+            debug!(
+                logger,
+                "Trigger data had duplicate triggers";
+                "block_number" => block.number(),
+                "block_hash" => block.hash().hash_hex(),
+                "old_length" => old_len,
+                "new_length" => new_len,
+            );
+        }
+
         Self {
             block,
             trigger_data,
```

Review thread on the new `new` signature:

Reviewer: This to me sounds like a lot of logic for a `new`.

Author: I just added a doc comment, what do you think?

Reviewer: I don't feel strongly about it, but I feel like reading the code you would not expect `new` to do the sort + dedup etc. The doc helps, though; I'll leave the rest up to you.
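To make the sort-then-dedup behaviour concrete, here is a minimal, self-contained Rust sketch of the same pattern. The `TriggerData` struct and `sort_and_dedup` function here are simplified stand-ins, not the actual graph-node types, and the plain `eprintln!` stands in for the structured `debug!` log in the PR.

```rust
// Simplified stand-in for a chain trigger; only the ordering/equality
// behaviour matters for the dedup logic (hypothetical type, not the
// real chain-specific TriggerData).
#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord)]
struct TriggerData {
    log_index: u64,
    event: String,
}

/// Sorts the triggers, drops duplicates, and reports how many were removed.
/// `Vec::dedup` only removes *consecutive* duplicates, so the sort must run first.
fn sort_and_dedup(trigger_data: &mut Vec<TriggerData>) -> usize {
    trigger_data.sort();
    let old_len = trigger_data.len();
    trigger_data.dedup();
    old_len - trigger_data.len()
}

fn main() {
    let mut triggers = vec![
        TriggerData { log_index: 7, event: "Transfer".into() },
        TriggerData { log_index: 3, event: "Approval".into() },
        // Duplicate produced when two data sources match the same event.
        TriggerData { log_index: 7, event: "Transfer".into() },
    ];

    let removed = sort_and_dedup(&mut triggers);
    if removed > 0 {
        // The PR logs this case at debug level, with block number and hash.
        eprintln!("trigger data had {removed} duplicate trigger(s)");
    }
    assert_eq!(triggers.len(), 2);
}
```

Sorting before deduplicating is what guarantees that all duplicates become adjacent and are removed, no matter how many data sources produced them.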
Review thread on the note about network impact:

Reviewer: Since this is mentioning the network impact, it's important to note any impact to indexers and any actions they should take.

Author: Awesome, thanks for the feedback, I'll address it 🙂

Author: I just pushed what they should do, what do you think? @leoyvens

Reviewer: lgtm