refactor(core): more readability for addBatchFunc #2898
Conversation
React demo playground: https://livecodes.io?x=id/FWY3GN333 (see documentation for usage instructions)
Force-pushed 9024f9e to a814e0e
I thought this would decrease the bundle size, but why does it increase? Size Change: +205 B (+0.22%), Total Size: 92.2 kB
Force-pushed a814e0e to af76e78
Force-pushed af76e78 to 3f3edde
Force-pushed 3f3edde to 57f8b55
Force-pushed 57f8b55 to a0d661b
Force-pushed a0d661b to 6f9682d
Force-pushed 6f9682d to c393cbf
src/vanilla/store.ts (Outdated)

@@ -163,30 +163,24 @@ const addDependency = <Value>(
 // Batch
 //

-type BatchPriority = 'H' | 'M' | 'L'
+type BatchPriority = 0 | 1 | 2
I think we can use numbers as object keys (internally they will be converted into strings).
So, my proposal is:
type Batch = Readonly<{
0: Set<() => void>
1: Set<() => void>
2: Set<() => void>
D: Map<AnyAtom, Set<AnyAtom>>
}>
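A quick, self-contained sketch of the numeric-key behavior mentioned above (hypothetical names, not the PR's code): numeric object keys are coerced to strings, and integer-like keys enumerate in ascending order, which is what makes a numeric `BatchPriority` workable as an object key.

```typescript
// Numeric object keys are coerced to strings, so batch[0] and
// batch['0'] address the same property.
type BatchPriority = 0 | 1 | 2

const batch: Record<BatchPriority, Set<() => void>> = {
  0: new Set(),
  1: new Set(),
  2: new Set(),
}

batch[0].add(() => console.log('high-priority task'))

// Integer-like string keys enumerate in ascending numeric order.
console.log(batch[0] === (batch as any)['0']) // true
console.log(Object.keys(batch)) // ['0', '1', '2']
```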
https://github.com/pmndrs/jotai/pull/2898/files#r1900329774
If we were to generalize the batch queues, I might consider a single ordered queue. Not sure about the performance or the bundle size, though.
A queue of queues might be nice. We wouldn't need explicit placeholders defined in the store for third-party applications.
It's a little more in bundle size, though.
Rough:
const batch = (() => {
const H = new Set()
const M = new Set()
const L = new Set()
const queue = [H, M, L]
const D = new Map()
const batch = Object.assign(queue, { H, M, L, D })
return batch
})()
...
const syncEffect = new Set()
// schedule sync effect after H, but before M
const hIndex = batch.findIndex(element => element === batch.H)
batch.splice(hIndex + 1, 0, syncEffect)
I meant a single combined queue. But don't take it seriously as it may have performance drawbacks.
For now, I think queue array is better. 8945774
I wonder if we could also allow "insert queue after..." capability.
In that case, I'd consider a single priority queue.
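A minimal sketch of what a single priority queue could look like (hypothetical shape, not the PR's actual code): tasks live in one array kept sorted by a numeric priority, so "insert a queue after another" reduces to picking an in-between number instead of splicing named sets.

```typescript
// Sketch: one combined queue ordered by a numeric priority.
// Lower numbers run first; fractional priorities allow inserting
// between existing levels (e.g. 0.5 runs after 0 but before 1).
type Task = { priority: number; fn: () => void }

const createPriorityQueue = () => {
  const tasks: Task[] = []
  return {
    add(priority: number, fn: () => void) {
      // Insert keeping the array sorted by priority (stable for ties).
      let i = tasks.length
      while (i > 0 && tasks[i - 1].priority > priority) i--
      tasks.splice(i, 0, { priority, fn })
    },
    flush() {
      while (tasks.length) tasks.shift()!.fn()
    },
  }
}

const q = createPriorityQueue()
const order: string[] = []
q.add(1, () => order.push('medium'))
q.add(0, () => order.push('high'))
q.add(0.5, () => order.push('syncEffect')) // "after high, before medium"
q.add(2, () => order.push('low'))
q.flush()
console.log(order) // ['high', 'syncEffect', 'medium', 'low']
```

As the comment above notes, the per-insert sort cost is a possible performance drawback compared to three fixed sets.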
Force-pushed dd86bde to 2b5737e
Force-pushed bf39b1f to 82e4389
Force-pushed 82e4389 to fed35ef
Force-pushed fed35ef to 014b4e3
Force-pushed 014b4e3 to b9b6867
Force-pushed b9b6867 to c43739e
Force-pushed c43739e to f2ac804
Using an iterable batch is still under consideration. However, I'm leaning towards it.
const createBatch = (): Batch => {
  const batch = { D: new Map() } as {
    D: Map<AnyAtom, Set<AnyAtom>>
    [BATCH_PRIORITY_HIGH]: Set<() => void>
    [BATCH_PRIORITY_MEDIUM]: Set<() => void>
    [BATCH_PRIORITY_LOW]: Set<() => void>
  }
  batch[BATCH_PRIORITY_HIGH] = new Set()
  batch[BATCH_PRIORITY_MEDIUM] = new Set()
  batch[BATCH_PRIORITY_LOW] = new Set()
  return batch
}
Summary
Adds an iterator to batch to make flushBatch agnostic to the internals of batch. Edit: just refactoring for now, delaying the decision.
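The iterable-batch idea summarized above could be sketched roughly like this (an illustration under assumptions, not the actual store code): the batch exposes `Symbol.iterator` over its priority sets, so a flush loop never needs to know how many priority levels exist or what they are called.

```typescript
// Sketch: a batch whose priority sets are reachable via iteration,
// so the flush loop is agnostic to the individual levels.
type Listener = () => void

const createBatch = () => {
  const high = new Set<Listener>()
  const medium = new Set<Listener>()
  const low = new Set<Listener>()
  const queues = [high, medium, low]
  return {
    high,
    medium,
    low,
    // Iterating the batch yields its priority sets in order.
    [Symbol.iterator]: () => queues[Symbol.iterator](),
  }
}

const flushBatch = (batch: Iterable<Set<Listener>>) => {
  // No knowledge of how many priority levels the batch defines.
  for (const queue of batch) {
    queue.forEach((fn) => fn())
    queue.clear()
  }
}

const batch = createBatch()
const calls: string[] = []
batch.medium.add(() => calls.push('M'))
batch.high.add(() => calls.push('H'))
batch.low.add(() => calls.push('L'))
flushBatch(batch)
console.log(calls) // ['H', 'M', 'L']
```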
Check List
- pnpm run prettier for formatting code and docs