In this talk we will revisit the iteration protocols offered by JavaScript (iterable, iterator, async iterable and async iterator) and explore generators and async generators as well. We will use this opportunity to discuss some interesting applications of these patterns in Node.js and compare them with more traditional approaches such as event emitters and Node.js streams. Finally, we will look at how the traditional approaches can be integrated with iterators, highlighting patterns and anti-patterns.
19. Let me introduce myself...
I'm Luciano (🍕🍝) 👋
Senior Architect
Co-Author of Node.js Design Patterns 👉 nodejsdp.link
Connect with me:
loige.co (blog)
@loige (twitter)
loige (twitch)
lmammino (github)
20. We are business-focused technologists that deliver.
Accelerated Serverless | AI as a Service | Platform Modernisation
WE ARE HIRING: Do you want to work with us?
21. Iteration protocols
What? Why? 🤔
An attempt at standardizing "iteration" behaviors by providing a consistent and interoperable API.
Used by for...of, for await...of and the spread operator.
You can create lazy iterators.
You can deal with async iteration.
You can create your own custom iterators/iterables.
50. Iterator protocol
In JavaScript, an object is an iterator if it has a next() method. Every time you call it, it returns an object with the keys done (boolean) and value.
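To make the protocol concrete, here's a minimal sketch (not from the slides): a hypothetical countdown iterator built with a factory function.

function createCountdown (from) {
  let nextVal = from
  return {
    next () {
      if (nextVal < 0) {
        return { done: true, value: undefined }
      }
      return { done: false, value: nextVal-- }
    }
  }
}

const countdown = createCountdown(3)
console.log(countdown.next()) // { done: false, value: 3 }
console.log(countdown.next()) // { done: false, value: 2 }
console.log(countdown.next()) // { done: false, value: 1 }
console.log(countdown.next()) // { done: false, value: 0 }
console.log(countdown.next()) // { done: true, value: undefined }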
59. Iterable protocol
An object is iterable if it implements the @@iterator* method, a zero-argument function that returns an iterator.
* Symbol.iterator
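A minimal sketch of the iterable protocol, reusing the countdown idea above: implementing Symbol.iterator is what makes the object consumable with for...of and the spread operator.

const countdown = {
  [Symbol.iterator] () {
    let nextVal = 3
    return {
      next () {
        if (nextVal < 0) {
          return { done: true, value: undefined }
        }
        return { done: false, value: nextVal-- }
      }
    }
  }
}

for (const value of countdown) {
  console.log(value) // 3, 2, 1, 0
}
console.log([...countdown]) // [3, 2, 1, 0] (a fresh iterator is created per consumption)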
80. Async Iterator protocol
An object is an async iterator if it has a next() method. Every time you call it, it returns a promise that resolves to an object with the keys done (boolean) and value.
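A minimal sketch of an async iterator (illustrative, not from the slides): the delay simulates data arriving over time, using the promise-based setTimeout from Node.js timers/promises. Run it as an ES module so top-level await is available.

import { setTimeout as delay } from 'timers/promises'

function createAsyncCountdown (from, delayMs = 1000) {
  let nextVal = from
  return {
    async next () {
      await delay(delayMs) // simulate waiting for the next value
      if (nextVal < 0) {
        return { done: true, value: undefined }
      }
      return { done: false, value: nextVal-- }
    }
  }
}

const countdown = createAsyncCountdown(3)
console.log(await countdown.next()) // { done: false, value: 3 } (after ~1s)
console.log(await countdown.next()) // { done: false, value: 2 }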
91. Async Iterable protocol
An object is an async iterable if it implements the @@asyncIterator* method, a zero-argument function that returns an async iterator.
* Symbol.asyncIterator
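And a minimal sketch of an async iterable, assuming the createAsyncCountdown factory from the previous sketch: implementing Symbol.asyncIterator is what makes it consumable with for await...of.

const asyncCountdown = {
  [Symbol.asyncIterator] () {
    return createAsyncCountdown(3)
  }
}

for await (const value of asyncCountdown) {
  console.log(value) // 3, 2, 1, 0 (one value per second)
}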
106. When to use async iterators
Sequential iteration pattern: data arrives in order over time, and you need to complete processing the current "chunk" before you can request the next one.
Examples: paginated iteration, consuming tasks from a remote queue (a sketch of the paginated case follows).
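A hedged sketch of the paginated case using an async generator. The endpoint, its ?after= cursor parameter and the { items, cursor } response shape are all hypothetical; the global fetch requires Node.js >= 18.

async function * paginatedList (baseUrl) {
  let cursor = null
  do {
    const url = cursor ? `${baseUrl}?after=${cursor}` : baseUrl
    const response = await fetch(url)
    const page = await response.json() // hypothetical shape: { items, cursor }
    for (const item of page.items) {
      yield item // the next page is requested only once these are consumed
    }
    cursor = page.cursor
  } while (cursor)
}

for await (const item of paginatedList('https://api.example.com/products')) {
  console.log(item)
}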
117. import { createReadStream } from 'fs'
import { once } from 'events'

const sourceStream = createReadStream('bigdata.csv')
const destStream = new SlowTransform()

for await (const chunk of sourceStream) {
  const canContinue = destStream.write(chunk)
  if (!canContinue) {
    // backpressure, now we stop and we need to wait for drain
    await once(destStream, 'drain')
    // ok now it's safe to resume writing
  }
}
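SlowTransform isn't defined on the slide. A plausible sketch, assuming it is simply a Transform stream that takes some time per chunk, which is what makes write() return false and triggers the backpressure handling above:

import { Transform } from 'stream'
import { setTimeout as delay } from 'timers/promises'

class SlowTransform extends Transform {
  async _transform (chunk, _encoding, callback) {
    await delay(100) // simulate slow processing so the internal buffer fills up
    callback(null, chunk) // pass the chunk through unchanged
  }
}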