IT'S ABOUT TIME TO EMBRACE STREAMS
Luciano Mammino (@loige)
05/03/2019
loige.link/streams-dub
code: loige.link/streams-examples
HELLO, I AM LUCIANO!
Cloud Architect

Let's connect!
Blog: loige.co
Twitter: @loige
GitHub: @lmammino

loige.link/node-patterns (with @mariocasciaro)
fstack.link (with @andreaman87)
AGENDA
01. Buffers VS Streams
02. Stream types & APIs
03. Pipe()
04. Stream utilities
05. Writing custom streams
06. Streams in the Browser
01. BUFFERS VS STREAMS
BUFFER: DATA STRUCTURE TO STORE AND TRANSFER ARBITRARY BINARY DATA
* Note that this slide's example loads all the content of the file in memory
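A minimal sketch of such an example (the file name is illustrative):

// buffer-example.js - minimal sketch: the whole file ends up in memory
const { readFileSync } = require('fs')

const buffer = readFileSync('assets/poster.psd') // returns a Buffer
console.log(buffer.length) // file size in bytes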
STREAM: ABSTRACT INTERFACE FOR WORKING WITH STREAMING DATA
* It does not load all the data straight away
LET'S SOLVE THIS PROBLEM
1. Read the content of a file
2. Copy it to another file*

* cp in Node.js
THE BUFFER WAY

// buffer-copy.js
const { readFileSync, writeFileSync } = require('fs')

const [,, src, dest] = process.argv
const content = readFileSync(src)
writeFileSync(dest, content)
THE STREAM WAY

// stream-copy.js
const {
  createReadStream,
  createWriteStream
} = require('fs')

const [,, src, dest] = process.argv
const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)
srcStream.on('data', (data) => destStream.write(data))

* Careful: this implementation is not optimal
MEMORY COMPARISON (~600MB FILE)
node --inspect-brk buffer-copy.js assets/poster.psd ~/Downloads/poster.psd
MEMORY COMPARISON (~600MB FILE)
node --inspect-brk stream-copy.js assets/poster.psd ~/Downloads/poster.psd
LET'S TRY WITH A BIG FILE (~10GB)
(the buffer version fails here: ~10GB does not fit in a single Node.js Buffer)

node --inspect-brk stream-copy.js assets/the-matrix-hd.mkv ~/Downloads/the-matrix-hd.mkv
IF BYTES WERE BLOCKS...

BIG BUFFER APPROACH

STREAMING APPROACH
STREAMS VS BUFFERS
Streams keep a low memory footprint even with large amounts of data
Streams allow you to process data as soon as it arrives
Stream processing generally does not block the event loop
02. STREAM TYPES & APIS
ALL STREAMS ARE EVENT EMITTERS
A stream instance is an object that emits events when its internal state changes, for instance:

s.on('readable', () => {}) // ready to be consumed
s.on('data', (chunk) => {}) // new data is available
s.on('error', (err) => {}) // some error happened
s.on('end', () => {}) // no more data available

The available events depend on the type of stream
WRITABLE STREAMS
A writable stream is an abstraction that allows you to write data to a destination

Examples:
fs writeStream
process.stdout, process.stderr
HTTP request (client-side)
HTTP response (server-side)
AWS S3 PutObject (body parameter)
WRITABLE STREAMS - METHODS
writable.write(chunk, [encoding], [callback])
writable.end([chunk], [encoding], [callback])
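A minimal sketch of these two methods on an fs write stream (the file name is illustrative):

// writable-methods-example.js - minimal sketch
const { createWriteStream } = require('fs')

const out = createWriteStream('notes.txt')
out.write('first line\n') // returns false when the internal buffer is full
out.end('last line\n', () => console.log('all writes are flushed'))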
WRITABLE STREAMS - EVENTS
writable.on('drain')
writable.on('close')
writable.on('finish')
writable.on('error', (err) => {})
// writable-http-request.js
const http = require('http')

const req = http.request(
  {
    hostname: 'enx6b07hdu6cs.x.pipedream.net',
    method: 'POST'
  },
  resp => {
    console.log(`Server responded with "${resp.statusCode}"`)
  }
)

req.on('finish', () => console.log('request sent'))
req.on('close', () => console.log('Connection closed'))
req.on('error', err => console.error(`Request failed: ${err}`))

req.write('writing some content...\n')
req.end('last write & close the stream')

loige.link/writable-http-req
BACKPRESSURE
When writing large amounts of data you should make sure you handle the stop write signal and the drain event

loige.link/backpressure
// stream-copy-safe.js
const { createReadStream, createWriteStream } = require('fs')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    // we are overflowing the destination, we should pause
    srcStream.pause()
    // we will resume when the destination stream is drained
    destStream.once('drain', () => srcStream.resume())
  }
})
READABLE STREAMS
A readable stream represents a source from which data is consumed.

Examples:
fs readStream
process.stdin
HTTP response (client-side)
HTTP request (server-side)
AWS S3 GetObject (data field)

It supports two modes for data consumption: flowing and paused (or non-flowing) mode.
READABLE STREAMS - METHODS
readable.read([size])
readable.pause()
readable.resume()
READABLE STREAMS - EVENTS
readable.on('readable')
readable.on('data', (chunk) => {})
readable.on('end')
readable.on('error', (err) => {})
READABLE STREAMS - FLOWING MODE
Data is read from source automatically and chunks are emitted as soon as they are available.

[Animated diagram: a readable stream in flowing mode reads chunks 1, 2, 3 from the source data and emits each one to the attached data listener with a data event; when no more data is available, end is emitted.]
// count-emojis-flowing.js
const { createReadStream } = require('fs')
const { EMOJI_MAP } = require('emoji') // from npm

const emojis = Object.keys(EMOJI_MAP)
const file = createReadStream(process.argv[2])
let counter = 0

file.on('data', chunk => {
  for (let char of chunk.toString('utf8')) {
    if (emojis.includes(char)) {
      counter++
    }
  }
})
file.on('end', () => console.log(`Found ${counter} emojis`))
file.on('error', err => console.error(`Error reading file: ${err}`))

loige.link/st_patrick
READABLE STREAMS - PAUSED MODE
A consumer has to call the read method explicitly to read chunks of data from the stream. The stream sends a readable event to signal that new data is available.

[Animated diagram: a readable stream in paused mode reads chunks 1, 2, 3 from the source data into an internal buffer and emits readable; nothing happens until the consumer decides to read the data. Each read() call pulls one chunk from the internal buffer; using read() with an empty buffer will return null (stop reading signal). When the source is exhausted, end is emitted.]
// count-emojis-paused.js
const { createReadStream } = require('fs')
const { EMOJI_MAP } = require('emoji') // from npm

const emojis = Object.keys(EMOJI_MAP)
const file = createReadStream(process.argv[2])
let counter = 0

file.on('readable', () => {
  let chunk
  while ((chunk = file.read()) !== null) {
    for (let char of chunk.toString('utf8')) {
      if (emojis.includes(char)) {
        counter++
      }
    }
  }
})
file.on('end', () => console.log(`Found ${counter} emojis`))
file.on('error', err => console.error(`Error reading file: ${err}`))
READABLE STREAMS - MODE SWITCH CONDITIONS
All readable streams are created in paused mode
paused streams can be switched to flowing mode with:
  stream.on('data', () => {})
  stream.resume()
  stream.pipe()
flowing streams can switch back to paused with:
  stream.pause()
  stream.unpipe() for all attached streams
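A minimal sketch of these switches in practice (the timing is illustrative):

// mode-switch-example.js - minimal sketch
const { createReadStream } = require('fs')

const stream = createReadStream(process.argv[2]) // created in paused mode
stream.on('data', chunk => console.log(chunk.length)) // switches to flowing
stream.pause() // back to paused
setTimeout(() => stream.resume(), 1000) // flowing again after one second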
READABLE STREAMS - FLOWING VS PAUSED
Push VS Pull mental models
Flowing is simpler to use
Paused gives you more control over how data is consumed from the source
Whichever you pick, stay consistent!
BONUS MODE
Readable streams are also Async Iterators (Node.js 10+)

Warning: still experimental
// count-emojis-async-iterator.js
const { createReadStream } = require('fs')
const { EMOJI_MAP } = require('emoji') // from npm

async function main () {
  const emojis = Object.keys(EMOJI_MAP)
  const file = createReadStream(process.argv[2])
  let counter = 0
  for await (let chunk of file) {
    for (let char of chunk.toString('utf8')) {
      if (emojis.includes(char)) {
        counter++
      }
    }
  }
  console.log(`Found ${counter} emojis`)
}

main()
Still experimental in Node.js 11
If you like this API and don't want to rely on an experimental core feature:
2ality.com/2018/04/async-iter-nodejs.html
github.com/lorenzofox3/for-await
OBJECT MODE
Readable streams can emit objects if this mode is enabled

// readable-timer.js
const { Readable } = require('stream')

const timerStream = new Readable({
  objectMode: true,
  read () {
    this.push(new Date()) // this is an object
  }
})

timerStream.on('data', (currentDate) => {
  // prints the current second
  console.log(currentDate.getSeconds())
})
OTHER TYPES OF STREAM
Duplex Stream
streams that are both Readable and Writable.
(net.Socket)

Transform Stream
Duplex streams that can modify or transform the data as it is written and read.
(zlib.createGzip(), crypto.createCipheriv())
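A minimal sketch of a Duplex stream in action, using a TCP socket (host and request are illustrative):

// duplex-socket-example.js - minimal sketch: net.Socket is a Duplex
const net = require('net')

const socket = net.connect(80, 'example.com')
socket.write('GET / HTTP/1.1\r\nHost: example.com\r\n\r\n') // writable side
socket.on('data', chunk => console.log(chunk.toString())) // readable side
socket.on('end', () => console.log('connection closed'))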
ANATOMY OF A TRANSFORM STREAM
1. write data (writable stream side)
2. transform the data
3. read transformed data (readable stream side)
GZIP EXAMPLE
[Same diagram, with createGzip() as the transform stream:]
1. write uncompressed data (writable stream side)
2. compress the data
3. read compressed data (readable stream side)
// stream-copy-gzip.js
const {
  createReadStream,
  createWriteStream
} = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = gzipStream.write(data)
  if (!canContinue) {
    srcStream.pause()
    gzipStream.once('drain', () => {
      srcStream.resume()
    })
  }
})
srcStream.on('end', () => {
  // check if there's buffered data left
  const remainingData = gzipStream.read()
  if (remainingData !== null) {
    destStream.write(remainingData)
  }
  gzipStream.end()
})

gzipStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    gzipStream.pause()
    destStream.once('drain', () => {
      gzipStream.resume()
    })
  }
})
gzipStream.on('end', () => {
  destStream.end()
})
// TODO: handle errors! >:)
03. PIPE()
readable.pipe(writableDest)

Connects a readable stream to a writable stream
A transform stream can be used as a destination as well
It returns the destination stream allowing for a chain of pipes

readable
  .pipe(transform1)
  .pipe(transform2)
  .pipe(transform3)
  .pipe(writable)
// stream-copy-gzip-pipe.js
const {
  createReadStream,
  createWriteStream
} = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream
  .pipe(gzipStream)
  .pipe(destStream)
Setup complex pipelines with pipe
This is the most common way to use streams

readable
  .pipe(decompress)
  .pipe(decrypt)
  .pipe(convert)
  .pipe(encrypt)
  .pipe(compress)
  .pipe(writeToDisk)
Handling errors (correctly)

readable
  .on('error', handleErr)
  .pipe(decompress)
  .on('error', handleErr)
  .pipe(decrypt)
  .on('error', handleErr)
  .pipe(convert)
  .on('error', handleErr)
  .pipe(encrypt)
  .on('error', handleErr)
  .pipe(compress)
  .on('error', handleErr)
  .pipe(writeToDisk)
  .on('error', handleErr)

handleErr should end and destroy the streams
(it doesn't happen automatically)
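A minimal sketch of what handleErr could look like (collecting the streams in an array is an assumption, not part of the original deck):

// handle-err-sketch.js - minimal sketch, assuming you keep a reference
// to every stream in the pipeline
function makeErrorHandler (...streams) {
  return function handleErr (err) {
    console.error(`Pipeline failed: ${err}`)
    // pipe() does not clean up on error, so destroy every stream ourselves
    streams.forEach(stream => stream.destroy())
  }
}
// usage sketch: const handleErr = makeErrorHandler(readable, decompress, /* ... */ writeToDisk)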
04. STREAM UTILITIES
stream.pipeline(...streams, callback)

// stream-copy-gzip-pipeline.js
const { pipeline } = require('stream')
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv

pipeline(
  createReadStream(src),
  createGzip(),
  createWriteStream(dest),
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
    console.log('Done!')
  }
)

Can pass multiple streams (they will be piped)
The last argument is a callback. If invoked with an error, it means the pipeline failed at some point.
All the streams are ended and destroyed correctly.
stream.pipeline is available in Node.js 10+
In older systems you can use pump - npm.im/pump

// stream-copy-gzip-pump.js
const pump = require('pump') // from npm
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv

pump( // just swap pipeline with pump!
  createReadStream(src),
  createGzip(),
  createWriteStream(dest),
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
    console.log('Done!')
  }
)
pumpify(...streams) - npm.im/pumpify
Create reusable pieces of pipeline

Let's create EncGz, an application that helps us to read and write encrypted-gzipped files
// encgz-stream.js - utility library
const {
  createCipheriv,
  createDecipheriv,
  randomBytes,
  createHash
} = require('crypto')
const { createGzip, createGunzip } = require('zlib')
const pumpify = require('pumpify') // from npm

// calculates md5 of the secret (trimmed)
function getCipherKey (secret) {
  // sketch of the elided body: the hex digest is 32 chars,
  // a valid aes256 key length
  return createHash('md5').update(secret.trim()).digest('hex')
}

function createEncgz (secret) {
  const initVect = randomBytes(16)
  const cipherKey = getCipherKey(secret)
  const encryptStream = createCipheriv('aes256', cipherKey, initVect)
  const gzipStream = createGzip()

  const stream = pumpify(encryptStream, gzipStream)
  stream.initVect = initVect

  return stream
}
// encgz-stream.js (...continue from previous slide)
function createDecgz (secret, initVect) {
  const cipherKey = getCipherKey(secret)
  const decryptStream = createDecipheriv('aes256', cipherKey, initVect)
  const gunzipStream = createGunzip()

  const stream = pumpify(gunzipStream, decryptStream)
  return stream
}

module.exports = {
  createEncgz,
  createDecgz
}
// encgz.js - CLI to encrypt and gzip (from stdin to stdout)
const { pipeline } = require('stream')
const { createEncgz } = require('./encgz-stream')

const [, , secret] = process.argv
const encgz = createEncgz(secret)
console.error(`init vector: ${encgz.initVect.toString('hex')}`)

pipeline(
  process.stdin,
  encgz,
  process.stdout,
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
  }
)
// decgz.js - CLI to gunzip and decrypt (from stdin to stdout)
const { pipeline } = require('stream')
const { createDecgz } = require('./encgz-stream')

const [, , secret, initVect] = process.argv
const decgz = createDecgz(secret, Buffer.from(initVect, 'hex'))

pipeline(
  process.stdin,
  decgz,
  process.stdout,
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
  }
)
stream.finished(streams, callback)
Get notified when a stream is no longer readable or writable - Node.js 10+

// finished can be promisified!
const util = require('util')
const stream = require('stream')
const fs = require('fs')
const finished = util.promisify(stream.finished)

const rs = fs.createReadStream('archive.tar')

async function run () {
  await finished(rs)
  console.log('Stream is done reading.')
}

run().catch(console.error)
rs.resume() // start & drain the stream
readable-stream - npm.im/readable-stream
npm package that contains the latest version of the Node.js stream library.
It also makes Node.js streams compatible with the browser (can be used with Webpack and Browserify)

* yeah, the name is misleading. The package offers all the functionality of the official 'stream' package, not just readable streams.
05. WRITING CUSTOM STREAMS
// emoji-stream.js (custom readable stream)
const { EMOJI_MAP } = require('emoji') // from npm
const { Readable } = require('readable-stream') // from npm

const emojis = Object.keys(EMOJI_MAP)

function getEmojiDescription (index) {
  return EMOJI_MAP[emojis[index]][1]
}

function getMessage (index) {
  return emojis[index] + ' ' + getEmojiDescription(index)
}

class EmojiStream extends Readable {
  constructor (options) {
    super(options)
    this._index = 0
  }

  _read () {
    if (this._index >= emojis.length) {
      return this.push(null)
    }
    return this.push(getMessage(this._index++))
  }
}

module.exports = EmojiStream
// uppercasify.js (custom transform stream)
const { Transform } = require('readable-stream')

class Uppercasify extends Transform {
  _transform (chunk, encoding, done) {
    this.push(chunk.toString().toUpperCase())
    done()
  }

  _flush (done) {
    // in case there's buffered data
    // that still has to be pushed
    done()
  }
}

module.exports = Uppercasify
// dom-append.js (custom writable stream)
const { Writable } = require('readable-stream')

class DOMAppend extends Writable {
  constructor (target, tag = 'p', options) {
    super(options)
    this._target = target
    this._tag = tag
  }

  _write (chunk, encoding, done) {
    const elem = document.createElement(this._tag)
    const content = document.createTextNode(chunk.toString())
    elem.appendChild(content)
    this._target.appendChild(elem)
    done()
  }
}

module.exports = DOMAppend
06. STREAMS IN THE BROWSER
// browser/app.js
const EmojiStream = require('../emoji-stream')
const Uppercasify = require('../uppercasify')
const DOMAppend = require('../dom-append')

const list = document.getElementById('list')
const emoji = new EmojiStream()
const uppercasify = new Uppercasify()
const append = new DOMAppend(list, 'li')

emoji
  .pipe(uppercasify)
  .pipe(append)
Let's use webpack to build this app for the browser

npm i --save-dev webpack webpack-cli
node_modules/.bin/webpack src/browser/app.js
# creates dist/main.js
mv dist/main.js src/browser/app.bundle.js
Finally let's create an index.html

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <meta
      name="viewport"
      content="width=device-width, initial-scale=1, shrink-to-fit=no"
    />
    <title>Streams in the browser!</title>
  </head>
  <body>
    <ul id="list"></ul>
    <script src="app.bundle.js"></script>
  </body>
</html>
CLOSING
TLDR;
Streams have low memory footprint
Process data as soon as it's available
Composition through pipelines
Define your logic as transform streams
Readable and writable streams are good abstractions for input & output
You can swap them as needed without having to change the logic
IF YOU WANT TO LEARN (EVEN) MOAR ABOUT STREAMS...
nodejs.org/api/stream.html
github.com/substack/stream-handbook
CREDITS
Cover Photo by WeRoad on Unsplash
emojiart.org for the amazing St. Patrick emoji art
The internet for the memes! :D

SPECIAL THANKS
@mariocasciaro, @machine_person, @Podgeypoos79, @katavic_d, @UrsoLuca

THANKS!
loige.link/streams-dub
It’s about time to embrace Node.js Streams - Austin Node.js meetup
 

More from Luciano Mammino

Did you know JavaScript has iterators? DublinJS
Did you know JavaScript has iterators? DublinJSDid you know JavaScript has iterators? DublinJS
Did you know JavaScript has iterators? DublinJSLuciano Mammino
 
What I learned by solving 50 Advent of Code challenges in Rust - RustNation U...
What I learned by solving 50 Advent of Code challenges in Rust - RustNation U...What I learned by solving 50 Advent of Code challenges in Rust - RustNation U...
What I learned by solving 50 Advent of Code challenges in Rust - RustNation U...Luciano Mammino
 
Building an invite-only microsite with Next.js & Airtable - ReactJS Milano
Building an invite-only microsite with Next.js & Airtable - ReactJS MilanoBuilding an invite-only microsite with Next.js & Airtable - ReactJS Milano
Building an invite-only microsite with Next.js & Airtable - ReactJS MilanoLuciano Mammino
 
From Node.js to Design Patterns - BuildPiper
From Node.js to Design Patterns - BuildPiperFrom Node.js to Design Patterns - BuildPiper
From Node.js to Design Patterns - BuildPiperLuciano Mammino
 
Let's build a 0-cost invite-only website with Next.js and Airtable!
Let's build a 0-cost invite-only website with Next.js and Airtable!Let's build a 0-cost invite-only website with Next.js and Airtable!
Let's build a 0-cost invite-only website with Next.js and Airtable!Luciano Mammino
 
Everything I know about S3 pre-signed URLs
Everything I know about S3 pre-signed URLsEverything I know about S3 pre-signed URLs
Everything I know about S3 pre-signed URLsLuciano Mammino
 
Serverless for High Performance Computing
Serverless for High Performance ComputingServerless for High Performance Computing
Serverless for High Performance ComputingLuciano Mammino
 
Serverless for High Performance Computing
Serverless for High Performance ComputingServerless for High Performance Computing
Serverless for High Performance ComputingLuciano Mammino
 
JavaScript Iteration Protocols - Workshop NodeConf EU 2022
JavaScript Iteration Protocols - Workshop NodeConf EU 2022JavaScript Iteration Protocols - Workshop NodeConf EU 2022
JavaScript Iteration Protocols - Workshop NodeConf EU 2022Luciano Mammino
 
Building an invite-only microsite with Next.js & Airtable
Building an invite-only microsite with Next.js & AirtableBuilding an invite-only microsite with Next.js & Airtable
Building an invite-only microsite with Next.js & AirtableLuciano Mammino
 
Let's take the monolith to the cloud 🚀
Let's take the monolith to the cloud 🚀Let's take the monolith to the cloud 🚀
Let's take the monolith to the cloud 🚀Luciano Mammino
 
A look inside the European Covid Green Certificate - Rust Dublin
A look inside the European Covid Green Certificate - Rust DublinA look inside the European Covid Green Certificate - Rust Dublin
A look inside the European Covid Green Certificate - Rust DublinLuciano Mammino
 
Node.js: scalability tips - Azure Dev Community Vijayawada
Node.js: scalability tips - Azure Dev Community VijayawadaNode.js: scalability tips - Azure Dev Community Vijayawada
Node.js: scalability tips - Azure Dev Community VijayawadaLuciano Mammino
 
A look inside the European Covid Green Certificate (Codemotion 2021)
A look inside the European Covid Green Certificate (Codemotion 2021)A look inside the European Covid Green Certificate (Codemotion 2021)
A look inside the European Covid Green Certificate (Codemotion 2021)Luciano Mammino
 
AWS Observability Made Simple
AWS Observability Made SimpleAWS Observability Made Simple
AWS Observability Made SimpleLuciano Mammino
 
Semplificare l'observability per progetti Serverless
Semplificare l'observability per progetti ServerlessSemplificare l'observability per progetti Serverless
Semplificare l'observability per progetti ServerlessLuciano Mammino
 
Finding a lost song with Node.js and async iterators - NodeConf Remote 2021
Finding a lost song with Node.js and async iterators - NodeConf Remote 2021Finding a lost song with Node.js and async iterators - NodeConf Remote 2021
Finding a lost song with Node.js and async iterators - NodeConf Remote 2021Luciano Mammino
 
Finding a lost song with Node.js and async iterators - EnterJS 2021
Finding a lost song with Node.js and async iterators - EnterJS 2021Finding a lost song with Node.js and async iterators - EnterJS 2021
Finding a lost song with Node.js and async iterators - EnterJS 2021Luciano Mammino
 

More from Luciano Mammino (20)

Did you know JavaScript has iterators? DublinJS
Did you know JavaScript has iterators? DublinJSDid you know JavaScript has iterators? DublinJS
Did you know JavaScript has iterators? DublinJS
 
What I learned by solving 50 Advent of Code challenges in Rust - RustNation U...
What I learned by solving 50 Advent of Code challenges in Rust - RustNation U...What I learned by solving 50 Advent of Code challenges in Rust - RustNation U...
What I learned by solving 50 Advent of Code challenges in Rust - RustNation U...
 
Building an invite-only microsite with Next.js & Airtable - ReactJS Milano
Building an invite-only microsite with Next.js & Airtable - ReactJS MilanoBuilding an invite-only microsite with Next.js & Airtable - ReactJS Milano
Building an invite-only microsite with Next.js & Airtable - ReactJS Milano
 
From Node.js to Design Patterns - BuildPiper
From Node.js to Design Patterns - BuildPiperFrom Node.js to Design Patterns - BuildPiper
From Node.js to Design Patterns - BuildPiper
 
Let's build a 0-cost invite-only website with Next.js and Airtable!
Let's build a 0-cost invite-only website with Next.js and Airtable!Let's build a 0-cost invite-only website with Next.js and Airtable!
Let's build a 0-cost invite-only website with Next.js and Airtable!
 
Everything I know about S3 pre-signed URLs
Everything I know about S3 pre-signed URLsEverything I know about S3 pre-signed URLs
Everything I know about S3 pre-signed URLs
 
Serverless for High Performance Computing
Serverless for High Performance ComputingServerless for High Performance Computing
Serverless for High Performance Computing
 
Serverless for High Performance Computing
Serverless for High Performance ComputingServerless for High Performance Computing
Serverless for High Performance Computing
 
JavaScript Iteration Protocols - Workshop NodeConf EU 2022
JavaScript Iteration Protocols - Workshop NodeConf EU 2022JavaScript Iteration Protocols - Workshop NodeConf EU 2022
JavaScript Iteration Protocols - Workshop NodeConf EU 2022
 
Building an invite-only microsite with Next.js & Airtable
Building an invite-only microsite with Next.js & AirtableBuilding an invite-only microsite with Next.js & Airtable
Building an invite-only microsite with Next.js & Airtable
 
Let's take the monolith to the cloud 🚀
Let's take the monolith to the cloud 🚀Let's take the monolith to the cloud 🚀
Let's take the monolith to the cloud 🚀
 
A look inside the European Covid Green Certificate - Rust Dublin
A look inside the European Covid Green Certificate - Rust DublinA look inside the European Covid Green Certificate - Rust Dublin
A look inside the European Covid Green Certificate - Rust Dublin
 
Monoliths to the cloud!
Monoliths to the cloud!Monoliths to the cloud!
Monoliths to the cloud!
 
The senior dev
The senior devThe senior dev
The senior dev
 
Node.js: scalability tips - Azure Dev Community Vijayawada
Node.js: scalability tips - Azure Dev Community VijayawadaNode.js: scalability tips - Azure Dev Community Vijayawada
Node.js: scalability tips - Azure Dev Community Vijayawada
 
A look inside the European Covid Green Certificate (Codemotion 2021)
A look inside the European Covid Green Certificate (Codemotion 2021)A look inside the European Covid Green Certificate (Codemotion 2021)
A look inside the European Covid Green Certificate (Codemotion 2021)
 
AWS Observability Made Simple
AWS Observability Made SimpleAWS Observability Made Simple
AWS Observability Made Simple
 
Semplificare l'observability per progetti Serverless
Semplificare l'observability per progetti ServerlessSemplificare l'observability per progetti Serverless
Semplificare l'observability per progetti Serverless
 
Finding a lost song with Node.js and async iterators - NodeConf Remote 2021
Finding a lost song with Node.js and async iterators - NodeConf Remote 2021Finding a lost song with Node.js and async iterators - NodeConf Remote 2021
Finding a lost song with Node.js and async iterators - NodeConf Remote 2021
 
Finding a lost song with Node.js and async iterators - EnterJS 2021
Finding a lost song with Node.js and async iterators - EnterJS 2021Finding a lost song with Node.js and async iterators - EnterJS 2021
Finding a lost song with Node.js and async iterators - EnterJS 2021
 

Recently uploaded

Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...Thierry Lestable
 
Behind the Scenes From the Manager's Chair: Decoding the Secrets of Successfu...
Behind the Scenes From the Manager's Chair: Decoding the Secrets of Successfu...Behind the Scenes From the Manager's Chair: Decoding the Secrets of Successfu...
Behind the Scenes From the Manager's Chair: Decoding the Secrets of Successfu...CzechDreamin
 
Custom Approval Process: A New Perspective, Pavel Hrbacek & Anindya Halder
Custom Approval Process: A New Perspective, Pavel Hrbacek & Anindya HalderCustom Approval Process: A New Perspective, Pavel Hrbacek & Anindya Halder
Custom Approval Process: A New Perspective, Pavel Hrbacek & Anindya HalderCzechDreamin
 
Powerful Start- the Key to Project Success, Barbara Laskowska
Powerful Start- the Key to Project Success, Barbara LaskowskaPowerful Start- the Key to Project Success, Barbara Laskowska
Powerful Start- the Key to Project Success, Barbara LaskowskaCzechDreamin
 
IoT Analytics Company Presentation May 2024
IoT Analytics Company Presentation May 2024IoT Analytics Company Presentation May 2024
IoT Analytics Company Presentation May 2024IoTAnalytics
 
Future Visions: Predictions to Guide and Time Tech Innovation, Peter Udo Diehl
Future Visions: Predictions to Guide and Time Tech Innovation, Peter Udo DiehlFuture Visions: Predictions to Guide and Time Tech Innovation, Peter Udo Diehl
Future Visions: Predictions to Guide and Time Tech Innovation, Peter Udo DiehlPeter Udo Diehl
 
UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
 
Exploring UiPath Orchestrator API: updates and limits in 2024 🚀
Exploring UiPath Orchestrator API: updates and limits in 2024 🚀Exploring UiPath Orchestrator API: updates and limits in 2024 🚀
Exploring UiPath Orchestrator API: updates and limits in 2024 🚀DianaGray10
 
Introduction to Open Source RAG and RAG Evaluation
Introduction to Open Source RAG and RAG EvaluationIntroduction to Open Source RAG and RAG Evaluation
Introduction to Open Source RAG and RAG EvaluationZilliz
 
SOQL 201 for Admins & Developers: Slice & Dice Your Org’s Data With Aggregate...
SOQL 201 for Admins & Developers: Slice & Dice Your Org’s Data With Aggregate...SOQL 201 for Admins & Developers: Slice & Dice Your Org’s Data With Aggregate...
SOQL 201 for Admins & Developers: Slice & Dice Your Org’s Data With Aggregate...CzechDreamin
 
UiPath Test Automation using UiPath Test Suite series, part 2
UiPath Test Automation using UiPath Test Suite series, part 2UiPath Test Automation using UiPath Test Suite series, part 2
UiPath Test Automation using UiPath Test Suite series, part 2DianaGray10
 
Unpacking Value Delivery - Agile Oxford Meetup - May 2024.pptx
Unpacking Value Delivery - Agile Oxford Meetup - May 2024.pptxUnpacking Value Delivery - Agile Oxford Meetup - May 2024.pptx
Unpacking Value Delivery - Agile Oxford Meetup - May 2024.pptxDavid Michel
 
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...Product School
 
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
 
Mission to Decommission: Importance of Decommissioning Products to Increase E...
Mission to Decommission: Importance of Decommissioning Products to Increase E...Mission to Decommission: Importance of Decommissioning Products to Increase E...
Mission to Decommission: Importance of Decommissioning Products to Increase E...Product School
 
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
 
Knowledge engineering: from people to machines and back
Knowledge engineering: from people to machines and backKnowledge engineering: from people to machines and back
Knowledge engineering: from people to machines and backElena Simperl
 
IESVE for Early Stage Design and Planning
IESVE for Early Stage Design and PlanningIESVE for Early Stage Design and Planning
IESVE for Early Stage Design and PlanningIES VE
 
Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........Alison B. Lowndes
 
Assuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyesAssuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyesThousandEyes
 

Recently uploaded (20)

Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
 
Behind the Scenes From the Manager's Chair: Decoding the Secrets of Successfu...
Behind the Scenes From the Manager's Chair: Decoding the Secrets of Successfu...Behind the Scenes From the Manager's Chair: Decoding the Secrets of Successfu...
Behind the Scenes From the Manager's Chair: Decoding the Secrets of Successfu...
 
Custom Approval Process: A New Perspective, Pavel Hrbacek & Anindya Halder
Custom Approval Process: A New Perspective, Pavel Hrbacek & Anindya HalderCustom Approval Process: A New Perspective, Pavel Hrbacek & Anindya Halder
Custom Approval Process: A New Perspective, Pavel Hrbacek & Anindya Halder
 
Powerful Start- the Key to Project Success, Barbara Laskowska
Powerful Start- the Key to Project Success, Barbara LaskowskaPowerful Start- the Key to Project Success, Barbara Laskowska
Powerful Start- the Key to Project Success, Barbara Laskowska
 
IoT Analytics Company Presentation May 2024
IoT Analytics Company Presentation May 2024IoT Analytics Company Presentation May 2024
IoT Analytics Company Presentation May 2024
 
Future Visions: Predictions to Guide and Time Tech Innovation, Peter Udo Diehl
Future Visions: Predictions to Guide and Time Tech Innovation, Peter Udo DiehlFuture Visions: Predictions to Guide and Time Tech Innovation, Peter Udo Diehl
Future Visions: Predictions to Guide and Time Tech Innovation, Peter Udo Diehl
 
UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3
 
Exploring UiPath Orchestrator API: updates and limits in 2024 🚀
Exploring UiPath Orchestrator API: updates and limits in 2024 🚀Exploring UiPath Orchestrator API: updates and limits in 2024 🚀
Exploring UiPath Orchestrator API: updates and limits in 2024 🚀
 
Introduction to Open Source RAG and RAG Evaluation
Introduction to Open Source RAG and RAG EvaluationIntroduction to Open Source RAG and RAG Evaluation
Introduction to Open Source RAG and RAG Evaluation
 
SOQL 201 for Admins & Developers: Slice & Dice Your Org’s Data With Aggregate...
SOQL 201 for Admins & Developers: Slice & Dice Your Org’s Data With Aggregate...SOQL 201 for Admins & Developers: Slice & Dice Your Org’s Data With Aggregate...
SOQL 201 for Admins & Developers: Slice & Dice Your Org’s Data With Aggregate...
 
UiPath Test Automation using UiPath Test Suite series, part 2
UiPath Test Automation using UiPath Test Suite series, part 2UiPath Test Automation using UiPath Test Suite series, part 2
UiPath Test Automation using UiPath Test Suite series, part 2
 
Unpacking Value Delivery - Agile Oxford Meetup - May 2024.pptx
Unpacking Value Delivery - Agile Oxford Meetup - May 2024.pptxUnpacking Value Delivery - Agile Oxford Meetup - May 2024.pptx
Unpacking Value Delivery - Agile Oxford Meetup - May 2024.pptx
 
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
 
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
 
Mission to Decommission: Importance of Decommissioning Products to Increase E...
Mission to Decommission: Importance of Decommissioning Products to Increase E...Mission to Decommission: Importance of Decommissioning Products to Increase E...
Mission to Decommission: Importance of Decommissioning Products to Increase E...
 
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
 
Knowledge engineering: from people to machines and back
Knowledge engineering: from people to machines and backKnowledge engineering: from people to machines and back
Knowledge engineering: from people to machines and back
 
IESVE for Early Stage Design and Planning
IESVE for Early Stage Design and PlanningIESVE for Early Stage Design and Planning
IESVE for Early Stage Design and Planning
 
Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........
 
Assuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyesAssuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyes
 

It's about time to embrace Node.js streams