
"It’s about time to embrace Streams" Luciano Mammino


With very practical examples, we'll learn how streams work in Node.js and the browser. With streams, you will be able to write elegant JavaScript applications that are much more composable and memory efficient!


  1. KYIV 2019. Luciano Mammino (@loige). IT'S ABOUT TIME TO EMBRACE STREAMS. loige.link/streams-kyiv, May 18th
  2. // buffer-copy.js
     const { readFileSync, writeFileSync } = require('fs')

     const [,, src, dest] = process.argv

     // read entire file content
     const content = readFileSync(src)

     // write that content somewhere else
     writeFileSync(dest, content)
  7. WE DO THIS ALL THE TIME, AND IT'S OK. BUT SOMETIMES ...
  10. ERR_FS_FILE_TOO_LARGE! File size is greater than possible Buffer
  11. BUT WHY?
  12. IF BYTES WERE BLOCKS...
  13. MARIO CAN LIFT A FEW BLOCKS
  14. BUT NOT TOO MANY...
  15. WHAT CAN WE DO IF WE HAVE TO MOVE MANY BLOCKS?
  16. WE CAN MOVE THEM ONE BY ONE! (we stream them...)
  17. HELLO, I AM LUCIANO! Cloud Architect. Blog: loige.co, Twitter: @loige, GitHub: @lmammino
  25. code: loige.link/streams-examples, loige.link/streams-kyiv
  26. 01. BUFFERS VS STREAMS
  27. BUFFER: DATA STRUCTURE TO STORE AND TRANSFER ARBITRARY BINARY DATA. *Note: this loads all the content of the file in memory.*
  28. STREAM: ABSTRACT INTERFACE FOR WORKING WITH STREAMING DATA. *It does not load all the data straight away.*
  29. FILE COPY: THE BUFFER WAY

     // buffer-copy.js
     const { readFileSync, writeFileSync } = require('fs')

     const [,, src, dest] = process.argv
     const content = readFileSync(src)
     writeFileSync(dest, content)
  30. FILE COPY: THE STREAM WAY

     // stream-copy.js
     const {
       createReadStream,
       createWriteStream
     } = require('fs')

     const [,, src, dest] = process.argv
     const srcStream = createReadStream(src)
     const destStream = createWriteStream(dest)
     srcStream.on('data', (data) => destStream.write(data))

     * Careful: this implementation is not optimal *
  34. MEMORY COMPARISON (~600MB FILE): node --inspect-brk buffer-copy.js assets/poster.psd ~/Downloads/poster.psd
  35. MEMORY COMPARISON (~600MB FILE): node --inspect-brk stream-copy.js assets/poster.psd ~/Downloads/poster.psd
  36. LET'S TRY WITH A BIG FILE (~10GB): node --inspect-brk stream-copy.js assets/the-matrix-hd.mkv ~/Downloads/the-matrix-hd.mkv
  38. STREAMS VS BUFFERS: streams keep a low memory footprint even with large amounts of data, and streams allow you to process data as soon as it arrives.
  39. 03. STREAM TYPES & APIS
  40. ALL STREAMS ARE EVENT EMITTERS. A stream instance is an object that emits events when its internal state changes, for instance:

     s.on('readable', () => {})   // ready to be consumed
     s.on('data', (chunk) => {})  // new data is available
     s.on('error', (err) => {})   // some error happened
     s.on('end', () => {})        // no more data available

     The events available depend on the type of stream.
  41. READABLE STREAMS. A readable stream represents a source from which data is consumed. Examples: fs readStream, process.stdin, HTTP response (client-side), HTTP request (server-side), AWS S3 GetObject (data field). It supports two modes for data consumption: flowing and paused (or non-flowing) mode.
  42. READABLE STREAMS in flowing mode: data is read from the source automatically and chunks are emitted as soon as they are available. [Diagram, animated over several slides: the stream repeatedly reads a chunk from the source and emits it to the data listener; when no more data is available, end is emitted.]
  52. // count-emojis-flowing.js
     const { createReadStream } = require('fs')
     const { EMOJI_MAP } = require('emoji') // from npm

     const emojis = Object.keys(EMOJI_MAP)

     const file = createReadStream(process.argv[2])
     let counter = 0

     file.on('data', chunk => {
       for (let char of chunk.toString('utf8')) {
         if (emojis.includes(char)) {
           counter++
         }
       }
     })
     file.on('end', () => console.log(`Found ${counter} emojis`))
     file.on('error', err => console.error(`Error reading file: ${err}`))
  59. loige.link/up-emojiart
  60. READABLE STREAMS ARE ALSO ASYNC ITERATORS (NODE.JS 10+)
  61. // count-emojis-async-iterator.js
     const { createReadStream } = require('fs')
     const { EMOJI_MAP } = require('emoji') // from npm

     async function main () {
       const emojis = Object.keys(EMOJI_MAP)
       const file = createReadStream(process.argv[2])
       let counter = 0

       for await (let chunk of file) {
         for (let char of chunk.toString('utf8')) {
           if (emojis.includes(char)) {
             counter++
           }
         }
       }

       console.log(`Found ${counter} emojis`)
     }

     main()
  64. WRITABLE STREAMS. A writable stream is an abstraction that allows you to write data to a destination. Examples: fs writeStream, process.stdout, process.stderr, HTTP request (client-side), HTTP response (server-side), AWS S3 PutObject (body parameter).
  65. // writable-http-request.js
     const http = require('http')

     const req = http.request(
       {
         hostname: 'enx6b07hdu6cs.x.pipedream.net',
         method: 'POST'
       },
       resp => {
         console.log(`Server responded with "${resp.statusCode}"`)
       }
     )

     req.on('finish', () => console.log('request sent'))
     req.on('close', () => console.log('Connection closed'))
     req.on('error', err => console.error(`Request failed: ${err}`))

     req.write('writing some content...\n')
     req.end('last write & close the stream')
  69. 69. @loige42
  70. 70. loige.link/writable-http-req @loige43
  71. 71. BACKPRESSURE When writing large amounts of data you should make sure you handle the stop-write signal (write() returning false) and the drain event. loige.link/backpressure @loige 44
  72. 72. // stream-copy-safe.js
const { createReadStream, createWriteStream } = require('fs')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    // we are overflowing the destination, we should pause
    srcStream.pause()
    // we will resume when the destination stream is drained
    destStream.once('drain', () => srcStream.resume())
  }
})
@loige 45
  78. 78. OTHER TYPES OF STREAM
Duplex Stream: streams that are both Readable and Writable (net.Socket).
Transform Stream: Duplex streams that can modify or transform the data as it is written and read (zlib.createGzip(), crypto.createCipheriv()).
@loige 46
  79. 79. ANATOMY OF A TRANSFORM STREAM
1. write data into the transform stream (writable side)
2. the stream transforms the data
3. read the transformed data out (readable side)
@loige 47
  83. 83. GZIP EXAMPLE
1. write uncompressed data into the transform stream (writable side)
2. the stream compresses the data (zlib.createGzip())
3. read the compressed data out (readable side)
@loige 48
  87. 87. HOW CAN WE USE TRANSFORM STREAMS?
Readable → Transform → Writable
- the Readable emits data events; every chunk is passed to the Transform's write()
- the Transform emits the transformed data; every chunk is passed to the Writable's write()
- when a write() returns false, the upstream stream is pause()d (backpressure)
- when the downstream stream emits drain, the upstream stream is resume()d
- backpressure applies independently on both sides of the Transform
You also have to handle end events and errors!
@loige 49
  101. 101. // stream-copy-gzip.js
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = gzipStream.write(data)
  if (!canContinue) {
    srcStream.pause()
    gzipStream.once('drain', () => {
      srcStream.resume()
    })
  }
})

srcStream.on('end', () => {
  // check if there's buffered data left
  const remainingData = gzipStream.read()
  if (remainingData !== null) {
    destStream.write(remainingData)
  }
  gzipStream.end()
})

gzipStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    gzipStream.pause()
    destStream.once('drain', () => {
      gzipStream.resume()
    })
  }
})

gzipStream.on('end', () => {
  destStream.end()
})

// ⚠ TODO: handle errors!
@loige 50
  103. 103. 03. PIPE() @loige 51
  104. 104. readable.pipe(writableDest)
Connects a readable stream to a writable stream. It returns the destination stream, allowing for a chain of pipes. A transform stream can be used as a destination as well:
readable
  .pipe(transform1)
  .pipe(transform2)
  .pipe(transform3)
  .pipe(writable)
@loige 52
  105. 105. // stream-copy-gzip-pipe.js
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream
  .pipe(gzipStream)
  .pipe(destStream)
@loige 53
  108. 108. Set up complex pipelines with pipe. This is the most common way to use streams:
readable
  .pipe(decompress)
  .pipe(decrypt)
  .pipe(convert)
  .pipe(encrypt)
  .pipe(compress)
  .pipe(writeToDisk)
@loige 54
  109. 109. Handling errors (correctly)
readable
  .on('error', handleErr)
  .pipe(decompress)
  .on('error', handleErr)
  .pipe(decrypt)
  .on('error', handleErr)
  .pipe(convert)
  .on('error', handleErr)
  .pipe(encrypt)
  .on('error', handleErr)
  .pipe(compress)
  .on('error', handleErr)
  .pipe(writeToDisk)
  .on('error', handleErr)
handleErr should end and destroy the streams (it doesn't happen automatically).
@loige 55
  112. 112. 04. STREAM UTILITIES @loige 56
  113. 113. stream.pipeline(...streams, callback) - Node.js 10+
// stream-copy-gzip-pipeline.js
const { pipeline } = require('stream')
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv

pipeline(
  createReadStream(src),
  createGzip(),
  createWriteStream(dest),
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
    console.log('Done!')
  }
)
You can pass multiple streams (they will be piped). The last argument is a callback: if invoked with an error, it means the pipeline failed at some point. All the streams are ended and destroyed correctly.
@loige 57
120. For Node.js < 10: pump - npm.im/pump

// stream-copy-gzip-pump.js
const pump = require('pump') // from npm
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv

pump( // just swap pipeline with pump!
  createReadStream(src),
  createGzip(),
  createWriteStream(dest),
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }

    console.log('Done!')
  }
)

@loige 58
123. pumpify(...streams) - Create reusable pieces of pipeline - npm.im/pumpify

Let's create EncGz, an application that helps us to read and write encrypted-gzipped files.

@loige 59
124.

// encgz-stream.js - utility library
const {
  createCipheriv,
  createDecipheriv,
  randomBytes,
  createHash
} = require('crypto')
const { createGzip, createGunzip } = require('zlib')
const pumpify = require('pumpify') // from npm

// calculates md5 of the secret (trimmed)
function getChiperKey (secret) {}

function createEncgz (secret) {
  const initVect = randomBytes(16)
  const cipherKey = getChiperKey(secret)
  const encryptStream = createCipheriv('aes256', cipherKey, initVect)
  const gzipStream = createGzip()

  const stream = pumpify(encryptStream, gzipStream)
  stream.initVect = initVect

  return stream
}

@loige 60
129.

// encgz-stream.js (...continue from previous slide)
function createDecgz (secret, initVect) {
  const cipherKey = getChiperKey(secret)
  const decryptStream = createDecipheriv('aes256', cipherKey, initVect)
  const gunzipStream = createGunzip()

  const stream = pumpify(gunzipStream, decryptStream)
  return stream
}

module.exports = {
  createEncgz,
  createDecgz
}

@loige 61
134.

// encgz.js - CLI to encrypt and gzip (from stdin to stdout)
const { pipeline } = require('stream')
const { createEncgz } = require('./encgz-stream')

const [, , secret] = process.argv

const encgz = createEncgz(secret)
console.error(`init vector: ${encgz.initVect.toString('hex')}`)

pipeline(
  process.stdin,
  encgz,
  process.stdout,
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
  }
)

@loige 62
139.

// decgz.js - CLI to gunzip and decrypt (from stdin to stdout)
const { pipeline } = require('stream')
const { createDecgz } = require('./encgz-stream')

const [, , secret, initVect] = process.argv

const decgz = createDecgz(secret, Buffer.from(initVect, 'hex'))

pipeline(
  process.stdin,
  decgz,
  process.stdout,
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
  }
)

@loige 63