3. JSON
• It stands for JavaScript Object Notation.
• It is a lightweight format for storing and transporting data.
• It is often used when data is sent from a server to a webpage.
• The JSON format is syntactically similar to the code for creating
JavaScript objects. Because of this, a JavaScript program can easily
convert JSON data into JavaScript objects.
• Since the format is text only, JSON data can easily be sent between
computers.
4. JSON Syntax Rules
• JSON syntax is derived from JavaScript object notation syntax:
• Data is in name/value pairs
• Data is separated by commas
• Curly braces hold objects
• Square brackets hold arrays
JSON Data - A Name and a Value
• JSON data is written as name/value pairs (aka key/value pairs).
• A name/value pair consists of a field name (in double quotes), followed by a
colon, followed by a value:
5. • JavaScript has a built-in function for converting JSON strings into
JavaScript objects:
JSON.parse()
• JavaScript also has a built-in function for converting an object into a
JSON string:
JSON.stringify()
• JSON holds data in the form of key/value pairs, much like objects.
• The main difference between a JavaScript object and JSON is that in
JSON the keys (and any string values) must be in “double quotes”.
6. Example
• Create a simple object as given below:
const data={
name: "divya",
age: 25,
department:"cse"
};
console.log(data); // to access the object data
o/p : { name: 'divya', age: 25, department: 'cse' }
7. Now to convert the above object to a JSON string:
• The JSON.stringify() method is used:
const jsonData=JSON.stringify(data);
console.log(jsonData);
o/p :
{"name":"divya","age":25,"department":"cse"}
• Note that in the output both the property names and the string values
are in double quotes, as shown in the example above.
• If we try to access a property of the JSON string directly, dot notation
will not work, because the JSON is just a plain string. The data must be
parsed back into an object first.
8. To convert JSON data back into a JavaScript object
• The JSON.parse() method is used:
const objdata=JSON.parse(jsonData);
console.log(objdata);
o/p:
{ name: 'divya', age: 25, department: 'cse' }
9. TASK
• Create an object.
• Convert it into JSON.
• Write this data to another file.
• Read the data from that file.
• Convert it back to a JS object.
• Show the data on the console.
10. Buffer Module
• The buffers module provides a way of handling streams of binary data.
• Buffer objects are used to represent a fixed-length sequence of bytes. Many
Node.js APIs support Buffers.
• The Buffer object is a global object in Node.js, and it is not necessary to
import it using the require keyword.
• The Buffer class in Node.js is used to perform operations on raw binary data.
• Generally, a Buffer refers to a particular location in memory.
• Buffers and arrays have some similarities, but the difference is that an
array can hold elements of any type and can be resized, while a Buffer
deals only with binary data and has a fixed size.
• Each integer in a Buffer represents a byte. The console.log() function
can be used to print a Buffer instance.
11. Creating Buffers
1. const buf1 = Buffer.alloc(10);
// Creates a zero-filled Buffer of length 10.
2. const buf2 = Buffer.alloc(10, 1);
// Creates a Buffer of length 10,
// filled with bytes which all have the value `1`.
3. const buf3 = Buffer.allocUnsafe(10);
// Creates an uninitialized Buffer of length 10.
// This is faster than calling Buffer.alloc(), but the returned
// Buffer instance might contain old data that needs to be
// overwritten using fill(), write(), or other functions that fill
// the Buffer's contents.
12. 4. const buf4 = Buffer.from([1, 2, 3]);
// Creates a Buffer containing the bytes [1, 2, 3].
5. const buf6 = Buffer.from('tést');
// Creates a Buffer containing the UTF-8-encoded bytes for the string
'tést':
// [0x74, 0xc3, 0xa9, 0x73, 0x74] (in hexadecimal notation)
// [116, 195, 169, 115, 116] (in decimal notation)
13. Buffers and character encodings
• When converting between Buffers and strings, a character encoding
may be specified. If no character encoding is specified, UTF-8 will be
used as the default.
const buf = Buffer.from('Divya', 'utf8');
1. console.log(buf.toString('hex'));
2. console.log(buf.toString('base64'));
• The same can be done for various other encodings.
15. STREAMS
• Streams are objects that let you read data from a source or write
data to a destination in a continuous manner.
• In Node.js there are four types of streams:
1. Readable : stream used for read operations.
2. Writable : stream used for write operations.
3. Duplex : stream used for both read and write operations.
4. Transform : a type of duplex stream where the output is
computed based on the input.
Streaming means watching videos or listening to music in real time
instead of downloading a file to the computer and watching it later,
e.g. Netflix, Spotify.
16. • The node:stream module provides an API for implementing the
stream interface.
• The Stream module provides a way of handling streaming data.
• Each type of stream is an EventEmitter instance and emits several
events at different points in time.
• Some of the most commonly used events are:
• data : fired when there is data available to read.
• end : fired when there is no more data to read.
• error : fired when there is an error receiving or writing data.
• finish : fired on a writable stream when all data has been
flushed to the underlying system.
• To access the node:stream module:
const stream = require('node:stream');
18. COMPRESSING & DECOMPRESSING DATA WITH ZLIB
• What is data compression?
When working with large systems or moving large amounts of data around, it
is helpful to be able to compress/decompress the data. Node.js provides an
excellent library in the Zlib module that allows you to compress and
decompress data in buffers easily and efficiently.
19. Why do we need to compress data?
• Compression in computer terms means reducing the physical size of
data so that it occupies less storage space and memory.
Compressed files are therefore easier to transfer because there is a
sizable reduction in the amount of data to be transferred.
• This results in a reduction in the time needed for file transfer as well
as a reduction in the bandwidth utilization even for a slow network.
• Thus, the process of reducing the amount of data required to
represent information is called compression.
20. There are two types of
compression: lossless and lossy.
• Lossless compression algorithms reduce the size of files without
losing any information in the file, which means that we can
reconstruct the original data from the compressed file.
• Lossy compression algorithms reduce the size of files by discarding
the less important information in a file, which can significantly reduce
file size but also affect file quality.
• Zlib - The Node.js Zlib module is used to provide compression and
decompression (zip and unzip) functionalities. It is implemented using
Gzip and deflate/inflate.
• The zlib module can be accessed using:
const zlib = require('zlib');
21. Node.js fs.createReadStream() Method
• The createReadStream() method is an inbuilt application programming
interface of the fs module which allows you to open a file/stream and read
the data present in it.
fs.createReadStream( path, options )
Parameters: This method accept two parameters as mentioned above and
described below:
1. path: This parameter holds the path of the file to be read.
It can be a string, buffer, or URL.
2. options: An optional parameter that holds a string or object.
Return Value: This method returns the fs.ReadStream object.
22. Node.js ZLIB Example: Compress File
• Let's see a simple example of the Node.js ZLIB module that compresses a file
"test.txt" into "test.txt.gz".
const zlib = require('zlib');
const gzip = zlib.createGzip();
const fs = require('fs');
const inp = fs.createReadStream('test.txt');
const out = fs.createWriteStream('test.txt.gz');
inp.pipe(gzip).pipe(out);
24. API
• API stands for Application Programming Interface.
• An API allows software applications to communicate with each other, i.e.,
it allows two applications to talk to each other.
• An API is like a service which allows us to request data.
• API data is commonly exchanged in JSON format.