Study notes (31): Node.js basic knowledge (below)

NodeJS common API

Buffer

1. ArrayBuffer

  • (Note: ArrayBuffer is not the same thing as the Buffer in Node.js.)
  • The ArrayBuffer instance object is used to represent a binary buffer with a fixed byte length.
  • An ArrayBuffer cannot be read or written directly; it must be accessed through a typed array object or a DataView object. These represent the data in the buffer in a specific format, and the buffer's contents are read and written through that format.
  • Create ArrayBuffer.
```js
new ArrayBuffer(length)
// Parameter: length — the size of the ArrayBuffer to create, in bytes.
// Return value: an ArrayBuffer of the specified size, with its contents initialized to 0.
// Exception: if length is greater than Number.MAX_SAFE_INTEGER (>= 2 ** 53) or negative,
// a RangeError is thrown.
```
  • Compare ArrayBuffer and TypedArray
```js
var buffer = new ArrayBuffer(8);
var view = new Int16Array(buffer);
console.log(buffer);
console.log(view);
// Output:
// ArrayBuffer {
//   [Uint8Contents]: <00 00 00 00 00 00 00 00>,
//   byteLength: 8
// }
// Int16Array(4) [ 0, 0, 0, 0 ]
//
// Why: when logged, an ArrayBuffer shows its contents as Uint8Contents, an 8-bit unsigned view.
// 1 byte = 8 bits, so each element of the 8-bit view occupies 1 byte, giving 8 elements.
// Int16Array is a 16-bit signed typed array: each element occupies 2 bytes,
// so the same 8-byte buffer holds 4 elements.
```
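Since DataView is named above as the other way to operate on an ArrayBuffer, here is a minimal sketch (the values are chosen purely for illustration). DataView reads and writes the same bytes at whatever width and endianness you ask for:

```js
const buf = new ArrayBuffer(4);
const view = new DataView(buf);

view.setUint16(0, 0x1234);            // big-endian by default
console.log(view.getUint8(0));        // 18 (0x12)
console.log(view.getUint8(1));        // 52 (0x34)
console.log(view.getUint16(0, true)); // 13330 (0x3412, read back little-endian)
```

Unlike typed arrays, which use the platform's native byte order, DataView lets the caller pick the endianness per call.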

2. Uint8Array

  • The Uint8Array instance object represents an 8-bit unsigned integer array.
  • When created, the content is initialized to 0.
  • After creation, its elements can be referenced with the . operator or with array subscripts.
```js
// Create from a length
var uint8 = new Uint8Array(2);
uint8[0] = 42;
console.log(uint8[0]);                // 42
console.log(uint8.length);            // 2
console.log(uint8.BYTES_PER_ELEMENT); // 1

// Create from an array
var arr = new Uint8Array([21, 31]);
console.log(arr[1]); // 31

// Create from another typed array
var x = new Uint8Array([21, 31]);
var y = new Uint8Array(x);
console.log(y[0]); // 21
```
```js
// Extension:
// Different typed array types interpret a single element as a different number of bytes.
// The constant BYTES_PER_ELEMENT gives the number of bytes per element of a given typed array.
Int8Array.BYTES_PER_ELEMENT;         // 1
Uint8Array.BYTES_PER_ELEMENT;        // 1
Uint8ClampedArray.BYTES_PER_ELEMENT; // 1
Int16Array.BYTES_PER_ELEMENT;        // 2
Uint16Array.BYTES_PER_ELEMENT;       // 2
Int32Array.BYTES_PER_ELEMENT;        // 4
Uint32Array.BYTES_PER_ELEMENT;       // 4
Float32Array.BYTES_PER_ELEMENT;      // 4
Float64Array.BYTES_PER_ELEMENT;      // 8
```

3. The relationship between ArrayBuffer and TypedArray

  • ArrayBuffer: by itself it is just a collection of binary data (0s and 1s); it does not specify how those bits map onto array elements, so an ArrayBuffer cannot be operated on directly, only through a view.
  • Because views implement some of the array interfaces, we can use array operations to manipulate the memory of the ArrayBuffer.
  • TypedArray: a strongly typed array that provides a view over an ArrayBuffer; reads and writes through its subscripts are ultimately reflected in the underlying ArrayBuffer.
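The relationship above can be seen by pointing two views at the same buffer: a write through one view is visible through the other. (The byte order shown in the comments assumes a little-endian machine, which is what virtually all current platforms use.)

```js
const buffer = new ArrayBuffer(4);
const bytes = new Uint8Array(buffer);   // 8-bit view
const words = new Uint16Array(buffer);  // 16-bit view over the same memory

words[0] = 0x0102;                // write through the 16-bit view
console.log(Array.from(bytes));   // [2, 1, 0, 0] on a little-endian machine

bytes[3] = 0xff;                  // write through the 8-bit view
console.log(words[1].toString(16)); // 'ff00' on a little-endian machine
```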

4. Buffer in NodeJS

Buffer creation
  • The Buffer class implements the Uint8Array API in a way suited to Node.js.
  • A Buffer instance is similar to an integer array, but its size is fixed, and its physical memory is allocated outside the V8 heap.
  • Buffer.alloc example:
```js
// Create a Buffer of length 10, filled with zeroes.
const buf1 = Buffer.alloc(10);
// <Buffer 00 00 00 00 00 00 00 00 00 00>

// Create a Buffer of length 10, filled with 0x01.
const buf2 = Buffer.alloc(10, 1);
// The fill value 1 is decimal, but the output is hexadecimal (e.g. 10 -> 0a).
// <Buffer 01 01 01 01 01 01 01 01 01 01>
```
  • When Buffer.allocUnsafe is called, the allocated memory segment is uninitialized; this design makes memory allocation very fast.
  • However, because the memory of a Buffer created with Buffer.allocUnsafe is not completely overwritten, if the allocated segment contains sensitive old data and the Buffer's memory is readable, old data may be leaked.
  • This can introduce security vulnerabilities into a program.
  • Buffer.allocUnsafe example:
```js
// Create an uninitialized Buffer of length 10.
// This is faster than calling Buffer.alloc(),
// but the returned Buffer instance may contain old data,
// so it should be overwritten with fill() or write().
const buf3 = Buffer.allocUnsafe(10);
```
  • Buffer.from example:
```js
// Create a Buffer containing [0x1, 0x2, 0x3].
// Note that the contents are displayed in hexadecimal.
const buf4 = Buffer.from([1, 2, 3]);

// Create a Buffer containing the UTF-8 bytes of a string.
const buf5 = Buffer.from("test");
```
Buffer character encoding
  • Buffer instances are generally used to represent sequences of encoded characters, such as UTF-8, UCS2, Base64, or hexadecimal encoded data.
  • By using an explicit character encoding, you can convert between Buffer instances and ordinary JavaScript strings.
  • Example: converting between Buffer and String
```js
const buf = Buffer.from('hello world', 'ascii');
console.log(buf); // <Buffer 68 65 6c 6c 6f 20 77 6f 72 6c 64>
console.log(buf.toString('hex'));    // 68656c6c6f20776f726c64
console.log(buf.toString('base64')); // aGVsbG8gd29ybGQ=
```
  • The character encodings currently supported by Node.js include:
    1. 'ascii' - 7-bit ASCII data only. This encoding is very fast, and it strips the high bit if set.
    2. 'utf8' - multi-byte encoded Unicode characters. Many web pages and other document formats use UTF-8.
    3. 'utf16le' - 2 or 4 bytes, little-endian encoded Unicode characters. Surrogate pairs (U+10000 to U+10FFFF) are supported.
    4. 'ucs2' - alias of 'utf16le'.
    5. 'base64' - Base64 encoding. When creating a Buffer from a string, this encoding also correctly accepts the "URL and filename safe alphabet" specified in RFC 4648, Section 5.
    6. 'latin1' - a way of encoding a Buffer as a one-byte-per-character string (as defined by IANA in RFC 1345, page 63: the Latin-1 supplement block plus the C0/C1 control codes).
    7. 'binary' - alias of 'latin1'.
    8. 'hex' - encode each byte as two hexadecimal characters.
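As a small illustration of how the chosen encoding changes the bytes produced, take the character 'é' (U+00E9), which fits in one latin1 byte but needs two bytes in UTF-8:

```js
const s = 'é';
console.log(Buffer.byteLength(s, 'utf8'));   // 2 — UTF-8 uses a two-byte sequence
console.log(Buffer.byteLength(s, 'latin1')); // 1 — latin1 is always one byte per character
console.log(Buffer.from(s, 'utf8'));         // <Buffer c3 a9>
console.log(Buffer.from(s, 'latin1'));       // <Buffer e9>
```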

5. Buffer memory management

Node memory management
  • All the memory occupied by the process while a Node program runs is called resident memory.
  • Resident memory consists of code area, stack, heap, and off-heap memory.
  • The code area, stack, and heap are managed by v8, while off-heap memory is not managed by v8.
```
                 Resident memory
  ______________________|______________________
 |           |          |                      |
 code area  stack      heap           off-heap memory
 |___________|__________|
     managed by V8          not managed by V8
```
Buffer memory allocation
  • We know that the memory of a Buffer object is not allocated in the V8 heap; instead, memory is requested at the C++ level of Node.

  • The reason:

    • Because Buffer objects can be large and their storage needs vary, requesting memory from the operating system for every allocation would put pressure on the OS.
    • Node therefore adopts the strategy of requesting memory at the C++ level and allocating it at the JavaScript level.
  • The slab memory allocation mechanism (in brief):

    • It uses an "apply in advance, assign afterwards" strategy.
    • Simply put, a slab is a fixed-size memory area that has already been applied for. It has three states:
      • full: fully allocated.
      • partial: partially allocated.
      • empty: nothing allocated.
    • The mechanism uses 8KB as the threshold to decide whether the object being allocated is a large object or a small object; 8KB is also the size of each slab.
    • At the JavaScript level, memory is allocated in units of slabs.
  • Summary:

    • An 8KB memory space (memory pool) is initialized when Buffer is first loaded.
    • Depending on the requested size, allocations are divided into small and large Buffer objects.
      • Small Buffer (less than 4KB) allocation: check whether the remaining space of the current slab is big enough.
        • If it is, allocate from the remaining space (the pool offset advances).
        • If not, apply for a new 8KB block and allocate from that. (tips: createPool)
      • Large Buffer (greater than 4KB) allocation:
        • Allocate exactly the size needed directly at the C++ level. (tips: createUnsafeBuffer)
    • In both cases, memory allocation is done at the C++ level while memory management is done at the JavaScript level. The Buffer object itself can eventually be reclaimed by V8's garbage-collection marking, but reclaiming the off-heap memory it references is partly delegated to C++.
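The pool behavior summarized above can be probed from JavaScript: Buffer.poolSize exposes the pool size. The sharing shown in the comments reflects the current implementation rather than a documented guarantee, so treat this as a sketch:

```js
console.log(Buffer.poolSize); // 8192 — the pre-allocated 8KB pool described above

const a = Buffer.allocUnsafe(16);   // small: carved out of the shared pool
const b = Buffer.allocUnsafe(16);   // small: same pool, offset advanced
console.log(a.buffer === b.buffer); // typically true: same underlying ArrayBuffer

const big = Buffer.allocUnsafe(8192); // large: gets its own allocation
console.log(big.buffer === a.buffer); // false
```

Note that Buffer.alloc never uses the shared pool; only Buffer.allocUnsafe and Buffer.from do.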
FastBuffer
  • In addition to the Buffer class, there is also the FastBuffer class.
  • We know that Uint8Array can be called like this:
```js
new Uint8Array(length);                         // from a length
new Uint8Array(typedArray);                     // from a typed array
new Uint8Array(object);                         // from an object
new Uint8Array(buffer[, byteOffset[, length]]); // from a buffer, with optional byte offset and length
```
  • The class declaration of FastBuffer is as follows:
```js
class FastBuffer extends Uint8Array {
  constructor(arg1, arg2, arg3) {
    super(arg1, arg2, arg3);
  }
  // ...
}
```
  • Buffer.from can be called like this:
```js
// Buffer.from is a factory function.
Buffer.from(str[, encoding]); // from a string [and encoding] (more convenient than a typed array here)
Buffer.from(array);           // from an array
Buffer.from(buffer);          // from a Buffer
Buffer.from(arrayBuffer[, byteOffset[, length]]); // from an ArrayBuffer, with optional byte offset and length
```
  • FastBuffer summary:
    1. When no encoding is set, utf8 encoding is used by default.
    2. When the string needs more than 4KB, memory is allocated directly.
    3. When the string needs less than 4KB but more than the space left in the pre-allocated 8KB memory pool, a new 8KB pool is applied for.
    4. When a FastBuffer object is created, the data is stored, the length is checked, and the poolOffset and byte alignment are updated.

Buffer commonly used static methods

  • Buffer.byteLength(string)
    : Get the byte length of a string
```js
console.log(Buffer.byteLength("hello world")); // 11
```
  • Buffer.isBuffer(any)
    : check whether a value is a Buffer
```js
console.log(Buffer.isBuffer("not a Buffer"));   // false
console.log(Buffer.isBuffer(Buffer.alloc(10))); // true
```
  • Buffer.concat(Buffer[],byteLength?)
    : Merge buffer
```js
const buffer1 = Buffer.from("hello");
const buffer2 = Buffer.from("world");
console.log(Buffer.concat([buffer1, buffer2]));
console.log(Buffer.concat([buffer1, buffer2], 12)); // result has length 12
```

Buffer common example methods

  • buf.write(string[,offset[,length]][,encoding])
    : write a string into the buffer
```js
const buf1 = Buffer.alloc(20);
console.log("create an empty buffer", buf1);

buf1.write('hello');
console.log(`buf1.write("hello"): write hello`);
console.log(buf1);

buf1.write("hello", 5, 3);
console.log(`buf1.write("hello", 5, 3): offset by five bytes, then write the first three bytes of hello`);
console.log(buf1);

console.log('output as a string');
console.log(buf1.toString());
```
  • buf.fill(value[,offset[,end]][,encoding])
    : fill the buffer
  • Compared with buf.write, by parameters:
    • buf.write: string + offset + length + encoding
    • buf.fill: arbitrary value + offset + end position + encoding
  • Compared with buf.write, by meaning:
    • buf.write: writes exactly as much content as given, unless offset and length are specified.
    • buf.fill: repeats the value until the buffer is full, unless offset and end are specified.
```js
const buf1 = Buffer.alloc(20);
buf1.fill("hello");
console.log(buf1);

const buf2 = Buffer.alloc(20);
buf2.fill("hello", 4, 6);
console.log(buf2);
```
  • buf.length
    : the buffer's length
  • Compared with the static method Buffer.byteLength(string):
    • Buffer.byteLength(string): takes a string and returns its length in bytes.
    • buf.length: returns the byte length of the Buffer instance.
```js
const buf1 = Buffer.alloc(10);
console.log(buf1.length); // 10

const buf2 = Buffer.from("eric");
console.log(buf2.length);               // 4
console.log(Buffer.byteLength("eric")); // 4
```
  • buf.toString([encoding[,start[,end]]])
    : Decode the buffer into a string
    • encoding: the character encoding to use. Default: 'utf8'.
    • start: the byte index at which to start decoding. Default: 0.
    • end: the byte index at which to stop decoding (not included). Default: buf.length.
```js
const buf = Buffer.from('test');
console.log(buf.toString('utf8', 1, 3)); // es
```
  • buf.toJSON()
    : Return the JSON format of the buffer
```js
const buf = Buffer.from("test");
console.log(buf.toJSON());
// { type: 'Buffer', data: [ 116, 101, 115, 116 ] }
```
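buf.toJSON() is also what JSON.stringify consults when it serializes a Buffer, and a reviver function can reverse the conversion. A minimal sketch:

```js
const buf = Buffer.from('test');
const json = JSON.stringify(buf);
console.log(json); // {"type":"Buffer","data":[116,101,115,116]}

// JSON.parse can restore the Buffer via a reviver:
const restored = JSON.parse(json, (key, value) =>
  value && value.type === 'Buffer' ? Buffer.from(value.data) : value
);
console.log(restored.toString()); // test
```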
  • buf.equals(otherBuffer)
    : Check whether other buffers have exactly the same bytes
```js
const ABC = Buffer.from('ABC');
const hex414243 = Buffer.from('414243', 'hex');
const ABCD = Buffer.from('ABCD');
console.log(ABC, hex414243, ABCD);
console.log(ABC.equals(hex414243));
console.log(ABC.equals(ABCD));
// Output:
// <Buffer 41 42 43> <Buffer 41 42 43> <Buffer 41 42 43 44>
// true
// false
```
  • buf.slice([start[,end]])
    : slice out part of the buffer
    • start: where the new Buffer begins. Default: 0.
    • end: where the new Buffer ends (not included). Default: buf.length.
```js
const buf1 = Buffer.from("abcdefghi");
console.log(buf1.slice(2, 7).toString());
// Output: cdefg

const buf2 = Buffer.from("abcdefghi");
console.log(buf2.toString("utf8", 2, 7));
// Output: cdefg
```
  • buf.copy(target[,targetStart[,sourceStart[,sourceEnd]]])
    : Copy buffer
    • target: the Buffer or Uint8Array to copy into.
    • targetStart: the number of bytes to skip before starting to write in the target buffer. Default: 0.
    • sourceStart: the index in the source buffer at which to start copying. Default: 0.
    • sourceEnd: the index in the source buffer at which to stop copying (not included). Default: buf.length.
    • source.copy(target): copies bytes from source into target.
    • Returns the number of bytes copied.
    • It is the target that is modified.
```js
const buf1 = Buffer.from("abcdefghi");
const buf2 = Buffer.from("test");
console.log(buf1.copy(buf2));  // 4
console.log(buf2.toString());  // "abcd"
```

Stream

concept

  • To talk about streams, we first need to understand streaming data.
  • Streaming data is simply byte data.
  • When data is exchanged and transmitted between the objects of an application, the data an object contains is first converted into streaming data and then transmitted through a stream.
  • Once it reaches the target object, the stream data is converted back into data that object can use.
  • A stream, then, is what transmits streaming data: it is a means of transmission.
  • Applications of streams:
    • HTTP requests and responses
    • the socket underlying HTTP
    • compression, encryption, and so on

Why do you need a stream?

  • With streaming, we don't need to load all the data into memory at once, so the memory footprint is low.
  • There is no need to wait for all the data to finish transmitting before processing begins, so time is used more efficiently.
  • For example, in a file-transfer scenario:
    • For smaller files, we can read the entire file into memory and then write it out.
    • But for large binary files, such as audio or video files several GB in size, that approach can easily exhaust memory.
    • In that case we need streaming transfer: read a part, write a part; no matter how big the file is, as long as time permits it will eventually all be processed.

Streams in Node

  • In Node, a stream is an abstract interface implemented by many Node objects.
  • By default, the data a stream processes is of type Buffer/String, but if objectMode is set, the stream can accept any JavaScript object; such a stream is called an object stream.
  • There are four basic stream types in Node.js:

    Type       Meaning                                                                 Example
    Readable   readable stream                                                         fs.createReadStream()
    Writable   writable stream                                                         fs.createWriteStream()
    Duplex     readable and writable stream                                            net.Socket
    Transform  duplex stream that can modify/transform data as it is read and written  zlib.createDeflate()
  • The base class of all streams is require('events').EventEmitter.
  • We can load the four stream base classes with const { Readable, Writable, Duplex, Transform } = require('stream').

Readable stream

  • The readable stream is an abstraction of the source of the provided data.
  • Examples of readable streams:
    • Standard input process.stdin
    • Child process standard output and error output child_process.stdout, child_process.stderr
    • File read stream fs.createReadStream
    • The response received by the client
    • Request received by the server
  • A readable stream has two modes, which can be switched at any time:
    • Flowing mode: data is read automatically.
    • Paused mode: reading is paused.
    • APIs that switch to flowing mode:
      1. Listening for the 'data' event.
      2. Calling the resume method while in paused mode.
      3. Calling pipe to send the data to a writable stream.
    • APIs that switch to paused mode:
      1. Calling the pause method while in flowing mode.
      2. Calling the unpipe method while in flowing mode.
  • Events a readable stream can listen for: 'data', 'error', 'end', 'close', 'readable'.
  • Other methods: destroy.
  • All Readables implement the interface defined by the stream.Readable class.
  • Custom readable stream
```js
const Readable = require('stream').Readable;

class CustomReadStream extends Readable {
  constructor(source, opt) {
    /** pass the configuration options on to the base class */
    super(opt);
    this.source = source;
    this.index = 0;
  }
  _read(highWaterMark) {
    if (this.index === this.source.length) {
      this.push(null);
    } else {
      this.source
        /** slice out a chunk */
        .slice(this.index, this.index + highWaterMark)
        .forEach(element => {
          /** note: converted to a string */
          this.push(element.toString());
        });
    }
    this.index += highWaterMark;
  }
}

const customStream = new CustomReadStream(
  // [1,2,3,4,5,6,7,8,9,0],
  ["A", "B", "C", "D", "E", "F", "G"],
  /** set the buffer size to 2 */
  { highWaterMark: 2 }
);
customStream.on('data', chunk => {
  /** console.log would put each chunk on its own line */
  process.stdout.write(chunk);
});
```
  • fs.createReadStream usage example:
```js
const fs = require('fs');
const path = require('path');

const rs = fs.createReadStream(
  path.resolve(process.cwd(), 'example.md'),
  {
    flags: 'r',       // the kind of operation we want to perform on the file
    mode: 0o666,      // permission bits
    encoding: 'utf8', // defaults to buffer if not passed; displayed as a string here
    // start: 3,      // start reading from index 3
    // end: 8,        // read up to and including index 8
    highWaterMark: 3, // buffer size
  }
);

/** fires when the file is opened */
rs.on('open', function () {
  process.stdout.write('open the file');
});

/** display as a utf8 string */
rs.setEncoding('utf8');

/** add a pause/resume mechanism inside the data listener */
rs.on('data', function (data) {
  process.stdout.write(data);
  rs.pause(); // pause reading and emitting data events
  setTimeout(function () {
    rs.resume(); // resume reading and triggering data events
  }, 2000);
});

/** if an error occurs while reading, the error event fires */
rs.on('error', function () {
  process.stdout.write('error');
});

/** when the content has been fully read, the end event fires */
rs.on('end', function () {
  process.stdout.write('finish');
});

rs.on('close', function () {
  process.stdout.write('close the file');
});
```
```
Pseudocode:
rs = fs.createReadStream(filePath, option)
rs.on("open" | "data" | "error" | "end" | "close")
rs.pause()
rs.resume()
rs.destroy()
```

Writable stream

  • A writable stream is an abstraction of a 'destination' to which data is written.
  • Examples of writable streams:
    • standard output and error output: process.stdout, process.stderr
    • child process standard input: child_process.stdin
    • file write streams: fs.createWriteStream
    • the request sent by a client: request
    • the response returned by a server: response
  • Events a writable stream can listen for: 'drain', 'error', 'close', 'finish', 'pipe', 'unpipe'
  • Custom writable stream
```js
const Writable = require('stream').Writable;

class CustomWritable extends Writable {
  constructor(arr, opt) {
    super(opt);
    // point this.arr at the same array that arr references
    this.arr = arr;
  }
  // implement the _write() method
  _write(chunk, encoding, callback) {
    this.arr.push(chunk.toString());
    callback();
  }
}

const data = [];
const customWritable = new CustomWritable(data, { highWaterMark: 3 });
customWritable.write('1');
customWritable.write('2');
customWritable.write('3');
console.log(data); // [ '1', '2', '3' ]
```
  • fs.createWriteStream usage example:
```js
let fs = require('fs');
let path = require('path');

let ws = fs.createWriteStream(
  path.resolve(process.cwd(), 'example.md'),
  {
    flags: 'w',
    mode: 0o666,
    start: 3,
    highWaterMark: 3 // defaults to 16KB
  }
);

let flag = ws.write('1');
console.log(flag); // true
flag = ws.write('2');
console.log(flag); // true
flag = ws.write('3');
console.log(flag); // false -> with highWaterMark: 3, the third byte fills the buffer
flag = ws.write('4');
console.log(flag); // false
```
  • fs.createWriteStream back-pressure example:
```js
const fs = require('fs');
const path = require('path');

const ws = fs.createWriteStream(
  path.resolve(process.cwd(), 'example'),
  {
    flags: 'w',      // `w`: open the file for writing
    mode: 0o666,     // permission bits
    start: 0,        // start writing at index 0
    highWaterMark: 3 // buffer size
  }
);

let count = 9;

function write() {
  let flag = true;
  while (flag && count > 0) {
    process.stdout.write(`before ${count}\n`);
    flag = ws.write(
      `${count}`,
      'utf8',
      /** the callback is asynchronous */
      ((i) => () => process.stdout.write(`after ${i}\n`))(count)
    );
    count--;
  }
}

write();

ws.on('drain', function () {
  process.stdout.write('drain\n');
  write();
});

ws.on('error', function (error) {
  process.stdout.write(`${error.toString()}\n`);
});

/*
before 9
before 8
before 7
after 9
drain
before 6
before 5
before 4
after 8
after 7
after 6
drain
before 3
before 2
before 1
after 5
after 4
after 3
drain
after 2
after 1
*/

// When there is nothing more to write, call end() to close the write stream:
// ws.end();
```
  • pipe is the simplest and most direct way to connect two streams. The whole data-transfer process is handled internally, so during development there is no need to pay attention to how data flows inside.
  • Simulating the principle of pipe with fs.createReadStream and fs.createWriteStream:
```js
const fs = require('fs');
const path = require('path');

const appDirectory = process.cwd();
const ws = fs.createWriteStream(path.resolve(appDirectory, 'pipe2.md'));
const rs = fs.createReadStream(path.resolve(appDirectory, 'pipe1.md'));

rs.on('data', function (chunk) {
  /** 1. Data read from the read stream goes into the write stream's buffer
   *     (this balances the speed of producer and consumer). */
  const flag = ws.write(chunk);
  /** 2. If the write stream's buffer is full, pause the read stream. */
  if (!flag) rs.pause();
});

ws.on('drain', function () {
  /** 3. When the write stream's buffer has drained, restart the read stream. */
  rs.resume();
});

rs.on('end', function () {
  /** 4. When the read stream finishes reading, end the write stream too. */
  ws.end();
});
```
  • pipe usage:
```js
const fs = require("fs");
const path = require("path");

const appdirectory = process.cwd();
const from = fs.createReadStream(path.resolve(appdirectory, 'pipe1.md'));
const to = fs.createWriteStream(path.resolve(appdirectory, 'pipe2.md'));

from.pipe(to);

// setTimeout(() => {
//   console.log('stop writing to pipe2.md');
//   from.unpipe(to);
//   console.log('manually close the file stream');
//   to.end();
// }, 2000);
```
  • A simple implementation of pipe:
```js
// (illustrative only; to patch real streams this would go on fs.ReadStream.prototype)
fs.createReadStream.prototype.pipe = function (dest) {
  this.on('data', (data) => {
    const flag = dest.write(data);
    if (!flag) this.pause();
  });
  dest.on('drain', () => {
    this.resume();
  });
  this.on('end', () => {
    dest.end();
  });
};
```

Duplex streams (readable and writable)

  • Streams that are both readable and writable are called duplex streams.
  • A duplex stream's readable operations and writable operations are completely independent of each other; the two capabilities are simply combined in one object.
  • A duplex stream implements both the Readable and Writable interfaces.
  • Examples of duplex streams:
    • TCP sockets: the socket passed to the net.createServer callback.
      • readable side: socket.on("data", function (data) { ... })
      • writable side: socket.write('hello world')
  • Custom duplex stream:
```js
const Duplex = require("stream").Duplex;

class CustomDuplex extends Duplex {
  constructor(arr, opt) {
    super(opt);
    this.arr = arr;
    this.index = 0;
  }
  /** implement the _read method */
  _read(size /** buffer size */) {
    if (this.index >= this.arr.length) {
      this.push(null);
    } else {
      this.arr.slice(this.index, this.index + size).forEach((value) => {
        this.push(value.toString());
      });
      this.index += size;
    }
  }
  /** implement the _write method */
  _write(chunk, encoding, callback) {
    this.arr.push(chunk.toString());
    callback();
  }
}

const data = [];
const customDuplex = new CustomDuplex(data, { highWaterMark: 3 });

/** write data into the stream */
customDuplex.write("1");
customDuplex.write("2");
customDuplex.write("3");
console.log(data);

/** read data from the stream */
console.log(customDuplex.read(2 /** size */).toString());
console.log(customDuplex.read(2 /** size */).toString());
```

Transform streams

  • Transform streams transform their input before producing output.
  • A transform stream is also a duplex stream: it implements both the Readable and Writable interfaces, but when using it we only need to implement the transform method.
  • Transform stream examples:
    • data compression/decompression with the zlib module: e.g. createGzip/createGunzip, createDeflate/createInflate (the non-streaming counterparts are zlib.gzip/unzip).
    • data encryption/decryption with the crypto module: e.g. crypto.createCipher/createDecipher.
  • Custom transform stream:
```js
const Transform = require("stream").Transform;

class CustomTransform extends Transform {
  constructor(opt) {
    super(opt);
  }
  _transform(chunk, encoding, callback) {
    /** push the transformed data to the readable side */
    this.push(chunk.toString().toUpperCase());
    /**
     * Parameter 1 is an Error object.
     * If parameter 2 is passed, it is forwarded to readable.push().
     */
    callback();
  }
}

let t = new CustomTransform({ highWaterMark: 3 });
t.on('data', function (data) {
  console.log('data', data.toString());
});

// stdin.pipe(t) writes standard input into our transform stream t; here t acts as a writable stream.
// .pipe(process.stdout) reads the data in t out to standard output; here t acts as a readable stream.
process.stdin.pipe(t).pipe(process.stdout);
```

(Extension) Object streams

  • By default, the data a stream processes is a Buffer/String value.
  • But if the objectMode attribute is set, we can make the stream accept any JavaScript type; such a stream is called an object stream.
```js
const Transform = require("stream").Transform;
const fs = require("fs");
const path = require("path");

const appDirectory = process.cwd();
const rs = fs.createReadStream(path.resolve(appDirectory, "user.json"));
rs.setEncoding("utf8");

const toJSON = new Transform({
  readableObjectMode: true,
  transform: function (chunk, encoding, callback) {
    this.push(JSON.parse(chunk));
    callback();
  }
});

const jsonOut = new Transform({
  writableObjectMode: true,
  transform: function (chunk, encoding, callback) {
    console.log(chunk);
    callback();
  }
});

rs.pipe(toJSON).pipe(jsonOut);
```
```
Read the file './user.json'
Set the ReadStream's encoding to 'utf8'
Create a transform stream called 'toJSON'
Create a transform stream called 'jsonOut'
ReadStream --pipe--> toJSON --pipe--> jsonOut
```

Events

  • The events module is one of Node's core modules; almost all commonly used Node modules inherit from it, such as http, fs, etc.
  • Example 1: set one event listener for the wakeup event.
```js
const EventEmitter = require('events').EventEmitter;

class Man extends EventEmitter {}

const man = new Man();
man.on('wakeup', function () {
  console.log('The man has woken up.');
});
man.emit('wakeup');
// Output:
// The man has woken up.
```
  • Example 2: set multiple event listeners for the wakeup event.
  • When the event is triggered, the listeners are executed in the order they were registered.
```js
const EventEmitter = require('events').EventEmitter;

class Man extends EventEmitter {}

const man = new Man();
man.on('wakeup', function () {
  console.log('The man has woken up.');
});
man.on('wakeup', function () {
  console.log('The man has woken up again.');
});
man.emit('wakeup');
// Output:
// The man has woken up.
// The man has woken up again.
```
  • Example 3: register an event listener that runs only once (once).
```js
const EventEmitter = require('events').EventEmitter;

class Man extends EventEmitter {}

const man = new Man();
man.on('wakeup', function () {
  console.log('The man has woken up.');
});
man.once('wakeup', function () {
  console.log('The man has woken up again.');
});

man.emit('wakeup');
// Output:
// The man has woken up.
// The man has woken up again.

man.emit('wakeup');
// Output:
// The man has woken up.
```
  • Example 4: if an event is emitted before any listener is registered, that emission is ignored.
```js
const EventEmitter = require("events").EventEmitter;

class Man extends EventEmitter {}

const man = new Man();
man.emit('wakeup', 1); // no listener yet: ignored
man.on('wakeup', function (index) {
  console.log('The man has woken up -> ' + index);
});
man.emit('wakeup', 2);
// Output:
// The man has woken up -> 2
```
  • Example 5: prove that EventEmitter runs its listeners synchronously (sequential execution), not asynchronously.

```javascript
const EventEmitter = require('events').EventEmitter;

class Man extends EventEmitter {}

const man = new Man();

man.on('wakeup', function () {
  console.log('The man has woken up');
});

man.emit('wakeup');
console.log('The woman has woken up');

// Output:
// The man has woken up
// The woman has woken up
// Conclusion: emit() invokes the listeners synchronously, in order.
```
  • Example 6: remove an event listener.

```javascript
const EventEmitter = require('events').EventEmitter;

class Man extends EventEmitter {}

const man = new Man();

function wakeup() {
  console.log('The man has woken up');
}

man.on('wakeup', wakeup);
man.emit('wakeup');
// Output:
// The man has woken up

man.removeListener('wakeup', wakeup);
man.emit('wakeup');
// No output
```
  • A hand-written implementation of EventEmitter:

```javascript
/**
 * Helper: validate that a listener argument is a function.
 */
function checkListener(listener) {
  if (typeof listener !== 'function') {
    throw new TypeError("'listener' must be a function.");
  }
}

/**
 * Event emitter constructor.
 */
class EventEmitter {
  constructor() {
    this._events = {};
  }

  addListener(eventName, listener) {
    checkListener(listener);
    if (!this._events[eventName]) {
      this._events[eventName] = [];
    }
    this._events[eventName].push(listener);
  }

  on(eventName, listener) {
    this.addListener(eventName, listener);
  }

  emit(eventName, ...args) {
    const listeners = this._events[eventName];
    if (!listeners) return;
    listeners.forEach(fn => fn.apply(this, args));
  }

  removeListener(eventName, listener) {
    checkListener(listener);
    const listeners = this._events[eventName];
    if (!listeners) return false;
    const index = listeners.findIndex(item => item === listener);
    if (index === -1) return false;
    listeners.splice(index, 1);
    return true;
  }

  off(eventName, listener) {
    this.removeListener(eventName, listener);
  }

  removeAllListeners(eventName) {
    if (this._events[eventName]) {
      delete this._events[eventName];
    }
  }

  once(eventName, listener) {
    checkListener(listener);
    // Wrap the listener so it removes itself after the first call.
    const wrap = (...args) => {
      listener.apply(this, args);
      this.removeListener(eventName, wrap);
    };
    this.addListener(eventName, wrap);
  }
}
```
```
EventEmitter
|__ addListener / on / once
|__ emit
|__ removeListener / off / removeAllListeners
```

fs

  • fs is Node's built-in file system module; it exposes file operations as asynchronous, error-first callback APIs.

fs API

  • fs.readFile(path[,options],callback): asynchronously reads the entire contents of a file.
    • path: path of the file to read.
    • options: optional; the encoding (e.g. 'utf8'). If no encoding is specified, the raw Buffer is returned.
    • callback(err,data): called with the error, if any, and the file contents.
```javascript
const fs = require('fs');
const path = require('path');

const appDirectory = process.cwd();

fs.readFile(
  path.resolve(appDirectory, 'example.md'),
  'utf8',
  (err, data) => {
    if (err) throw err;
    process.stdout.write(data);
  }
);
```
  • fs.writeFile(file,data[,options],callback): asynchronously writes data to a file.
    • file: filename or file descriptor.
    • data: the content to write.
    • options: optional.
      • options.encoding: character encoding, default 'utf8'.
      • options.mode: file permission (not effective on Windows), default 0o666.
    • callback(err): called once the data has been written.
    • Note: if the file already exists, fs.writeFile replaces its contents.
    • The default flag is 'w' (open for writing, truncating the file).
```javascript
const fs = require('fs');
const path = require('path');

const appDirectory = process.cwd();
const data = 'It is a test.';

fs.writeFile(
  path.resolve(appDirectory, 'example.md'),
  data,
  (err) => {
    if (err) throw err;
    console.log('File written');
  }
);
```
  • fs.appendFile(file,data[,options],callback): asynchronously appends data to a file.
    • file: filename or file descriptor.
    • data: the content to append.
    • options: optional.
      • options.encoding: character encoding, default 'utf8'.
      • options.mode: file permission (not effective on Windows), default 0o666.
    • callback(err): called once the data has been appended.
    • Note: fs.appendFile creates the file if it does not yet exist.
    • The default flag is 'a' (open for appending).
```javascript
const fs = require('fs');
const path = require('path');

const appDirectory = process.cwd();
const data = 'It is a content which append';

fs.appendFile(
  path.resolve(appDirectory, 'example.md'),
  data,
  (err) => {
    if (err) throw err;
    console.log('Content appended');
  }
);
```
  • fs.stat(path[,options],callback): retrieves information about a file (an fs.Stats object).
    • path: path of the file.
    • options: optional.
      • options.bigint: whether the numeric values in the returned fs.Stats object should be bigint, default false.
    • callback(err,stats): called with the fs.Stats object.
```javascript
const fs = require('fs');
const path = require('path');

const appDirectory = process.cwd();

fs.stat(
  path.resolve(appDirectory, 'example.md'),
  (err, stats) => {
    if (err) throw err;
    console.log(stats);
    /** Stats also exposes helper methods: */
    console.log(stats.isFile());
    console.log(stats.isDirectory());
  }
);

// Stats {
//   dev: 4004941460,
//   mode: 33206,
//   nlink: 1,
//   uid: 0,
//   gid: 0,
//   rdev: 0,
//   blksize: undefined,
//   ino: 1688849861160504,
//   size: 56,
//   blocks: undefined,
//   atimeMs: 1618224089682.408,
//   mtimeMs: 1618224089682.408,
//   ctimeMs: 1618224089682.408,
//   birthtimeMs: 1616552557350.4214,
//   atime: 2021-04-12T10:41:29.682Z,
//   mtime: 2021-04-12T10:41:29.682Z,
//   ctime: 2021-04-12T10:41:29.682Z,
//   birthtime: 2021-03-24T02:22:37.350Z }
// true
// false
```
  • fs.rename(oldPath,newPath,callback): renames (moves) a file from oldPath to newPath.
    • oldPath: current path of the file.
    • newPath: new path of the file.
    • callback(err): called when the rename completes.
```javascript
const fs = require('fs');
const path = require('path');

const appDirectory = process.cwd();

fs.rename(
  path.resolve(appDirectory, 'rename.md'),
  path.resolve(appDirectory, 'example.md'),
  (err) => {
    if (err) throw err;
    console.log('File renamed');
  }
);
```
  • fs.unlink(path,callback): deletes a file.
    • path: path of the file to delete.
    • callback(err): called when the file has been removed.
```javascript
const fs = require('fs');
const path = require('path');

const appDirectory = process.cwd();

fs.unlink(
  path.resolve(appDirectory, 'example.md'),
  (err) => {
    if (err) throw err;
    console.log('File deleted');
  }
);
```
  • fs.mkdir(path[,options],callback): creates a directory.
    • path: path of the directory to create.
    • options: optional.
      • options.recursive: whether parent directories should be created as needed. Default: false.
      • options.mode: directory permission (not supported on Windows). Default: 0o777.
    • callback(err): called when the directory has been created.
```javascript
const fs = require('fs');
const path = require('path');

const appDirectory = process.cwd();

/** Create a single directory */
fs.mkdir(
  path.resolve(appDirectory, 'example'),
  (err) => {
    if (err) throw err;
    console.log('Directory created');
  }
);

/** Create nested directories recursively */
fs.mkdir(
  path.resolve(appDirectory, 'a/b'),
  { recursive: true },
  err => {
    if (err) throw err;
    console.log('Directories created');
  }
);
```
  • fs.readdir(path[,options],callback): reads the contents of a directory.
    • path: path of the directory.
    • options: optional.
      • options.encoding: encoding for the filenames, default 'utf8'. If set to 'buffer', the filenames are returned as Buffer objects.
      • options.withFileTypes: if true, the results are fs.Dirent objects. Default: false.
    • callback(err,files<string[]>|<Buffer[]>|<fs.Dirent[]>): called with the directory entries.
```javascript
const fs = require('fs');
const path = require('path');

const appDirectory = process.cwd();

fs.readdir(
  path.resolve(appDirectory, 'b'),
  {
    encoding: 'buffer',  // with 'buffer', entry names are Buffer objects
    withFileTypes: true  // return fs.Dirent objects instead of plain names
  },
  (err, files) => {
    if (err) throw err;
    console.log(files);
  }
);

// [
//   Dirent { name: <Buffer 63>, [Symbol(type)]: 2 }
// ]
```
  • fs.rmdir(path[,options],callback): removes a directory.
    • path: path of the directory.
    • options: optional.
      • options.maxRetries: number of retries on EBUSY, EMFILE, ENFILE, ENOTEMPTY or EPERM errors. Ignored unless recursive is true. Default: 0.
      • options.retryDelay: time in milliseconds to wait between retries; ignored unless recursive is true. Default: 100.
      • options.recursive: if true, perform a recursive directory removal; in that mode, errors are not reported if path does not exist.
    • callback(err): called when the directory has been removed.
```javascript
const fs = require('fs');
const path = require('path');

const appDirectory = process.cwd();

fs.rmdir(
  path.resolve(appDirectory, 'b'),
  {
    recursive: true // remove the directory and its contents recursively
  },
  err => {
    if (err) throw err;
    console.log('Directory removed');
  }
);

/** See also: https://blog.csdn.net/a8039974/article/details/25830705 */
```

Watching file changes in node.js

  • chokidar is a popular third-party library for watching file changes.
npm install chokidar --save-dev
```javascript
const chokidar = require('chokidar');

chokidar
  .watch(
    process.cwd(),
    { ignored: './node_modules' }
  )
  .on('all', (event, _path) => {
    console.log('File changed:', event, _path);
  });
```

path

  • path.basename(path[,ext]): returns the last portion of a path (typically the file name); if the optional ext suffix matches, it is stripped from the result.
```javascript
const path = require('path');

console.log(
  path.basename(
    path.resolve(process.cwd(), 'example.md'),
    '.md'
  )
);
// example
```
  • path.dirname(path): returns the directory name of a path.
```javascript
const path = require('path');

console.log(
  path.dirname(
    path.resolve(process.cwd(), 'example.md')
  )
);
// D:\parent
```
  • path.extname(path): returns the extension of the path, from the last '.' to the end of the last portion.
```javascript
const path = require('path');

console.log(
  path.extname(
    path.resolve(process.cwd(), 'example.md')
  )
);
// .md
```
  • path.join([...paths]): joins all given path segments together using the platform-specific separator, then normalizes the result.
```javascript
const path = require('path');

console.log(
  path.join('/nodejs/', '/example.md')
);
// /nodejs\example.md (separators normalized for the platform, here Windows)
```
  • path.normalize(path): normalizes the given path, resolving '..' and '.' segments.
```javascript
const path = require('path');

console.log(
  path.normalize('/nodejs/example2.md/../example.md')
);
// /nodejs\example.md (the '..' segment is resolved)
```
  • path.resolve([...paths]): resolves a sequence of paths or path segments into an absolute path.
    • Differences from path.join():
      • path.resolve(): 1. processes the segments from right to left until an absolute path is constructed; 2. always returns an absolute path (prepending the current working directory if necessary).
      • path.join(): simply joins the segments and normalizes the result; the result is not necessarily absolute.
```javascript
const path = require('path');

console.log(
  path.resolve('./example.md')
);
// D:\parent\example.md

console.log(
  path.resolve(process.cwd(), '/example.md')
);
// D:\example.md -> an absolute segment resets the resolution

console.log(
  path.resolve(process.cwd(), 'example.md')
);
// D:\parent\example.md
```
  • path.parse(path): returns an object whose properties represent the significant elements of the path.
```javascript
const path = require('path');

const pathObj = path.parse('/nodejs/test/index.js');
console.log(pathObj);

// {
//   root: '/',
//   dir: '/nodejs/test',
//   base: 'index.js',
//   ext: '.js',
//   name: 'index'
// }
```
  • path.format(pathObject): returns a path string from an object; the opposite of path.parse().
```javascript
const path = require('path');

const pathObj = path.parse('/nodejs/test/index.js');
console.log(pathObj);

// {
//   root: '/',
//   dir: '/nodejs/test',
//   base: 'index.js',
//   ext: '.js',
//   name: 'index'
// }

console.log(path.format(pathObj));
// /nodejs/test\index.js (dir joined to base with the platform separator)
```
  • path.sep: the platform-specific path segment separator: '\' on Windows, '/' on POSIX.
```javascript
const path = require('path');

console.log(path.sep);
// '\' on Windows, '/' on POSIX
```
  • path.win32: provides access to Windows-specific implementations of the path methods.

util (utility functions)

  • util.callbackify(original): takes an async function (or a function returning a Promise) and returns a function in the error-first callback style.
```javascript
const util = require('util');

async function hello() {
  return 'hello world';
}

const helloCb = util.callbackify(hello);

helloCb((err, res) => {
  if (err) throw err;
  console.log(res);
});
```
  • util.promisify(original): takes a function in the error-first callback style and returns a version that returns a Promise.
```javascript
const fs = require('fs');
const util = require('util');
const path = require('path');

const stat = util.promisify(fs.stat);

stat(
  path.resolve(process.cwd(), 'example.md')
)
  .then(data => {
    console.log('stats:', data);
  })
  .catch(error => {
    console.error('failed:', error);
  });

// Stats {
//   dev: 109512952,
//   mode: 33206,
//   nlink: 1,
//   uid: 0,
//   gid: 0,
//   rdev: 0,
//   blksize: 4096,
//   ino: 140737488355889780,
//   size: 7,
//   blocks: 0,
//   atimeMs: 1618245914082.411,
//   mtimeMs: 1617905296801.911,
//   ctimeMs: 1617905296801.911,
//   birthtimeMs: 1617904969580.9304,
//   atime: 2021-04-12T16:45:14.082Z,
//   mtime: 2021-04-08T18:08:16.802Z,
//   ctime: 2021-04-08T18:08:16.802Z,
//   birthtime: 2021-04-08T18:02:49.581Z
// }
```
```javascript
const fs = require('fs');
const util = require('util');
const path = require('path');

const stat = util.promisify(fs.stat);

(async function statFn() {
  try {
    const data = await stat(path.resolve(process.cwd(), 'example.md'));
    console.log('stats:', data);
  } catch (error) {
    console.error('failed:', error);
  }
})();

// Prints the same Stats object as the .then() version above.
```
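For intuition, promisify can be approximated by hand. This is a simplified sketch (the real util.promisify also honors util.promisify.custom and other edge cases); addLater is a made-up callback-style function used only for the demo:

```javascript
function promisify(original) {
  return function (...args) {
    return new Promise((resolve, reject) => {
      // Append an error-first callback and bridge it to the promise.
      original.call(this, ...args, (err, value) => {
        if (err) reject(err);
        else resolve(value);
      });
    });
  };
}

// Usage with a callback-style function:
function addLater(a, b, cb) {
  setTimeout(() => cb(null, a + b), 10);
}

const addAsync = promisify(addLater);
addAsync(2, 3).then(sum => console.log(sum)); // 5
```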
  • util.types.isDate(value): returns true if value is a built-in Date instance.
```javascript
const util = require('util');

console.log(
  util.types.isDate(new Date())
);
// true
```
  • For richer utility functions there is lodash: a consistent, modular, high-performance JavaScript utility library.
npm install lodash --save

Global object

  • JavaScript has a special object, the global object (Global Object), which is accessible anywhere in a program.
    • All of its properties can be accessed from anywhere, as global variables.
    • It represents the outermost scope.
  • The global object depends on the host environment:
    • In browsers, the global object is window.
    • In nodejs, the global object is global.

Global variables

  • All properties of the global object are global variables.
  • In ECMAScript, a variable is global when it is one of the following:
  • (ECMAScript: European Computer Manufacturers Association Script)
    • a variable defined at the outermost scope;
    • a property of the global object;
    • an implicitly defined variable (assigned to without being declared).
  • Note:
    • In a browser, the outermost scope is the global scope, so a variable defined there is a global variable.
    • In node.js, each file is a module, so a variable defined at the top level of a file is scoped to that module and is not truly global.
  • Caution:
    • Implicit globals pollute the global namespace and make code harder to reason about.
    • Always declare variables explicitly.

__filename

  • __filename holds the absolute path, including the file name, of the script currently being executed.
  • It is not a true global: it is injected per module, so each file sees its own value.
  • path.basename(__filename) extracts just the file name.

__dirname

  • __dirname holds the absolute path of the directory containing the script currently being executed.
  • Like __filename, it is injected per module rather than being a true global.
  • It is equivalent to path.dirname(__filename).
  • process.cwd() is different: it returns the current working directory of the node process.

setTimeout(cb,ms)

  • Runs the callback cb once, after at least ms milliseconds.

clearTimeout

  • Cancels a timer created with setTimeout.

setInterval

  • Runs the callback repeatedly, once every ms milliseconds.

clearInterval

  • Cancels a timer created with setInterval.

console

  • Provides a simple console for printing to stdout and stderr (console.log, console.error, etc.).

process

  • process is a global object (a property of global) that can be accessed anywhere.
  • It provides information about, and control over, the current node.js process.
  • process.on lets you listen for process lifecycle events:
    • exit: emitted when the process is about to exit. Only synchronous work can be done here; the event loop has already stopped, so timers and other async callbacks scheduled in this handler will never run.
    • beforeExit: emitted when node empties its event loop and has no additional work to schedule. Asynchronous work can still be scheduled here, and doing so keeps the process alive.
      • Note: an explicit process.exit() call, or an uncaught exception, terminates the process without emitting beforeExit; node exits directly.
    • uncaughtException: emitted when an uncaught JavaScript exception bubbles all the way back to the event loop.
    • Signal events: emitted when the process receives a signal.
      • See the standard POSIX signal names, e.g. SIGINT, SIGUSR1.
```javascript
process.on('exit', function (code) {
  // This timer will never run: the event loop is already stopped.
  setTimeout(function () {
    console.log('This will not be printed');
  }, 0);
  console.log('About to exit with code:', code);
});

console.log('The program is running');
```
  • node exit codes:

| Code | Name | Meaning |
| --- | --- | --- |
| 1 | Uncaught Fatal Exception | An uncaught exception that was not handled by an uncaughtException handler or a domain. |
| 3 | Internal JavaScript Parse Error | Node's internal JavaScript source caused a parse error while bootstrapping; extremely rare, generally only during Node development. |
| 4 | Internal JavaScript Evaluation Failure | Node's internal JavaScript source failed to return a function value when evaluated. |
| 5 | Fatal Error | A fatal, unrecoverable error in V8; stderr typically shows a message prefixed FATAL ERROR. |
| 6 | Non-function Internal Exception Handler | There was an uncaught exception, but the internal exception handler was somehow set to a non-function. |
| 7 | Internal Exception Handler Run-Time Failure | An exception was thrown from within process.on('uncaughtException') or domain.on('error'). |
| 9 | Invalid Argument | An unknown option was specified, or an option requiring a value was given without one. |
| 10 | Internal JavaScript Run-Time Failure | Node's internal JavaScript threw an error while bootstrapping. |
| 12 | Invalid Debug Argument | The debug / debug-brk options were set with an invalid port number. |
| >128 | Signal Exits | Node was killed by a fatal signal such as SIGKILL or SIGHUP; the exit code is 128 plus the signal number, per Unix convention. |
  • process.stdout.write: writes a string to standard output (unlike console.log, it does not append a newline automatically).
```javascript
process.stdout.write('hello world!' + '\n');
```
  • process.argv: returns an array containing the command-line arguments the process was launched with. The first element is the node executable path, the second is the script path, and the rest are any extra arguments.
```javascript
console.log(process.argv);
```

```shell
node scripts.js
# [
#   'D:\\nodejs\\node.exe',
#   'D:\\my_frontend_files\\systematization\\jscripts.js'
# ]
```
  • process.execPath: returns the absolute pathname of the node executable that started the process.
```javascript
console.log(process.execPath);
```

```shell
node scripts.js
# 'D:\nodejs\node.exe'
```
  • process.platform: returns a string identifying the operating system platform (e.g. 'win32', 'linux', 'darwin').
```javascript
console.log(process.platform);
```

```shell
node scripts.js
# 'win32'
```

this

  • At the top level of a module, this points to exports (i.e. module.exports).
  • Verify:

```javascript
console.log(this);

module.exports.foo = 5;

console.log(this);

// {}
// { foo: 5 }
// this refers to the same object as exports / module.exports
```
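The exports/module.exports aliasing behind this can be simulated with plain objects (module_ and exports_ below are stand-ins for the real variables the CommonJS wrapper provides to each file):

```javascript
// Stand-ins for what the CommonJS wrapper gives every module:
const module_ = { exports: {} };
let exports_ = module_.exports; // `exports` starts as an alias

// Attaching properties through the alias works:
exports_.foo = 5;
console.log(module_.exports); // { foo: 5 }

// Reassigning the alias breaks the link; require() only ever returns
// module.exports, so anything on the new object is not exported.
exports_ = { bar: 6 };
console.log(module_.exports); // { foo: 5 }  (bar is lost)
```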

Why node.js is single-threaded and non-blocking

  • In a traditional web server, each incoming request is typically handled by its own thread; threads are expensive to create and switch between.
  • Most of a request's lifetime is usually spent waiting on I/O (disk, network), not computing.
  • node.js runs JavaScript on a single thread, so there is no per-request thread cost.
  • node.js performs I/O in a non-blocking, asynchronous way.
  • While an I/O operation is in flight, the single thread stays free to handle other work.

  • The event loop is what allows Node.js to perform non-blocking I/O despite JavaScript running on a single thread.
  • Whenever possible, Node.js offloads operations to the system kernel; since most modern kernels are multi-threaded, they can execute multiple operations in the background.
  • When one of these operations completes, the kernel tells Node.js, and the appropriate callback is added to a queue to eventually be executed on the single thread.
  • In this way, one thread can serve a large number of concurrent requests.
  • This event-loop design is at the core of Node.js, and is why it handles I/O-heavy workloads efficiently.

  • When Node.js starts, it:
    • initializes the event loop, processes the provided input script (which may make async API calls, schedule timers, or call process.nextTick()), and then begins processing the event loop.
  • How phases are processed:
    • Each phase has a FIFO queue of callbacks to execute.
    • When the event loop enters a phase, it performs any operations specific to that phase, then executes callbacks from that phase's queue until the queue is exhausted or a maximum number of callbacks has run.
    • It then moves on to the next phase.
  • Between iterations, node.js checks whether it is still waiting on any asynchronous I/O or timers; if not, it shuts down cleanly. Otherwise it runs the loop again, from timers through close callbacks, in the order shown below.
```
   ┌───────────────────────────┐
┌─>│           timers          │
│  └─────────────┬─────────────┘
│  ┌─────────────┴─────────────┐
│  │     pending callbacks     │
│  └─────────────┬─────────────┘
│  ┌─────────────┴─────────────┐
│  │       idle, prepare       │
│  └─────────────┬─────────────┘      ┌───────────────┐
│  ┌─────────────┴─────────────┐      │   incoming:   │
│  │           poll            │<─────┤  connections, │
│  └─────────────┬─────────────┘      │   data, etc.  │
│  ┌─────────────┴─────────────┐      └───────────────┘
│  │           check           │
│  └─────────────┬─────────────┘
│  ┌─────────────┴─────────────┐
└──┤      close callbacks      │
   └───────────────────────────┘
```

| Phase | What runs there |
| --- | --- |
| timers | callbacks scheduled by setTimeout() and setInterval() |
| pending callbacks | I/O callbacks deferred to the next loop iteration, e.g. some TCP errors |
| idle, prepare | used internally by node only |
| poll | retrieves new I/O events and executes I/O-related callbacks (almost all of them, except close callbacks, timer callbacks, and setImmediate()); node will block here when appropriate |
| check | setImmediate() callbacks |
| close callbacks | close callbacks, e.g. socket.on('close', ...) |

timers (the timers phase)

  • A timer specifies the threshold after which a provided callback may be executed, rather than the exact time it will be executed.
  • Timer callbacks run as early as they can be scheduled after the specified amount of time has passed.
  • However, operating-system scheduling or the running of other callbacks may delay them.
  • Example walkthrough:
    • A timeout is scheduled with a 100 ms threshold, and an async file read is started (assume fs.readFile takes 95 ms).
    • When the event loop enters the poll phase, its queue is empty, so it waits; after 95 ms, fs.readFile finishes and its callback, which takes 10 ms to run, is added to the poll queue and executed.
    • When that callback finishes, no more callbacks remain in the queue.
    • The event loop then sees that the threshold of the soonest timer (100 ms) has been reached, and wraps back to the timers phase to execute the timer's callback.
    • In total, the timer callback runs after about 105 ms, not 100.
```javascript
const fs = require('fs');

function someAsyncOperation(callback) {
  // Assume this takes 95ms to complete
  fs.readFile('/path/to/file', callback);
}

const timeoutScheduled = Date.now();

setTimeout(() => {
  const delay = Date.now() - timeoutScheduled;
  console.log(`${delay}ms have passed since I was scheduled`);
}, 100);

// someAsyncOperation's callback takes 10ms to complete
someAsyncOperation(() => {
  const startCallback = Date.now();
  while (Date.now() - startCallback < 10) {
    // busy-wait for 10ms
  }
});
```

pending callbacks

  • This phase executes callbacks for some system operations, such as TCP errors. For example, if a TCP socket receives ECONNREFUSED when attempting to connect, the error is reported in this phase.

poll

  • The poll phase has two main functions:
    1. calculating how long it should block and poll for I/O;
    2. processing events in the poll queue.
  • When the event loop enters the poll phase and there are no timers scheduled, one of two things happens:
    1. If the poll queue is not empty, the event loop iterates through its queue of callbacks, executing them synchronously until the queue is exhausted or a system-dependent limit is reached.
    2. If the poll queue is empty, one of two more things happens:
    2.1 If scripts have been scheduled by setImmediate(), the event loop ends the poll phase and continues to the check phase to execute them.
    2.2 If no scripts have been scheduled by setImmediate(), the event loop waits for callbacks to be added to the queue, then executes them immediately.
  • Once the poll queue is empty, the event loop checks for timers whose time thresholds have been reached: if one or more timers are ready, the event loop wraps back to the timers phase to execute their callbacks.

check

  • This phase allows callbacks registered with setImmediate() to be executed immediately after the poll phase has completed: if the poll phase becomes idle and setImmediate() callbacks have been queued, the event loop continues to the check phase rather than waiting.
  • The check phase runs right after the poll phase.
  • setImmediate() is actually a special timer that runs in a separate phase of the event loop:
    • it executes once the current poll phase completes;
    • it uses a libuv API that schedules callbacks to execute after the poll phase.
  • Generally, as code executes, the event loop eventually reaches the poll phase, where it waits for incoming connections, requests, and so on.
  • However, if a callback has been scheduled with setImmediate() and the poll phase becomes idle, the loop ends the poll phase and moves on to check instead of waiting for poll events.

close callbacks

  • If a socket or handle is closed abruptly (e.g. via socket.destroy()), the 'close' event is emitted in this phase.
  • Otherwise, the 'close' event is emitted via process.nextTick().

setImmediate vs setTimeout

  • setImmediate() and setTimeout() are similar, but they behave differently depending on when they are called.
  • setImmediate() is designed to execute a script once the current poll phase completes.
  • setTimeout() schedules a script to be run after a minimum threshold (in milliseconds) has elapsed.
  • The order in which they execute varies depending on the context in which they are called:
    • If both are called from within the main module (outside any I/O cycle):
      • the order is nondeterministic; it depends on the performance of the process (which can be affected by other applications running on the machine).
    • If both are called from within an I/O cycle (i.e. inside an I/O callback):
      • the setImmediate() callback always runs first.
  • Why is the main-module case nondeterministic?
    • Short answer: it depends on whether the setTimeout timer is already due when the loop first enters the timers phase.
  • What happens when both are scheduled from the main module?
  • In detail:
    • setTimeout(fn, 0) registers a timer with a minimum threshold (clamped to at least 1 ms); its callback runs in the timers phase once that threshold has elapsed.
    • setImmediate() registers a callback that runs in the check phase of the current loop iteration.
  • After the main script finishes, the event loop starts with the timers phase.
  • Whether the timer fires on this first pass depends on how long startup took:
    • if entering the timers phase took longer than the timer's threshold, the timer is already due, so the setTimeout callback runs before the check phase and setTimeout() fires first;
    • if the loop reaches the timers phase before the threshold has elapsed, the timer is skipped for this iteration; the setImmediate() callback runs first in the check phase, and the timeout fires on the next iteration.

process.nextTick

  • process.nextTick() is technically not part of the event loop. Instead, the nextTickQueue is processed after the current operation completes, regardless of which phase the event loop is in.

  • Comparing process.nextTick() and setImmediate():

    • process.nextTick(): fires immediately, in the same phase, once the current operation finishes.
    • setImmediate(): fires on the following iteration (tick) of the event loop, in the check phase.
  • The nextTickQueue is drained at every phase transition:

```
   timers
     │  ← nextTickQueue
   pending callbacks
     │  ← nextTickQueue
   idle, prepare
     │  ← nextTickQueue
   poll
     │  ← nextTickQueue
   check
     │  ← nextTickQueue
   close callbacks
```

Microtasks

  • In node, the microtask queue mainly holds two kinds of callbacks:
    1. process.nextTick() callbacks
    2. promise then() callbacks
  • Execution timing: microtasks run as soon as the current operation completes, before the event loop moves on (i.e. between phases).
  • When process.nextTick() and promise.then() callbacks are both pending:
    • the process.nextTick queue is always drained first, then the promise microtask queue.
    • In short: process.nextTick > promise.then

A classic interview question (work out the output)

```javascript
async function async1() {
  console.log('async1 start');
  await async2();
  console.log('async1 end');
}

async function async2() {
  console.log('async2');
}

console.log('script start');

setTimeout(function () {
  console.log('setTimeout0');
  setTimeout(function () {
    console.log('setTimeout1');
  }, 0);
  setImmediate(() => console.log('setImmediate'));
}, 0);

process.nextTick(() => console.log('nextTick'));

async1();

new Promise(function (resolve) {
  console.log('promise1');
  resolve();
  console.log('promise2');
}).then(function () {
  console.log('promise3');
});

console.log('script end');
```