# readable-stream

This module provides the new Stream base classes introduced in Node v0.10, for use in Node v0.8. You can use it to have programs that have to work with node v0.8, while being forward-compatible for v0.10 and beyond. When you drop support for v0.8, you can remove this module, and only use the native streams.

This is almost exactly the same codebase as appears in Node v0.10. However, note that the exported object is actually the Readable class. Other than that, the API is the same as `require('stream')` in v0.10, so the API docs are reproduced below.

switch into "old mode" when a 'data' event handler is added, or when simpler, but also less powerful and less useful. Note: This function should NOT be called directly.

## Compatibility

In earlier versions of Node, the Readable stream interface was simpler, but also less powerful and less useful.

In Node v0.10, the Readable class described below was added. For backwards compatibility with older Node programs, Readable streams switch into "old mode" when a 'data' event handler is added, or when the pause() or resume() methods are called. The effect is that, even if you are not using the new read() method and 'readable' event, you no longer have to worry about losing 'data' chunks.

Most programs will continue to function normally. However, this introduces an edge case when no 'data' handler is ever added and pause() and resume() are never called, so that nothing ever consumes the data. For example, consider the following code:
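
A sketch of the problematic server described above, reconstructed from the comments that survive in the example (the port number is only illustrative):

```javascript
var net = require('net');

// WARNING!  BROKEN!
net.createServer(function(socket) {

  // we add an 'end' method, but never consume the data
  socket.on('end', function() {
    // It will never get here.
    socket.end('I got your message (but didnt read it)\n');
  });

}).listen(1337);
```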

In versions of node prior to v0.10, the incoming message data would be simply discarded. However, in Node v0.10 and beyond, the socket will remain paused forever.

The workaround in this situation is to call the resume() method to trigger "old mode" behavior and start the flow of data:
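
Again as a sketch, assuming the same server shape as above, with the incoming data simply discarded:

```javascript
var net = require('net');

net.createServer(function(socket) {

  socket.on('end', function() {
    socket.end('I got your message (but didnt read it)\n');
  });

  // start the flow of data, discarding it.
  socket.resume();

}).listen(1337);
```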

In addition to new Readable streams switching into old-mode, pre-v0.10 style streams can be wrapped in a Readable class using the wrap() method, described below.

## Readable Stream

A Readable Stream has the following methods, members, and events.

Note that stream.Readable is an abstract class designed to be extended with an underlying implementation of the _read(size) method. (See below.)

### new stream.Readable([options])

In classes that extend the Readable class, make sure to call the constructor so that the buffering settings can be properly initialized.

### readable._read(size)

Note: This function should NOT be called directly. It should be implemented by child classes, and called by the internal Readable class methods only.

All Readable stream implementations must provide a _read method to fetch data from the underlying resource.

This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you are expected to override this method in your own extension classes.

When data is available, put it into the read queue by calling readable.push(chunk). If push returns false, then you should stop reading. When _read is called again, you should start pushing more data.

The size argument is advisory. Implementations where a "read" is a single call that returns data can use this to know how much data to fetch. Implementations where that is not relevant, such as TCP or TLS, may ignore this argument, and simply provide data whenever it becomes available. There is no need, for example, to "wait" until size bytes are available before calling stream.push(chunk).
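
As an illustration (not an example from the original docs), a minimal Readable subclass might look like the sketch below. It uses the native `require('stream').Readable`; with this module you would use the exported Readable class from `require('readable-stream')` instead.

```javascript
var Readable = require('stream').Readable;
var util = require('util');

// A hypothetical stream that emits the numbers 1 through 5, then ends.
function Counter(options) {
  Readable.call(this, options);
  this._n = 0;
}
util.inherits(Counter, Readable);

Counter.prototype._read = function(size) {
  // the advisory size argument is ignored here
  this._n += 1;
  if (this._n > 5)
    this.push(null);           // no more data
  else
    this.push(this._n + '\n'); // add a chunk to the read queue
};

new Counter().pipe(process.stdout);
```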

### readable.push(chunk)

Returns a Boolean indicating whether or not more pushes should be performed.

Note: This function should be called by Readable implementors, NOT by consumers of Readable subclasses.

The _read() function will not be called again until at least one push(chunk) call is made. If no data is available, then you MAY call push('') (an empty string) to allow a future _read call, without adding any data to the queue.

The Readable class works by putting data into a read queue to be pulled out later by calling the read() method when the 'readable' event fires.

The push() method will explicitly insert some data into the read queue. If it is called with null then it will signal the end of the data.

In some cases, you may be wrapping a lower-level source which has some sort of pause/resume mechanism, and a data callback. In those cases, you could wrap the low-level source object by doing something like this:
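
A sketch of that wrapping, following the comments preserved from the original example; `source` here stands in for the low-level object being wrapped:

```javascript
var Readable = require('stream').Readable;

// source is an object with readStop() and readStart() methods,
// an `ondata` member that gets called when it has data, and
// an `onend` member that gets called when the data is over.

var stream = new Readable();

source.ondata = function(chunk) {
  // if push() returns false, then we need to stop reading from source
  if (!stream.push(chunk))
    source.readStop();
};

source.onend = function() {
  stream.push(null);
};

// _read will be called when the stream wants to pull more data in.
// the advisory size argument is ignored in this case.
stream._read = function(size) {
  source.readStart();
};
```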

### readable.unshift(chunk)

This is the corollary of readable.push(chunk). Rather than putting the data at the end of the read queue, it puts it at the front of the read queue.

This is useful in certain use-cases where a stream is being consumed by a parser, which needs to "un-consume" some data that it has optimistically pulled out of the source.
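
The original example is a parser for a simple data protocol: the "header" is a JSON object, followed by 2 \n characters, and then a message body. The sketch below reconstructs it in condensed form (it assumes text data and omits error handling). Note that this can be done more simply as a Transform stream; see the revisited example further down.

```javascript
var Readable = require('stream').Readable;
var util = require('util');

function SimpleProtocol(source, options) {
  Readable.call(this, options);
  this._inBody = false;
  this._rawHeader = [];
  this.header = null;

  // source is a readable stream, such as a socket or file.
  this._source = source;

  var self = this;
  source.on('end', function() {
    self.push(null);
  });

  // give it a kick whenever the source is readable.
  // read(0) will not consume any bytes.
  source.on('readable', function() {
    self.read(0);
  });
}
util.inherits(SimpleProtocol, Readable);

SimpleProtocol.prototype._read = function(n) {
  if (!this._inBody) {
    var chunk = this._source.read();

    // if the source doesn't have data, we don't have data yet.
    if (chunk === null)
      return this.push('');

    this._rawHeader.push(chunk);
    var raw = Buffer.concat(this._rawHeader).toString();
    var split = raw.indexOf('\n\n');

    if (split === -1) {
      // still waiting for the \n\n; try again later.
      this.push('');
    } else {
      this._inBody = true;
      this.header = JSON.parse(raw.slice(0, split));

      // now, because we got some extra data, unshift the rest
      // back into the read queue so that our consumer will see it.
      this.unshift(new Buffer(raw.slice(split + 2)));

      // and let them know that we are done parsing the header.
      this.emit('header', this.header);
    }
  } else {
    // from there on, just provide the data to our consumer.
    // careful not to push(null), since that would indicate EOF.
    var chunk = this._source.read();
    if (chunk)
      this.push(chunk);
  }
};
```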

### readable.wrap(stream)

If you are using an older Node library that emits 'data' events and has a pause() method that is advisory only, then you can use the wrap() method to create a Readable stream that uses the old stream as its data source.
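
For example (a sketch; the module path and OldReader class are placeholders for whatever old-style stream you are consuming):

```javascript
var Readable = require('stream').Readable;
var OldReader = require('./old-api-module.js').OldReader; // placeholder

var oreader = new OldReader();
var myReader = new Readable().wrap(oreader);

myReader.on('readable', function() {
  myReader.read(); // etc.
});
```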

### Event: 'readable'

When there is data ready to be consumed, this event will fire. When this event emits, call the read() method to consume the data.

### Event: 'end'

Emitted when the stream has received an EOF (FIN in TCP terminology). Indicates that no more 'data' events will happen. If the stream is also writable, it may be possible to continue writing.

### Event: 'data'

The 'data' event emits either a Buffer (by default) or a string if setEncoding() was used.

Note that adding a 'data' event listener will switch the Readable stream into "old mode", where data is emitted as soon as it is available, rather than waiting for you to call read() to consume it.

### Event: 'error'

Emitted if there was an error receiving data.

### Event: 'close'

Emitted when the underlying resource (for example, the backing file descriptor) has been closed. Not all streams will emit this.

### readable.setEncoding(encoding)

Makes the 'data' event emit a string instead of a Buffer. encoding can be 'utf8', 'utf16le' ('ucs2'), 'ascii', or 'hex'. The encoding can also be set by specifying an encoding field to the constructor.

### readable.read([size])

Note: This function SHOULD be called by Readable stream users.

Call this method to consume data once the 'readable' event is emitted.

The size argument will set a minimum number of bytes that you are interested in. If not set, then the entire content of the internal buffer is returned.

If there is no data to consume, or if there are fewer bytes in the internal buffer than the size argument, then null is returned, and a future 'readable' event will be emitted when more is available.

Calling stream.read(0) will always return null, and will trigger a refresh of the internal buffer, but otherwise be a no-op.
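
A short consumption sketch (the file name is only illustrative): wait for 'readable', then call read() until it returns null.

```javascript
var fs = require('fs');
var rs = fs.createReadStream('example.txt');

rs.on('readable', function() {
  var chunk;
  // read() returns null once the internal buffer is drained;
  // another 'readable' event fires when more data is available.
  while (null !== (chunk = rs.read())) {
    console.log('got %d bytes of data', chunk.length);
  }
});

rs.on('end', function() {
  console.log('there will be no more data.');
});
```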

### readable.pipe(destination, [options])

Connects this readable stream to destination WriteStream. Incoming data on this stream gets written to destination. This properly manages back-pressure so that a slow destination will not be overwhelmed by a fast readable stream.

This function returns the destination stream.

For example, emulating the Unix cat command:
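
In code, that is just one pipe call:

```javascript
process.stdin.pipe(process.stdout);
```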

By default end() is called on the destination when the source stream emits end, so that destination is no longer writable. Pass { end: false } as options to keep the destination stream open.

This keeps writer open so that "Goodbye" can be written at the end:
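
A sketch, where reader and writer stand in for any Readable/Writable pair:

```javascript
reader.pipe(writer, { end: false });

reader.on('end', function() {
  writer.end('Goodbye\n');
});
```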

Note that process.stderr and process.stdout are never closed until the process exits, regardless of the specified options.

### readable.unpipe([destination])

Undo a previously established pipe(). If no destination is provided, then all previously established pipes are removed.

### readable.pause()

Ceases the flow of data. No 'data' events are emitted while the stream is in a paused state.

### readable.resume()

Resumes the incoming 'data' events after a pause().

Note that calling pause() or resume(), like adding a 'data' event listener, switches the readable stream into "old mode", where data is emitted using a 'data' event rather than being buffered for consumption via the read() method.

## Writable Stream

A Writable Stream has the following methods, members, and events.

Note that stream.Writable is an abstract class designed to be extended with an underlying implementation of the _write(chunk, encoding, cb) method. (See below.)

### new stream.Writable([options])

In classes that extend the Writable class, make sure to call the constructor so that the buffering settings can be properly initialized.

### writable._write(chunk, encoding, callback)

Note: This function MUST NOT be called directly. It should be implemented by child classes, and called by the internal Writable class methods only.

All Writable stream implementations must provide a _write method to send data to the underlying resource.

Call the callback using the standard callback(error) pattern to signal that the write completed successfully or with an error.

If the decodeStrings flag is set in the constructor options, then chunk may be a string rather than a Buffer, and encoding will indicate the sort of string that it is. This is to support implementations that have an optimized handling for certain string data encodings. If you do not explicitly set the decodeStrings option to false, then you can safely ignore the encoding argument, and assume that chunk will always be a Buffer.

This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you are expected to override this method in your own extension classes.
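
As an illustration (not from the original docs), a hypothetical Writable subclass that simply logs what it receives:

```javascript
var Writable = require('stream').Writable;
var util = require('util');

function ConsoleWriter(options) {
  Writable.call(this, options);
}
util.inherits(ConsoleWriter, Writable);

ConsoleWriter.prototype._write = function(chunk, encoding, callback) {
  // chunk is a Buffer unless decodeStrings: false was passed
  console.log('writing %d bytes', chunk.length);
  // signal completion, or pass an Error as the first argument
  callback();
};

var writer = new ConsoleWriter();
writer.write('hello ');
writer.end('world\n');
```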

### writable.write(chunk, [encoding], [callback])

Writes chunk to the stream. Returns true if the data has been flushed to the underlying resource. Returns false to indicate that the buffer is full, and the data will be sent out in the future. The 'drain' event will indicate when the buffer is empty again.

The specifics of when write() will return false are determined by the highWaterMark option provided to the constructor.

### writable.end([chunk], [encoding], [callback])

Call this method to signal the end of the data being written to the stream.

### Event: 'drain'

Emitted when the stream's write queue empties and it's safe to write without buffering again. Listen for it when stream.write() returns false (see the sketch after this event listing).

### Event: 'close'

Emitted when the underlying resource (for example, the backing file descriptor) has been closed. Not all streams will emit this.

### Event: 'finish'

When end() is called and there are no more chunks to write, this event is emitted.

### Event: 'pipe'

Emitted when the stream is passed to a readable stream's pipe method.

### Event: 'unpipe'

Emitted when a previously established pipe() is removed using the source Readable stream's unpipe() method.
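
A sketch of honoring back-pressure with 'drain' (the helper function and the million-write workload are made up for illustration): keep writing while write() returns true, and wait for 'drain' before continuing once it returns false.

```javascript
// Write `data` to the writer a million times, pausing whenever
// the internal buffer is full.
function writeOneMillionTimes(writer, data, encoding, callback) {
  var i = 1000000;
  write();
  function write() {
    var ok = true;
    while (i > 0 && ok) {
      i--;
      if (i === 0) {
        // last chunk: hand the callback to write()
        ok = writer.write(data, encoding, callback);
      } else {
        ok = writer.write(data, encoding);
      }
    }
    if (i > 0) {
      // the buffer filled up; wait for it to empty before writing more
      writer.once('drain', write);
    }
  }
}
```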

## Duplex Stream

A "duplex" stream is one that is both Readable and Writable, such as a TCP socket connection.

Note that stream.Duplex is an abstract class designed to be extended with an underlying implementation of the _read(size) and _write(chunk, encoding, callback) methods as you would with a Readable or Writable stream class.

Since JavaScript doesn't have multiple prototypal inheritance, this class prototypally inherits from Readable, and then parasitically from Writable. It is thus up to the user to implement both the lowlevel _read(n) method as well as the lowlevel _write(chunk, encoding, cb) method on extension duplex classes.

### new stream.Duplex(options)

In classes that extend the Duplex class, make sure to call the constructor so that the buffering settings can be properly initialized.

## Transform Stream

A "transform" stream is a duplex stream where the output is causally connected in some way to the input, such as a zlib stream or a crypto stream.

There is no requirement that the output be the same size as the input, the same number of chunks, or arrive at the same time. For example, a Hash stream will only ever have a single chunk of output, which is provided when the input is ended. A zlib stream will produce output that is either much smaller or much larger than its input.

Rather than implement the _read() and _write() methods, Transform classes must implement the _transform() method, and may optionally also implement the _flush() method. (See below.)

### new stream.Transform([options])

In classes that extend the Transform class, make sure to call the constructor so that the buffering settings can be properly initialized.

### transform._transform(chunk, encoding, callback)

Note: This function MUST NOT be called directly. It should be implemented by child classes, and called by the internal Transform class methods only.

All Transform stream implementations must provide a _transform method to accept input and produce output.

_transform should do whatever has to be done in this specific Transform class, to handle the bytes being written, and pass them off to the readable portion of the interface. Do asynchronous I/O, process things, and so on.

Call transform.push(outputChunk) 0 or more times to generate output from this input chunk, depending on how much data you want to output as a result of this chunk.

Call the callback function only when the current chunk is completely consumed. Note that there may or may not be output as a result of any particular input chunk.

This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you are expected to override this method in your own extension classes.
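
As an illustration (not from the original docs), a hypothetical Transform that upper-cases whatever passes through it:

```javascript
var Transform = require('stream').Transform;
var util = require('util');

function Upcase(options) {
  Transform.call(this, options);
}
util.inherits(Upcase, Transform);

Upcase.prototype._transform = function(chunk, encoding, callback) {
  // push zero or more output chunks for this input chunk...
  this.push(chunk.toString().toUpperCase());
  // ...then call the callback once the chunk is fully consumed.
  callback();
};

process.stdin.pipe(new Upcase()).pipe(process.stdout);
```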

### transform._flush(callback)

Note: This function MUST NOT be called directly. It MAY be implemented by child classes, and if so, will be called by the internal Transform class methods only.

In some cases, your transform operation may need to emit a bit more data at the end of the stream. For example, a Zlib compression stream will store up some internal state so that it can optimally compress the output. At the end, however, it needs to do the best it can with what is left, so that the data will be complete.

In those cases, you can implement a _flush method, which will be called at the very end, after all the written data is consumed, but before emitting end to signal the end of the readable side. Just like with _transform, call transform.push(chunk) zero or more times, as appropriate, and call callback when the flush operation is complete.

This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you are expected to override this method in your own extension classes.

### Example: SimpleProtocol parser, revisited

The example above of a simple protocol parser can be implemented much more simply by using the higher level Transform stream class.

In this example, rather than providing the input as an argument, it would be piped into the parser, which is a more idiomatic Node stream approach:
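
A condensed sketch, under the same simplifying assumptions as the Readable version above:

```javascript
var Transform = require('stream').Transform;
var util = require('util');

function SimpleProtocol(options) {
  Transform.call(this, options);
  this._inBody = false;
  this._rawHeader = [];
  this.header = null;
}
util.inherits(SimpleProtocol, Transform);

SimpleProtocol.prototype._transform = function(chunk, encoding, done) {
  if (!this._inBody) {
    this._rawHeader.push(chunk);
    var raw = Buffer.concat(this._rawHeader).toString();
    var split = raw.indexOf('\n\n');
    if (split !== -1) {
      this._inBody = true;
      this.header = JSON.parse(raw.slice(0, split));

      // and let them know that we are done parsing the header.
      this.emit('header', this.header);

      // now, because we got some extra data, emit this first.
      this.push(raw.slice(split + 2));
    }
  } else {
    // from there on, just provide the data to our consumer as-is.
    this.push(chunk);
  }
  done();
};

var parser = new SimpleProtocol();
// source.pipe(parser);
// Now parser is a readable stream that will emit 'header'
// with the parsed header data.
```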

## PassThrough Stream

This is a trivial implementation of a Transform stream that simply passes the input bytes across to the output. Its purpose is mainly for examples and testing, but there are occasionally use cases where it can come in handy.
