Chapter 12 Readable streams: how do we control the read() function?

In Chapter 12, Streams, in the Readable Streams section, there is an example of a Readable stream:
```
'use strict'
const { Readable } = require('stream')

const createReadStream = () => {
  // what if the `data` is a looooooong serialized db or a 100000 length array
  const data = ['some', 'data', 'to', 'read']
  return new Readable({
    read () {
      if (data.length === 0) this.push(null)
      else this.push(data.shift())
    }
  })
}

const readable = createReadStream()
readable.on('data', (data) => { console.log('got data', data) })
readable.on('end', () => { console.log('finished reading') })
```
In the read() function there is a condition: we extract one item from the data array on each call, and when there are no items left we push null so that the end event is emitted.
I have some questions regarding that approach, and I'd like to know how we would handle some other cases:
- Is this the right approach to keep track of the remaining data in an array, by extracting one item each time until there are none left?
- Also, from a performance point of view, if that array has 10000 items, we will emit the 'data' event 10000 times!
- How do we keep track of the remaining data if the data is a large serialised database (a string)? What condition should we put into read() so we know when to emit the 'data' event and when to emit the 'end' event?
Thank you
Best Answer
-
hey @theodoros
Code is always about context; performance isn't always priority #1 - and this is coming from someone who has written, spoken and consulted extensively on performance in Node. This code is optimized for communication, for teaching the general concepts and API of streams. With that in mind:
- Typically readable streams are for connecting with some kind of I/O; transmitting in-memory data isn't a big use case beyond test code and example code. Outside of explaining the API, a better way to do this is to just use Readable.from(array) - then you have your readable stream emitting data and there's no need to be concerned about the details (see the sketch after this list).
- Performance isn't a concern here; in fact, any time you emit in-memory data from a stream (e.g. in tests) performance tends not to be a concern. On a side note though, streams improve performance for I/O scenarios, particularly where you have a large amount of data - they do not improve CPU compute performance. By regulating and processing incremental data, they support an optimal pattern for handling I/O in specific circumstances.
- That depends entirely on context. Consider TCP: it's a protocol with the ability to indicate (among other things) connecting and disconnecting. A stream around TCP (e.g. a net socket) would know when to end based on a protocol instruction. If a database supports streaming, its drivers will know how to interpret the end of the stream, and a streaming implementation around those drivers would take that instruction and turn it into a push(null) to end the stream.
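For reference, a minimal sketch of the Readable.from(array) approach mentioned in the first point (the array contents are just illustrative):

```
'use strict'
const { Readable } = require('stream')

// Readable.from builds a readable stream from any iterable (or async iterable),
// so there is no need to implement read() or to push(null) manually
const readable = Readable.from(['some', 'data', 'to', 'read'])

readable.on('data', (data) => { console.log('got data', data) })
readable.on('end', () => { console.log('finished reading') })
```

Note that Readable.from defaults to object mode, so each array element is emitted as-is rather than being converted to a Buffer.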
@krave, for your questions:
- The default high watermarks of 16kb (write) and 64kb (read) tend to be fine; beyond that it's a fine-tuning exercise that's highly dependent on the context (see the sketch after this list).
- That's a huge topic; probably the most trivial approach would be a stream wrapper around an existing streaming media processor, e.g. ffmpeg. This project looks interesting: https://github.com/amishshah/prism-media
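As a rough illustration of adjusting the high watermark, here is a minimal sketch using fs.createReadStream; the file path and the 1 MiB value are purely hypothetical, not recommendations:

```
'use strict'
const fs = require('fs')

// highWaterMark controls how much data the stream buffers internally,
// which in turn determines the typical chunk size delivered to consumers
const readable = fs.createReadStream('./some-large.file', { // hypothetical path
  highWaterMark: 1024 * 1024 // 1 MiB chunks instead of the 64 KiB default for fs streams
})

readable.on('data', (chunk) => { console.log('chunk size:', chunk.length) })
readable.on('end', () => { console.log('finished reading') })
```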
Answers
-
Hi @theodoros, I would like to join this conversation because I have related confusion too.
Some thoughts on your questions:
- Keeping an index that points to where the last item was read is my approach to such tasks. I think that would be more performant.
- The size of each push is under your control. For example, you can push 10 items each time (see the sketch after the code below).
- As for the string scenario, I would slice the large string into pieces and keep a record of the index from which the stream last read, then increment that index on each read. If the index points past the end of the string, I stop right away. Here is my code:
```
'use strict'
const { Readable } = require('stream')

const createReadStream = () => {
  // what if the `data` is a looooooong serialized db or a 100000 length array
  const data = '123456789'
  let index = 0
  const step = 6
  return new Readable({
    read () {
      if (data.length < index) {
        this.push(null)
      } else {
        this.push(data.slice(index, index + step))
        index = index + step
      }
    }
  })
}

const readable = createReadStream()
readable.on('data', (data) => { console.log('got data:', data.toString()) })
readable.on('end', () => { console.log('finished reading') })
```
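On the second point above (pushing several items per read() call), here is a minimal sketch; createBatchedReadStream and batchSize are hypothetical names used only for illustration:

```
'use strict'
const { Readable } = require('stream')

// push up to `batchSize` items from `items` on each read() call
const createBatchedReadStream = (items, batchSize = 10) => {
  let index = 0
  return new Readable({
    objectMode: true, // emit raw array items rather than strings/buffers
    read () {
      if (index >= items.length) {
        this.push(null) // nothing left, signal the end of the stream
        return
      }
      const end = Math.min(index + batchSize, items.length)
      while (index < end) {
        this.push(items[index++])
      }
    }
  })
}

const readable = createBatchedReadStream(Array.from({ length: 100 }, (_, i) => i))
readable.on('data', (data) => { console.log('got data', data) })
readable.on('end', () => { console.log('finished reading') })
```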
My questions:
1. How big should the appropriate chunk size be? I am referring to something like the step in my code above, particularly when chunks are being sent over the network.
2. How do we stream video data, for example live video streaming? Are there any great references or tutorials?
-
Oh, didn't know that project before. Thanks!
-
np