read a directory and write the output into a file using streams

Hello,

As a small personal challenge, I am trying to use streams to:

1. read a directory (__dirname here)
2. transform the output to upper case
3. write the transformed output into a file

I found this solution, but I doubt that it is best practice... Could someone confirm or correct it?

const fs = require('fs')
const { Transform, pipeline, Readable } = require('stream')

const streamUpper = () => {
  return new Transform({
    transform(chunk, enc, next) {
      next(null, chunk.toString().toUpperCase())
    }
  })
}

async function readDirStream() {
  const dir = await fs.promises.opendir(__dirname)

  for await (const dirent of dir) {
    await require('util').promisify(pipeline)(
      Readable.from(dirent.name + '\n'),
      streamUpper(),
      fs.createWriteStream('./fileList.txt', {flags:'a'})
    )
  }
}
readDirStream().catch(e => console.error(e))
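
For comparison, here is a single-pipeline variant of the same idea, just as a sketch: it assumes a Node version where pipeline accepts an async iterable as its source (roughly 13.10+), and the names dirNames / readDirStreamOnce are only for illustration.

const fs = require('fs')
const { Transform, pipeline } = require('stream')
const { promisify } = require('util')

// same upper-casing transform as above
const streamUpper = () => {
  return new Transform({
    transform(chunk, enc, next) {
      next(null, chunk.toString().toUpperCase())
    }
  })
}

// async generator yielding one line per directory entry
async function* dirNames(path) {
  const dir = await fs.promises.opendir(path)
  for await (const dirent of dir) yield dirent.name + '\n'
}

async function readDirStreamOnce() {
  await promisify(pipeline)(
    dirNames(__dirname),
    streamUpper(),
    fs.createWriteStream('./fileList.txt', {flags:'a'})
  )
}
readDirStreamOnce().catch(e => console.error(e))

This way the destination file is opened once instead of once per entry.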

Comments

  • djedje Posts: 20

    I just noticed that the Readable.from() (line 17) is unnecessary; this is enough:

    await require('util').promisify(pipeline)(
      dirent.name + '\n',
      streamUpper(),
      fs.createWriteStream('./fileList.txt', {flags:'a'})
    )
    

    So it made me try the course example "Reading Directories (.cont)", as it has the same Readable.from() consuming the output from opendir(). And it works without Readable.from() as well. I guess it makes sense since opendir() already returns a stream, am I right?

    Example from the course without Readable.from():

    opendir(__dirname, (err, dir) => {
        if (err) {
          res.statusCode = 500
          res.end('Server Error')
          return
        }
        // const dirStream = Readable.from(dir)
        const entryStream = createEntryStream()
        res.setHeader('Content-Type', 'application/json')
        pipeline(dir, entryStream, res, (err) => {
          if (err) console.error(err)
        })
      })
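
    To double-check that, here is a minimal self-contained sketch (just my test, assuming a recent Node where pipeline accepts async iterables as a source, roughly 13.10+): it passes the Dir object returned by opendir() straight into pipeline, with an object-mode transform to pull out the names.

    const { opendir, createWriteStream } = require('fs')
    const { pipeline, Transform } = require('stream')

    // dirents arrive as objects, so the transform must accept object-mode input
    const namesOnly = new Transform({
      writableObjectMode: true,
      transform(dirent, enc, next) {
        next(null, dirent.name + '\n')
      }
    })

    opendir(__dirname, (err, dir) => {
      if (err) {
        console.error(err)
        return
      }
      // dir is consumed directly as the pipeline source
      pipeline(dir, namesOnly, createWriteStream('./fileList.txt', {flags:'a'}), (err) => {
        if (err) console.error(err)
      })
    })

    Strictly speaking, dir seems to be an async iterable rather than a readable stream, but pipeline accepts async iterables as a source in recent Node, which is why this works.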
    
  • David

    • Avoid doing inline require - always put require at the top
    • Readable.from is a convenient way to turn a data structure (string, array) into a readable stream
    • It's not needed in your directory processing example, and really neither is the transform stream (see the simplified sketch after the TCP example below)
    • A better example to use with your transform stream would be a TCP socket:
    const { Transform, pipeline } = require('stream')
    
    const streamUpper = () => {
      return new Transform({
        transform(chunk, enc, next) {
          next(null, chunk.toString().toUpperCase())
        }
      })
    }
    
    const { createServer } = require('net')
    
    createServer((socket) => {
      pipeline(socket, streamUpper(), socket, (err) => {
        if (err) console.error(err)
        else console.log('socket closed')
      })
    }).listen(9999)
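
    (You can try the socket example with any TCP client, for instance nc localhost 9999, and whatever you type comes back upper-cased.)

    As for the directory listing itself, here is a minimal sketch of the simplification mentioned above: no Readable.from and no transform, just one write stream, keeping the same fileList.txt output as your code (backpressure handling left out for brevity).

    const fs = require('fs')

    async function writeDirList() {
      const out = fs.createWriteStream('./fileList.txt', {flags:'a'})
      for await (const dirent of await fs.promises.opendir(__dirname)) {
        // upper-case with plain string methods; no stream machinery needed here
        out.write(dirent.name.toUpperCase() + '\n')
      }
      out.end()
    }

    writeDirList().catch(console.error)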
    

    Readable.from would be useful for testing your stream:

    const { Transform, Readable } = require('stream')
    const assert = require('assert').strict
    const streamUpper = () => {
      return new Transform({
        transform(chunk, enc, next) {
          next(null, chunk.toString().toUpperCase())
        }
      })
    }

    const upper = streamUpper()
    Readable.from(['test-a', 'test-b']).pipe(upper)
    upper.once('data', (first) => {
      // data chunks are Buffers here, so convert before comparing
      assert.equal(first.toString(), 'TEST-A')
      upper.once('data', (second) => {
        assert.equal(second.toString(), 'TEST-B')
      })
    })
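
    Since transform streams are also async iterables, the same check can be written without nesting the 'data' listeners; a sketch:

    const { Transform, Readable } = require('stream')
    const assert = require('assert').strict

    const upper = new Transform({
      transform(chunk, enc, next) {
        next(null, chunk.toString().toUpperCase())
      }
    })

    async function run() {
      Readable.from(['test-a', 'test-b']).pipe(upper)
      const chunks = []
      // consume the transform's output via async iteration
      for await (const chunk of upper) chunks.push(chunk.toString())
      assert.deepEqual(chunks, ['TEST-A', 'TEST-B'])
    }

    run().catch(console.error)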
    
    
  • djedje Posts: 20

    Hello David,

    Thank you for the answer, I will work on that. Thanks again.
