
I have a file path to a .txt file that contains a number of objects. I'm trying to write a JavaScript function that takes this file path as an argument and lets me access and iterate over these objects, but everything I've tried and found online hasn't worked. Is there a technique to accomplish this?

I'm just trying to do this in VS Code. The contents of the .txt file are:

{"food": "chocolate", "eaten", true}
{"food": "hamburger", "eaten", false}
{"food": "peanuts", "eaten", true}
{"food": "potato", "eaten", true}

I tried just iterating over the file path argument, but that didn't work; it just returned the file path itself. I have also had no luck with any of the read-file solutions on this site.

I know that in Ruby this is easily accomplished with:

File.open("my/file/path", "r") do |f|
  f.each_line do |line|
    puts line
  end
end

But I am confused about the JavaScript solution.

  • Could you share the content of the .txt? Commented Apr 7, 2018 at 4:24
  • Please edit your question to include what you tried and what error message you got. What environment do you intend to use: node.js, browser, or something else? Commented Apr 7, 2018 at 4:25
  • Also, what is that format? It looks like JSON but it is not valid. Commented Apr 7, 2018 at 4:29
  • It's just objects from a .txt file. In the text editor they are all grey. Commented Apr 7, 2018 at 4:30
  • At least I think they're objects. Commented Apr 7, 2018 at 4:30

2 Answers

const fs = require('fs');

fs.readFile('txtFilePath', 'utf8', (err, data) => {
  if (err) throw err;
  // Rewrite the invalid "eaten", pairs as "eaten": and join the lines into one array literal
  const toValidJSON = data
    .replace(/"eaten",/g, '"eaten":')
    .replace(/\}[\r\n]+\{/g, '},{');
  const validJSON = `[${toValidJSON}]`;
  const arr = JSON.parse(validJSON);
  console.log(arr);
});

This regex fix-up is for this question's file only: it rewrites the invalid "eaten", pairs into "eaten": so that each line becomes parseable JSON.
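
If the file already contained valid JSON Lines (one well-formed object per line), a non-streaming version could simply split on newlines and parse each line. A minimal sketch, assuming a synchronous read and a file named jsons.txt:

const { readFileSync } = require('fs');

// Read the whole file, drop blank lines, and parse each remaining line as JSON
const objects = readFileSync('jsons.txt', 'utf8')
  .split(/\r?\n/)
  .filter(line => line.trim() !== '')
  .map(line => JSON.parse(line));

for (const obj of objects) {
  console.log(obj.food, obj.eaten);
}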



In Node.js, if you want a streaming approach, extend a Transform stream to parse JSON between line separators:

const { Transform } = require('stream')

module.exports = class DelimitedJSONTransform extends Transform {
  constructor ({ delimiter = '\n', encoding = 'utf8', reviver = null } = {}) {
    // readableObjectMode lets the readable side emit parsed objects instead of strings/Buffers
    super({ readableObjectMode: true })
    this._delimiter = delimiter
    this._encoding = encoding
    this._reviver = reviver
    this._buffer = ''
  }

  _transform (chunk, encoding, callback) {
    // Chunks arrive as Buffers unless an encoding was set upstream
    switch (encoding) {
    case 'buffer':
      this._buffer += chunk.toString(this._encoding)
      break
    default:
      this._buffer += chunk
      break
    }

    const lines = this._buffer.split(this._delimiter)
    // The last element may be a partial line; keep it buffered until more data arrives
    const latest = lines.pop()

    try {
      while (lines.length > 0) {
        this.push(JSON.parse(lines.shift(), this._reviver))
      }

      callback()
    } catch (error) {
      callback(error)
    } finally {
      // Put the trailing partial line back into the buffer for the next chunk
      lines.push(latest)
      this._buffer = lines.join(this._delimiter)
    }
  }

  _flush (callback) {
    // Nothing buffered: still invoke the callback so the stream can finish
    if (!this._buffer.trim()) {
      return callback()
    }

    const lines = this._buffer.split(this._delimiter)

    try {
      while (lines.length > 0) {
        this.push(JSON.parse(lines.shift(), this._reviver))
      }

      callback()
    } catch (error) {
      callback(error)
    }
  }
}

Usage

const { createReadStream } = require('fs')
const DelimitedJSONTransform = require('./transform') // or whatever you named the file above

let fileStream = createReadStream('jsons.txt')
let jsonTransform = fileStream.pipe(new DelimitedJSONTransform())

jsonTransform
  .on('data', object => { console.log(object) })
  .on('error', error => { console.error(error) })
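
If you don't need a custom Transform, Node's built-in readline module can also iterate the file line by line. A minimal sketch, assuming each line is valid JSON and the file is named jsons.txt:

const { createReadStream } = require('fs')
const { createInterface } = require('readline')

const rl = createInterface({
  input: createReadStream('jsons.txt'),
  crlfDelay: Infinity // treat \r\n as a single line break
})

rl.on('line', line => {
  if (line.trim()) {
    console.log(JSON.parse(line)) // each non-empty line is parsed as its own JSON object
  }
})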

