#!/usr/bin/env node

// Reads JSON from stdin and writes equivalent
// nicely-formatted JSON to stdout.

var stdin = process.stdin,
    stdout = process.stdout,
    inputChunks = [];

stdin.resume();
stdin.setEncoding('utf8');

stdin.on('data', function (chunk) {
    inputChunks.push(chunk);
});

stdin.on('end', function () {
    var inputJSON = inputChunks.join(),
        parsedData = JSON.parse(inputJSON),
        outputJSON = JSON.stringify(parsedData, null, ' ');
    stdout.write(outputJSON);
    stdout.write('\n');
});
Array.prototype.join() with no arguments defaults to a comma separator, which corrupts the input. inputChunks.join("") works.
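A standalone sketch (not part of the gist) showing the default separator at work:

```javascript
// Array.prototype.join() with no argument defaults to ",", so JSON split
// across chunks gains a stray comma at every chunk boundary.
const chunks = ['{"a": 1', ', "b": 2}'];

console.log(chunks.join());   // '{"a": 1,, "b": 2}' -- no longer valid JSON
console.log(chunks.join('')); // '{"a": 1, "b": 2}'  -- parses fine

JSON.parse(chunks.join(''));  // ok
// JSON.parse(chunks.join()); // would throw a SyntaxError
```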
Perfect. Exactly what I was looking for
Thanks for this gist. I learned this while studying it:
process.stdin.resume()
should not be used in versions of Node greater than or equal to v0.10. Comments about this from http://nodejs.org/api/process.html#process_process_stdin :
As a Stream, process.stdin can also be used in "old" mode that is compatible with scripts written for node prior v0.10. For more information see Stream compatibility.
In "old" Streams mode the stdin stream is paused by default, so one must call process.stdin.resume() to read from it. Note also that calling process.stdin.resume() itself would switch stream to "old" mode.
If you are starting a new project you should prefer a more recent "new" Streams mode over "old" one.
Make inputChunks a string.
stdin.on('data', function (chunk) {
inputChunks += chunk;
});
Then you don't need to join.
WARNING: it inserts "," between the chunks. To fix this, replace
var inputJSON = inputChunks.join(),
with
var inputJSON = inputChunks.join(""),
@Roam-Cooper, I think the join approach is better as it potentially optimizes the concatenation.
Using the double quotes in the join()
also fixed this error I was getting:
'SyntaxError: Unexpected token , in JSON at position 65536'
Node.js's native readline utility can be helpful for splitting the input into lines if you are streaming line by line.
Thanks for the gist 💪.
Here's a terser version that uses some of the included fixes:
let inputJson = ''

process.stdin.setEncoding('utf8')

process.stdin.on('data', function (chunk) {
  inputJson += chunk
})

process.stdin.on('end', function () {
  const parsedData = JSON.parse(inputJson)
  process.stdout.write(JSON.stringify(parsedData))
})
Yet another updated and terser version, ready to edit and run!

function readStdin() {
  return new Promise(resolve => {
    let buf = ''
    process.stdin.setEncoding('utf8')
    process.stdin.on('data', chunk => (buf += chunk))
    process.stdin.on('end', () => resolve(buf))
  })
}

async function main() {
  const d = JSON.parse(await readStdin())
  // do stuff with d, then...
  console.log(JSON.stringify(d, null, 2))
}

main()
Here's a one-liner that does the same thing (at least in Mac and Linux land; not sure about Windows):
node -p 'JSON.stringify(JSON.parse(fs.readFileSync(0)),null,2)'
Notably, JSON.parse can work off of a buffer, and fs.readFileSync(0) reads the zero file descriptor, which is standard input. Then, node -p is a way to execute a statement and log its result. You could also write it with node -e 'console.log(...)' if you would rather be in control of when or how the logging happens.
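Put together, both one-liner forms look like this (assuming a POSIX shell with node on the PATH):

```shell
# -p prints the value of the expression, so the pretty JSON goes to stdout:
echo '{"a":1,"b":[2,3]}' | node -p 'JSON.stringify(JSON.parse(fs.readFileSync(0)), null, 2)'

# Equivalent -e form, where you do the logging yourself:
echo '{"a":1,"b":[2,3]}' | node -e 'console.log(JSON.stringify(JSON.parse(require("fs").readFileSync(0)), null, 2))'
```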
See An OS X Service for Reformatting JSON Text for more info.