A simple demo app that uses Node.js streams to ingest a CSV data file, transform it to NDJSON or JSON, perform additional transforms on the data itself, and then write the results to a file.
This app uses the Planetary Systems Composite table from the NASA Exoplanet Archive. The data are stored in the Planetary_Systems_Complete.csv file in this repo.
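For orientation, the sketch below shows the general shape of such a stream pipeline using only built-in Node.js modules (fs and stream). It is not this app's actual code: the comma-split parsing is a naive stand-in that ignores quoted fields, and the file names are illustrative only.

```js
// Minimal sketch of a CSV -> NDJSON stream pipeline (NOT the app's implementation).
// Assumptions: first row is the header, fields contain no quoted commas.
const fs = require('fs');
const { Transform, pipeline } = require('stream');

// Transform stream: buffer incoming chunks, split them into CSV lines,
// and emit one JSON object per data row (NDJSON).
function csvToNdjson() {
  let header = null;
  let remainder = '';
  return new Transform({
    transform(chunk, _encoding, callback) {
      const lines = (remainder + chunk.toString('utf8')).split(/\r?\n/);
      remainder = lines.pop(); // last element may be an incomplete line
      for (const line of lines) {
        if (!line.trim()) continue;
        const cells = line.split(',');
        if (!header) { header = cells; continue; } // first row is the header
        const row = Object.fromEntries(header.map((h, i) => [h, cells[i]]));
        this.push(JSON.stringify(row) + '\n');
      }
      callback();
    },
    flush(callback) {
      // Emit any trailing row that had no final newline.
      if (header && remainder.trim()) {
        const cells = remainder.split(',');
        this.push(JSON.stringify(Object.fromEntries(header.map((h, i) => [h, cells[i]]))) + '\n');
      }
      callback();
    },
  });
}

pipeline(
  fs.createReadStream('planetary_systems_complete.csv'), // illustrative input path
  csvToNdjson(),
  fs.createWriteStream('output.ndjson'),                 // illustrative output path
  (err) => { if (err) console.error('Pipeline failed:', err); }
);
```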
Some notes on my thought process and the generally meandering path I followed while creating this hack
git clone git@github.com:BidnessForB/node-etl.git
npm install
node app.js <options>
-i, --inputFile string          Input CSV file path (default: planetary_systems_complete.csv)
-o, --outputFile string         Output file path (default: output.ndjson or output.json, matching the output format)
-f, --outputFormat string       Output data format: 'json' or 'ndjson' (default: 'ndjson')
-m, --missingDataToken string   Replace missing data with this token (default: MISSING)
-c, --checkTypes string         Check data types (default: true)
-e, --errorToken string         Value for calculated fields with errors (default: ERROR)
-h, --help string               Output this usage guide
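An illustrative example invocation using the options above (not taken from the app's own documentation):

node app.js -i planetary_systems_complete.csv -o output.json -f json -m MISSING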