I am using the fantastic jq to reshape a REST API's JSON into a CSV for upserting into another database system via its API. Once in the target system, I do some date math, including rounding timestamps to 30-minute intervals so I can group records for standard deviation calculations. The problem is that there is a lot of data, and the database chokes when it has to round every record and then compute the standard deviation for each group.
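To illustrate the rounding I mean, here is a minimal jq sketch. It assumes the timestamps are ISO 8601 UTC strings of the form `"%Y-%m-%dT%H:%M:%SZ"`; the real field names and format in Sample.json may differ.

```jq
# Round an ISO 8601 UTC timestamp down to the start of its 30-minute bucket.
# fromdateiso8601 gives epoch seconds; subtracting (epoch % 1800) snaps to :00 or :30.
def bucket30: fromdateiso8601 | . - (. % 1800) | todateiso8601;

# Example: "2024-03-07T14:47:12Z" | bucket30  =>  "2024-03-07T14:30:00Z"
```

The idea is to emit this bucketed value as an extra CSV column at export time, so the target database never has to do the rounding itself.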
- Sample.json shows a small sample of what the input data looks like; in reality it is just the stdout of the curl command in the shell script.
- example-initial.sh shows the important part of the initial shell script, which uses curl to authenticate against the source API and then uses jq to add a couple of columns and export to CSV (a rough sketch of that shape follows this list). It works like a charm.
- ACME-X1-A.csv is an example of the output CSV file that is then upserted into the target database.
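For reference, the pipeline has roughly this shape. This is only a sketch: the endpoint URLs, auth flow, and field names below are placeholders, not the real ones from example-initial.sh, and it assumes the API returns a top-level JSON array.

```sh
#!/bin/sh
# Placeholder auth step: POST credentials, pull a token out of the JSON response.
TOKEN=$(curl -s -X POST "https://api.example.com/auth" \
          -d "user=$API_USER&pass=$API_PASS" | jq -r '.token')

# Fetch the readings and convert to CSV, adding a 30-minute bucket column
# so the grouping key is precomputed before the upsert.
curl -s -H "Authorization: Bearer $TOKEN" "https://api.example.com/readings" \
  | jq -r '
      def bucket30: fromdateiso8601 | . - (. % 1800) | todateiso8601;
      (["device", "timestamp", "bucket_30min", "value"]),
      (.[] | [.device, .timestamp, (.timestamp | bucket30), .value])
      | @csv
    ' > ACME-X1-A.csv
```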