Timeseries Data in MySQL
- IoT Readings
- Performance Metrics
- Heartbeat System
We operate on the timeseries data, chunk it into fixed-width time buckets, and run mins, averages, and maxes over each bucket.
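A minimal sketch of that bucketing in SQL, assuming a hypothetical readings table with a device_id, a recorded_at DATETIME, and a numeric value column (these names are illustrative, not from the original notes):

-- Hypothetical table: readings(device_id INT, recorded_at DATETIME, value DOUBLE)
-- Bucket readings into 5-minute windows and aggregate per device and bucket.
SELECT
  device_id,
  FROM_UNIXTIME(FLOOR(UNIX_TIMESTAMP(recorded_at) / 300) * 300) AS bucket_start,
  MIN(value) AS min_value,
  AVG(value) AS avg_value,
  MAX(value) AS max_value
FROM readings
GROUP BY device_id, bucket_start
ORDER BY device_id, bucket_start;

Grouping on the floored epoch timestamp puts each row in exactly one fixed-width window, so the MIN/AVG/MAX roll up per device per bucket.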
// Convert a base64 string into a Blob by decoding it in fixed-size slices.
const b64toBlob = (b64Data, contentType = "", sliceSize = 512) => {
  const byteCharacters = atob(b64Data);
  const byteArrays = [];
  for (let offset = 0; offset < byteCharacters.length; offset += sliceSize) {
    const slice = byteCharacters.slice(offset, offset + sliceSize);
    const byteNumbers = new Array(slice.length);
    for (let i = 0; i < slice.length; i++) {
      byteNumbers[i] = slice.charCodeAt(i);
    }
    byteArrays.push(new Uint8Array(byteNumbers));
  }
  return new Blob(byteArrays, { type: contentType });
};
import axios from "axios";
import { v4 as uuid } from "uuid";

const API_URL = `${process.env.REACT_APP_URL}/`;

const Axios = axios.create({
  baseURL: API_URL,
});

// API call interceptor to add the token to the request header.
// Assumption: the token is read from localStorage; adjust to wherever your app stores it.
Axios.interceptors.request.use((config) => {
  const token = localStorage.getItem("token");
  if (token) config.headers.Authorization = `Bearer ${token}`;
  return config;
});
-- If you know how to convert recursive code to iterative code with its own stack, | |
-- then you understand recursion and can answer simple interview questions about it. | |
-- MySQL 8 and above
CREATE TABLE nodes (
  node INT NOT NULL,
  parent INT,
  -- add any other columns used in the query here
  PRIMARY KEY (node),
  FOREIGN KEY (parent) REFERENCES nodes(node)
);
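A minimal sketch of walking this table with a recursive CTE (the MySQL 8+ feature the note refers to), assuming root rows are the ones whose parent is NULL:

-- Walk the tree from the roots down, tracking each node's depth.
WITH RECURSIVE tree AS (
  SELECT node, parent, 0 AS depth
  FROM nodes
  WHERE parent IS NULL
  UNION ALL
  SELECT n.node, n.parent, t.depth + 1
  FROM nodes AS n
  JOIN tree AS t ON n.parent = t.node
)
SELECT node, parent, depth
FROM tree
ORDER BY depth, node;

MySQL evaluates the recursive member iteratively, re-running it against the rows produced by the previous pass, which is the same recursion-to-iteration idea mentioned in the comment above.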
This walks you through making your last 4 commits look like freshly made commits, with new timestamps and commit hashes.
Step 1: Start Interactive Rebase
git rebase -i HEAD~4
In the editor that opens, change every pick to edit, like so:
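(The hashes and messages below are placeholders; your rebase todo will list your own four commits, oldest at the top.)

# Placeholder hashes and messages, shown for illustration only.
edit a1b2c3d First of the four commits (oldest)
edit b2c3d4e Second commit
edit c3d4e5f Third commit
edit d4e5f6a Fourth commit (most recent)

Marking each line as edit makes the rebase stop after applying each commit, so every one of them can be re-committed with a fresh timestamp and hash.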