charlie roberts (charlieroberts)

View GitHub Profile
@charlieroberts
charlieroberts / genish.sine.demo.html
Last active August 20, 2016 15:22
A short demo of using genish.js in an HTML file with a scriptprocessor node.
<!--
This file demonstrates how to create an AudioContext and a ScriptProcessor node for use with genish.js, playing a sine wave.
-->
<html>
<head>
<title>genish.js ScriptProcessor Sine demo</title>
<script src='http://www.charlie-roberts.com/genish/dist/gen.lib.js'></script>
</head>
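The preview above stops at the closing </head>. A minimal sketch of what the body script might contain follows; it assumes the gen.lib.js build exposes cycle() and gen.createCallback() globally, as described in the genish.js documentation, and is an illustration rather than the gist's actual code.

```js
// sketch only: create an AudioContext, compile a 440 Hz genish sine graph,
// and fill a ScriptProcessor node's output buffer one sample at a time
var ctx  = new ( window.AudioContext || window.webkitAudioContext )(),
    node = ctx.createScriptProcessor( 1024, 0, 1 ),   // bufferSize, inputs, outputs
    sine = gen.createCallback( cycle( 440 ) )         // assumes genish globals

node.onaudioprocess = function( e ) {
  var out = e.outputBuffer.getChannelData( 0 )
  for( var i = 0; i < out.length; i++ ) out[ i ] = sine()
}

node.connect( ctx.destination )
```
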
# Prototype inheritance
Simply put, prototypical inheritance means that any object can delegate property lookups and method calls to any other object.
Prototypical inheritance is also used in Lua and AppleScript, and many other languages have add-ons / extensions to support it.
It was first used in the language *Self*.
```js
var a = { test: function() { console.log( 'I am testing' ) } }
// create a new object that uses a as its prototype
var b = Object.create( a )
b.test() // 'I am testing', the call is delegated to a
```
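A couple of extra lines, not from the original gist, showing that the delegation happens at lookup time and that assigning on the child object only shadows the prototype's property:

```js
console.log( Object.getPrototypeOf( b ) === a ) // true: b delegates to a

b.test = function() { console.log( 'shadowed' ) } // shadows a.test, does not modify it
b.test() // 'shadowed'
a.test() // 'I am testing', the prototype is untouched
```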

@charlieroberts
charlieroberts / gibberwocky.max.websocket.spec.md
Last active May 25, 2017 19:28
gibberwocky.max communication spec

gibberwocky communicates with a max4live plugin using the WebSocket protocol. In theory, any client should be able to interface with the plugin through this WebSocket interface.

Every time the Ableton timeline advances a beat, the max4live plugin sends a request to the gibberwocky client for the subsequent beat's events. For example, on beat 3 the plugin will send a request for all events that should occur during beat 4. This one beat of latency ensures that all messages arrive with plenty of time to spare. Alternative interfaces to the gibberwocky client could ignore these messages and implement their own timing strategies (using Ableton Link, MIDI clock, etc.).

Read more about the project here: http://www.charlie-roberts.com/pubs/Live_Coding_DAW.pdf

What the browser sends to Live:

Many messages below accept a beat / phase for scheduling messages in the future. If you want to simply trigger a message immediately, …

  • Global
    "get_scene": request the current LOM
    "select_parameter param_id"
    "select_track track_id": select a particular track (by id)

  • Send MIDI notes (pitch / velocity / duration) (?pitch bend etc.?) to a track's device
    "track_id add beat phase pitch N"
    "track_id add beat phase velocity N"
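As a rough illustration of the one-beat-ahead scheme described above, here is a minimal client sketch. The WebSocket address, the shape of the plugin's beat request, and the message strings are assumptions loosely based on the list above, not a verified wire format.

```js
// sketch: reply to the plugin's request for the next beat with any queued messages
var ws    = new WebSocket( 'ws://localhost:8080' ) // plugin address: assumption
var queue = {}                                     // beat number -> array of messages

ws.onmessage = function( event ) {
  // assume the plugin sends the upcoming beat number as plain text
  var beat = parseInt( event.data, 10 ),
      msgs = queue[ beat ] || []

  msgs.forEach( function( msg ) { ws.send( msg ) } )
  delete queue[ beat ]
}

// queue a hypothetical note message for track 0 on beat 4, phase 0
queue[ 4 ] = [ '0 add 4 0 pitch 60', '0 add 4 0 velocity 100' ]
```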

@charlieroberts
charlieroberts / genish_vs_waapi_benchmarks.html
Last active April 21, 2022 02:58
Benchmark comparisons of common synthesis tasks between Genish.js and the Web Audio API, for the 2017 Web Audio Conference
<!doctype html>
<html>
<head>
<!-- npm install mathjs genish.js -->
<script src='./node_modules/genish.js/dist/gen.lib.js'></script>
<script src='./node_modules/mathjs/dist/math.js'></script>
</head>
<body></body>
<script>
if( typeof OfflineAudioContext === 'undefined' ) window.OfflineAudioContext = webkitOfflineAudioContext
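The preview cuts off just after the OfflineAudioContext shim. For context, the sort of comparison such a benchmark might run is sketched below; the durations, buffer sizes, and timing approach are assumptions rather than the gist's actual tests.

```js
// time one second of a 440 Hz sine rendered offline by the Web Audio API,
// then the same amount of audio generated by a genish.js callback in a loop
var sampleRate = 44100,
    offline    = new OfflineAudioContext( 1, sampleRate, sampleRate ),
    osc        = offline.createOscillator()

osc.frequency.value = 440
osc.connect( offline.destination )
osc.start( 0 )

var t0 = performance.now()
offline.startRendering().then( function() {
  console.log( 'web audio:', performance.now() - t0, 'ms' )
})

// genish.js version, assuming cycle() and gen.createCallback() are exposed globally
var sine = gen.createCallback( cycle( 440 ) ),
    t1   = performance.now()

for( var i = 0; i < sampleRate; i++ ) sine()
console.log( 'genish:', performance.now() - t1, 'ms' )
```
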
@charlieroberts
charlieroberts / gibber-tidal.markdown
Last active September 25, 2017 00:50
A proposed API for using the Tidal pattern language in Gibber

Gibber-Tidal

This is a proposed API for using the Tidal pattern language (https://tidalcycles.org/patterns.html) in Gibber. The basic idea is to extend Gibber to support calls to .tidal() in addition to its current .seq() method. This would enable any audiovisual property or method to be controlled using Tidal patterns.

// create a synth and sequence it
// using indices into a global scale
a = Synth()
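The preview ends after the synth is created. To make the proposal above concrete, a hypothetical continuation is shown below; the .tidal() call and its pattern string are the API being proposed, not something Gibber currently supports, and the .seq() line is only a plausible example of the existing style.

```js
// hypothetical continuation, not from the original gist
a.note.seq( [ 0, 1, 2, 3 ], 1/4 )   // existing Gibber-style sequencing
a.note.tidal( '0 1 2 [3*2 5]' )     // proposed Tidal-pattern equivalent
```
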
@charlieroberts
charlieroberts / lecture7_igm330_02_webaudio.markdown
Last active September 19, 2017 03:56
lecture notes on the fundamentals of the web audio API and analysis for IGM 330.02, Fall 2017

Web Audio API

Basics

  • Two early versions: one from Mozilla (the Audio Data API) and one from Google/Chrome (what became the standard Web Audio API)

    • Do DSP in JavaScript (via per-sample callbacks) vs. do DSP using pre-built C++ nodes scripted from JavaScript
      • Chrome wins! DSP is mainly done using built-in C++ nodes, controlled from JS
  • Nodes are assembled into a graph (see the example just below). Nodes are also called …
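To make the graph idea in the notes above concrete, here is a minimal Web Audio graph (not from the lecture notes): an oscillator routed through a gain node to the speakers.

```js
// minimal node graph: oscillator -> gain -> destination
var ctx  = new ( window.AudioContext || window.webkitAudioContext )(),
    osc  = ctx.createOscillator(),
    gain = ctx.createGain()

osc.frequency.value = 330  // Hz
gain.gain.value     = 0.1  // keep the volume low

osc.connect( gain )
gain.connect( ctx.destination )
osc.start()
```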

@charlieroberts
charlieroberts / Tidal PEG.js
Last active September 24, 2017 18:37
Start of a PEG for Tidal
/*
working towards a gibber-tidal integration, a la:
https://gist.github.com/charlieroberts/445f2fb1ce3555f04cffc334c2be0b42
test here: https://pegjs.org/online
for more info on pegs and music languages:
https://worldmaking.github.io/workshop_nime_2017/editor.html
currently turns [ 3/4 2(5,8) [0 1 2 3*2] ]
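The grammar preview is cut off above. As a rough illustration of the kind of structure such a parser might produce, here is a tiny hand-rolled JavaScript sketch that turns nested bracket groups of numbers into nested arrays; it deliberately ignores fractions, repeats like 3*2, and Euclidean groups like 2(5,8), and it is not the PEG itself.

```js
// toy parser: '[0 1 [2 3]]' -> [ 0, 1, [ 2, 3 ] ]
function parseGroup( str ) {
  var tokens = str.match( /\[|\]|[^\s\[\]]+/g ), pos = 0

  function group() {
    var items = []
    while( pos < tokens.length && tokens[ pos ] !== ']' ) {
      if( tokens[ pos ] === '[' ) { pos++; items.push( group() ) }
      else items.push( parseFloat( tokens[ pos++ ] ) )
    }
    pos++ // consume the closing ']'
    return items
  }

  if( tokens[ pos ] === '[' ) pos++ // skip the outer '['
  return group()
}

console.log( parseGroup( '[0 1 [2 3]]' ) ) // [ 0, 1, [ 2, 3 ] ]
```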