Voronoï playground: animating the addition/removal of data
license: mit
border: no

This block experiments with animating the addition/removal of data in a Voronoï map.

This experiment led to the d3-voronoi-map-tween plugin. A more recent bl.ocks uses this plugin.

In the starting Voronoï map, red cells correspond to removed/exiting data (i.e. cells not appearing in the ending Voronoï map). In the ending Voronoï map, green cells correspond to added/entering data (i.e. cells not existing in the starting Voronoï map). Blue cells correspond to updated data, i.e. cells existing in both the starting and ending Voronoï maps. Areas of blue cells evolve because the corresponding data values evolve.
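As an illustration (not part of the block's code), this enter/update/exit grouping boils down to joining the two data sets on a key. The sketch below assumes each datum carries an `index` field used as its key, as in this block's script; the helper name `classifyByKey` is hypothetical.

```js
// Minimal sketch (assumption: each datum exposes an 'index' used as its key, as in this block).
var key = function (d) { return d.index; };

function classifyByKey(startingData, endingData) {
  var startingKeys = new Set(startingData.map(key));
  var endingKeys = new Set(endingData.map(key));
  return {
    exiting: startingData.filter(function (d) { return !endingKeys.has(key(d)); }),  // red cells
    entering: endingData.filter(function (d) { return !startingKeys.has(key(d)); }), // green cells
    updated: startingData.filter(function (d) { return endingKeys.has(key(d)); })    // blue cells
  };
}
```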

show internals gives some visual explanations of what is going on. It displays the state of each site (see below to understand what sites are and what they are used for). The radius of each site encodes the corresponding datum's value (as the cell areas do). In the starting Voronoï map, a disk shows the starting value of a datum; in the ending Voronoï map, it shows the ending value of the datum; in an intermediate Voronoï map, it shows the interpolated value between the starting and ending values.
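For reference, such a radius encoding is typically done with a square-root scale, so that the disk area (like the cell area) stays proportional to the encoded value. A minimal sketch, assuming d3 is loaded and `bigValue` is the largest value to encode (as in this block's script):

```js
// Minimal sketch: encode a datum's value as a disk radius (area-proportional).
var siteRadiusScale = d3.scaleSqrt()
  .domain([0, bigValue]) // bigValue: the largest value to encode
  .range([0, 10]);       // radius in pixels

// usage: circle.attr('r', siteRadiusScale(d.interpolatedValue))
```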

The algorithm is the following:

  1. compute the starting Voronoï map of the starting data set; it requires some (undisplayed) iterations, and the final tessellation allows retrieving the starting sites' coords and weights; those starting sites allow recomputing the starting Voronoï map in a single iteration using the d3-weighted-voronoi plugin
  2. (quite similar to 1.) compute the ending Voronoï map of the ending data set; the final tessellation allows retrieving the ending sites' coords and weights; those ending sites allow recomputing the ending Voronoï map in a single iteration using the d3-weighted-voronoi plugin.
  3. compute the interpolated Voronoï tessellation (between the starting and the ending tessellations) for a particular interpolation amount (a minimal sketch is given right after this list):
     • if the interpolation amount is 0, return the starting tessellation (appearing green cells should be nullified)
     • else, if the interpolation amount is 1, return the ending tessellation (disappearing red cells should be nullified)
     • else:
       1. interpolate each site's coords and weight (a simple LERP function is used); be careful with disappearing and appearing cells
       2. compute the interpolated Voronoï map, using d3-weighted-voronoi with these interpolated coords and weights
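A minimal sketch of step 3, under the assumption that the starting and ending sites have already been matched by key into pairs of coords and weights; the `sitePairs` structure and the `interpolatedVoronoiMap` name are illustrative, not the block's actual code. It relies on the d3-weighted-voronoi API used later in this block.

```js
// Illustrative sketch of step 3 (names 'sitePairs' and 'interpolatedVoronoiMap' are hypothetical).
// Each pair holds the starting/ending coords and weights of one site, matched by key beforehand.
function lerp(a, b, t) { return (1 - t) * a + t * b; }

function interpolatedVoronoiMap(sitePairs, clippingPolygon, t) {
  // [3.1] interpolate each site's coords and weight
  var interpolatedSites = sitePairs.map(function (p) {
    return {
      interpolatedX: lerp(p.startingX, p.endingX, t),
      interpolatedY: lerp(p.startingY, p.endingY, t),
      interpolatedWeight: lerp(p.startingWeight, p.endingWeight, t)
    };
  });
  // [3.2] recompute the Voronoï map in a single pass with d3-weighted-voronoi
  var weightedVoronoi = d3.weightedVoronoi()
    .x(function (d) { return d.interpolatedX; })
    .y(function (d) { return d.interpolatedY; })
    .weight(function (d) { return d.interpolatedWeight; })
    .clip(clippingPolygon);
  return weightedVoronoi(interpolatedSites);
}
```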

User interactions:

  • use the slider to see the intermediate Voronoï maps, from the starting one (at left) to the ending one (at right).
  • show internals gives some visual explanations on what is going on, by displaying the status of each site.

Acknowledgments to:

// code from https://github.com/Kcnarf/d3-voronoi-map/commit/b3ef6984fd508b29b84f3335c5619bf34209557d
// with 1 initialization
(function (global, factory) {
typeof exports === 'object' && typeof module !== 'undefined' ? factory(exports, require('d3-polygon'), require('d3-timer'), require('d3-dispatch'), require('d3-weighted-voronoi')) :
typeof define === 'function' && define.amd ? define(['exports', 'd3-polygon', 'd3-timer', 'd3-dispatch', 'd3-weighted-voronoi'], factory) :
(factory((global.d3 = global.d3 || {}),global.d3,global.d3,global.d3,global.d3));
}(this, function (exports,d3Polygon,d3Timer,d3Dispatch,d3WeightedVoronoi) { 'use strict';
function FlickeringMitigation () {
/////// Inputs ///////
this.growthChangesLength = DEFAULT_LENGTH;
this.totalAvailableArea = NaN;
//begin: internals
this.lastAreaError = NaN;
this.lastGrowth = NaN;
this.growthChanges = [];
this.growthChangeWeights = generateGrowthChangeWeights(this.growthChangesLength); //used to make recent changes weigh more than older changes
this.growthChangeWeightsSum = computeGrowthChangeWeightsSum(this.growthChangeWeights);
//end: internals
}
var DEFAULT_LENGTH = 10;
function direction(h0, h1) {
return (h0 >= h1)? 1 : -1;
}
function generateGrowthChangeWeights(length) {
var initialWeight = 3; // a magic number
var weightDecrement = 1; // a magic number
var minWeight = 1;
var weightedCount = initialWeight;
var growthChangeWeights = [];
for (var i=0; i<length; i++) {
growthChangeWeights.push(weightedCount);
weightedCount -= weightDecrement;
if (weightedCount<minWeight) { weightedCount = minWeight; }
}
return growthChangeWeights;
}
function computeGrowthChangeWeightsSum (growthChangeWeights) {
var growthChangeWeightsSum = 0;
for (var i=0; i<growthChangeWeights.length; i++) {
growthChangeWeightsSum += growthChangeWeights[i];
}
return growthChangeWeightsSum;
}
///////////////////////
///////// API /////////
///////////////////////
FlickeringMitigation.prototype.reset = function () {
this.lastAreaError = NaN;
this.lastGrowth = NaN;
this.growthChanges = [];
this.growthChangesLength = DEFAULT_LENGTH;
this.growthChangeWeights = generateGrowthChangeWeights(this.growthChangesLength);
this.growthChangeWeightsSum = computeGrowthChangeWeightsSum(this.growthChangeWeights);
this.totalAvailableArea = NaN;
return this;
};
FlickeringMitigation.prototype.clear = function () {
this.lastAreaError = NaN;
this.lastGrowth = NaN;
this.growthChanges = [];
return this;
};
FlickeringMitigation.prototype.length = function (_) {
if (!arguments.length) { return this.growthChangesLength; }
if (parseInt(_)>0) {
this.growthChangesLength = Math.floor(parseInt(_));
this.growthChangeWeights = generateGrowthChangeWeights(this.growthChangesLength);
this.growthChangeWeightsSum = computeGrowthChangeWeightsSum(this.growthChangeWeights);
} else {
console.warn("FlickeringMitigation.length() accepts only positive integers; unable to handle "+_);
}
return this;
};
FlickeringMitigation.prototype.totalArea = function (_) {
if (!arguments.length) { return this.totalAvailableArea; }
if (parseFloat(_)>0) {
this.totalAvailableArea = parseFloat(_);
} else {
console.warn("FlickeringMitigation.totalArea() accepts only positive numbers; unable to handle "+_);
}
return this;
};
FlickeringMitigation.prototype.add = function (areaError) {
var secondToLastAreaError, secondToLastGrowth;
secondToLastAreaError = this.lastAreaError;
this.lastAreaError = areaError;
if (!isNaN(secondToLastAreaError)) {
secondToLastGrowth = this.lastGrowth;
this.lastGrowth = direction(this.lastAreaError, secondToLastAreaError);
}
if (!isNaN(secondToLastGrowth)) {
this.growthChanges.unshift(this.lastGrowth!=secondToLastGrowth);
}
if (this.growthChanges.length>this.growthChangesLength) {
this.growthChanges.pop();
}
return this;
};
FlickeringMitigation.prototype.ratio = function () {
var weightedChangeCount = 0;
var ratio;
if (this.growthChanges.length < this.growthChangesLength) { return 0; }
if (this.lastAreaError > this.totalAvailableArea/10) { return 0; }
for(var i=0; i<this.growthChangesLength; i++) {
if (this.growthChanges[i]) {
weightedChangeCount += this.growthChangeWeights[i];
}
}
ratio = weightedChangeCount/this.growthChangeWeightsSum;
if (ratio>0) {
console.log("flickering mitigation ratio: "+Math.floor(ratio*1000)/1000);
}
return ratio;
};
function randomInitialPosition () {
//begin: internals
var clippingPolygon,
extent,
minX, maxX,
minY, maxY,
dx, dy;
//end: internals
///////////////////////
///////// API /////////
///////////////////////
function _random(d, i, arr, voronoiMapSimulation) {
var shouldUpdateInternals = false;
var x, y;
if (clippingPolygon !== voronoiMapSimulation.clip()) {
clippingPolygon = voronoiMapSimulation.clip();
extent = voronoiMapSimulation.extent();
shouldUpdateInternals = true;
}
if (shouldUpdateInternals) {
updateInternals();
}
x = minX + dx * voronoiMapSimulation.prng()();
y = minY + dy * voronoiMapSimulation.prng()();
while (!d3Polygon.polygonContains(clippingPolygon, [x, y])) {
x = minX + dx * voronoiMapSimulation.prng()();
y = minY + dy * voronoiMapSimulation.prng()();
}
return [x, y];
};
///////////////////////
/////// Private ///////
///////////////////////
function updateInternals() {
minX = extent[0][0];
maxX = extent[1][0];
minY = extent[0][1];
maxY = extent[1][1];
dx = maxX - minX;
dy = maxY - minY;
};
return _random;
};
function pie () {
//begin: internals
var startAngle = 0;
var clippingPolygon,
dataArray,
dataArrayLength,
clippingPolygonCentroid,
halfIncircleRadius,
angleBetweenData;
//end: internals
///////////////////////
///////// API /////////
///////////////////////
function _pie(d, i, arr, voronoiMapSimulation) {
var shouldUpdateInternals = false;
if (clippingPolygon !== voronoiMapSimulation.clip()) {
clippingPolygon = voronoiMapSimulation.clip();
shouldUpdateInternals |= true;
}
if (dataArray !== arr) {
dataArray = arr;
shouldUpdateInternals |= true;
}
if (shouldUpdateInternals) {
updateInternals();
}
// add some randomness to prevent colinear/cocircular points
// subtract 0.5 so that the average jitter is still zero
return [
clippingPolygonCentroid[0] + Math.cos(startAngle + i * angleBetweenData) * halfIncircleRadius + (voronoiMapSimulation.prng()() - 0.5) * 1E-3,
clippingPolygonCentroid[1] + Math.sin(startAngle + i * angleBetweenData) * halfIncircleRadius + (voronoiMapSimulation.prng()() - 0.5) * 1E-3
];
};
_pie.startAngle = function (_) {
if (!arguments.length) {
return startAngle;
}
startAngle = _;
return _pie;
};
///////////////////////
/////// Private ///////
///////////////////////
function updateInternals() {
clippingPolygonCentroid = d3Polygon.polygonCentroid(clippingPolygon);
halfIncircleRadius = computeMinDistFromEdges(clippingPolygonCentroid, clippingPolygon) / 2;
dataArrayLength = dataArray.length;
angleBetweenData = 2 * Math.PI / dataArrayLength;
};
function computeMinDistFromEdges(vertex, clippingPolygon) {
var minDistFromEdges = Infinity,
edgeIndex = 0,
edgeVertex0 = clippingPolygon[clippingPolygon.length - 1],
edgeVertex1 = clippingPolygon[edgeIndex];
var distFromCurrentEdge;
while (edgeIndex < clippingPolygon.length) {
distFromCurrentEdge = vDistance(vertex, edgeVertex0, edgeVertex1);
if (distFromCurrentEdge < minDistFromEdges) {
minDistFromEdges = distFromCurrentEdge;
}
edgeIndex++;
edgeVertex0 = edgeVertex1;
edgeVertex1 = clippingPolygon[edgeIndex];
}
return minDistFromEdges;
}
//from https://stackoverflow.com/questions/849211/shortest-distance-between-a-point-and-a-line-segment
function vDistance(vertex, edgeVertex0, edgeVertex1) {
var x = vertex[0],
y = vertex[1],
x1 = edgeVertex0[0],
y1 = edgeVertex0[1],
x2 = edgeVertex1[0],
y2 = edgeVertex1[1];
var A = x - x1,
B = y - y1,
C = x2 - x1,
D = y2 - y1;
var dot = A * C + B * D;
var len_sq = C * C + D * D;
var param = -1;
if (len_sq != 0) //in case of 0 length line
param = dot / len_sq;
var xx, yy;
if (param < 0) { // this should not arise as clippingpolygon is convex
xx = x1;
yy = y1;
} else if (param > 1) { // this should not arise as clippingpolygon is convex
xx = x2;
yy = y2;
} else {
xx = x1 + param * C;
yy = y1 + param * D;
}
var dx = x - xx;
var dy = y - yy;
return Math.sqrt(dx * dx + dy * dy);
}
return _pie;
}
function halfAverageAreaInitialWeight () {
//begin: internals
var clippingPolygon,
dataArray,
siteCount,
totalArea,
halfAverageArea;
//end: internals
///////////////////////
///////// API /////////
///////////////////////
function _halfAverageArea(d, i, arr, voronoiMapSimulation) {
var shouldUpdateInternals = false;
if (clippingPolygon !== voronoiMapSimulation.clip()) {
clippingPolygon = voronoiMapSimulation.clip();
shouldUpdateInternals |= true;
}
if (dataArray !== arr) {
dataArray = arr;
shouldUpdateInternals |= true;
}
if (shouldUpdateInternals) {
updateInternals();
}
return halfAverageArea;
};
///////////////////////
/////// Private ///////
///////////////////////
function updateInternals() {
siteCount = dataArray.length;
totalArea = d3Polygon.polygonArea(clippingPolygon);
halfAverageArea = totalArea / siteCount / 2; // half of the average cell area within the clipping polygon
}
return _halfAverageArea;
};
function voronoiMapSimulation(data) {
//begin: constants
var DEFAULT_CONVERGENCE_RATIO = 0.01;
var DEFAULT_MAX_ITERATION_COUNT = 50;
var DEFAULT_MIN_WEIGHT_RATIO = 0.01;
var DEFAULT_PRNG = Math.random;
var DEFAULT_INITIAL_POSITION = randomInitialPosition();
var DEFAULT_INITIAL_WEIGHT = halfAverageAreaInitialWeight();
var RANDOM_INITIAL_POSITION = randomInitialPosition();
var epsilon = 1;
//end: constants
/////// Inputs ///////
var weight = function(d) {
return d.weight;
}; // accessor to the weight
var convergenceRatio = DEFAULT_CONVERGENCE_RATIO; // targeted allowed error ratio; default 0.01 stops computation when cell areas error <= 1% clipping polygon's area
var maxIterationCount = DEFAULT_MAX_ITERATION_COUNT; // maximum allowed iterations; stops computation even if convergence is not reached; use a large amount for a purely convergence-based computation stop
var minWeightRatio = DEFAULT_MIN_WEIGHT_RATIO; // used to compute the minimum allowed weight; default 0.01 means 1% of max weight; handles near-zero weights, and leaves enough space for cell hovering
var prng = DEFAULT_PRNG; // pseudorandom number generator
var initialPosition = DEFAULT_INITIAL_POSITION; // accessor to the initial position; defaults to a random position inside the clipping polygon
var initialWeight = DEFAULT_INITIAL_WEIGHT; // accessor to the initial weight; defaults to the average area of the clipping polygon
//begin: internals
var weightedVoronoi = d3WeightedVoronoi.weightedVoronoi(),
flickeringMitigation = new FlickeringMitigation(),
shouldInitialize = true, // should initialize due to changes via APIs
siteCount, // number of sites
totalArea, // area of the clipping polygon
areaErrorTreshold, // targeted allowed area error (= totalArea * convergenceRatio); below this threshold, the map is considered obtained and computation stops
iterationCount, // current iteration
polygons, // current computed polygons
areaError, // current area error
converged, // true if (areaError < areaErrorTreshold)
ended; // stores if computation is ended, either if computation has converged or if it has reached the maximum allowed iteration
//end: internals
//begin: internals/simulation
var simulation,
stepper = d3Timer.timer(step),
event = d3Dispatch.dispatch('tick', 'end');
//end: internals/simulation
//begin: algorithm conf.
var handleOverweightedVariant = 1; // this option is kept for further experiments
var handleOverweighted;
//end: algorithm conf.
//begin: utils
function sqr(d) {
return Math.pow(d, 2);
}
function squaredDistance(s0, s1) {
return sqr(s1.x - s0.x) + sqr(s1.y - s0.y);
}
//end: utils
///////////////////////
///////// API /////////
///////////////////////
simulation = {
tick: tick,
restart: function() {
stepper.restart(step);
return simulation;
},
stop: function() {
stepper.stop();
return simulation;
},
weight: function(_) {
if (!arguments.length) {
return weight;
}
weight = _;
shouldInitialize = true;
return simulation;
},
convergenceRatio: function(_) {
if (!arguments.length) {
return convergenceRatio;
}
convergenceRatio = _;
shouldInitialize = true;
return simulation;
},
maxIterationCount: function(_) {
if (!arguments.length) {
return maxIterationCount;
}
maxIterationCount = _;
return simulation;
},
minWeightRatio: function(_) {
if (!arguments.length) {
return minWeightRatio;
}
minWeightRatio = _;
shouldInitialize = true;
return simulation;
},
clip: function(_) {
if (!arguments.length) {
return weightedVoronoi.clip();
}
weightedVoronoi.clip(_);
shouldInitialize = true;
return simulation;
},
extent: function(_) {
if (!arguments.length) {
return weightedVoronoi.extent();
}
weightedVoronoi.extent(_);
shouldInitialize = true;
return simulation;
},
size: function(_) {
if (!arguments.length) {
return weightedVoronoi.size();
}
weightedVoronoi.size(_);
shouldInitialize = true;
return simulation;
},
prng: function(_) {
if (!arguments.length) {
return prng;
}
prng = _;
shouldInitialize = true;
return simulation;
},
initialPosition: function(_) {
if (!arguments.length) {
return initialPosition;
}
initialPosition = _;
shouldInitialize = true;
return simulation;
},
initialWeight: function(_) {
if (!arguments.length) {
return initialWeight;
}
initialWeight = _;
shouldInitialize = true;
return simulation;
},
state: function() {
return {
ended: ended,
iterationCount: iterationCount,
convergenceRatio: areaError / totalArea,
polygons: polygons
};
},
on: function(name, _) {
if (arguments.length === 1) {
return event.on(name);
}
event.on(name, _);
return simulation;
}
};
///////////////////////
/////// Private ///////
///////////////////////
//begin: simulation's main loop
function step() {
tick();
event.call('tick', simulation);
if (ended) {
stepper.stop();
event.call('end', simulation);
}
}
//end: simulation's main loop
//begin: algorithm used at each iteration
function tick() {
if (!ended) {
if (shouldInitialize) {
initializeSimulation();
}
polygons = adapt(polygons, flickeringMitigation.ratio());
iterationCount++;
areaError = computeAreaError(polygons);
flickeringMitigation.add(areaError);
converged = areaError < areaErrorTreshold;
ended = converged || iterationCount >= maxIterationCount;
// console.log("error %: "+Math.round(areaError*100*1000/totalArea)/1000);
}
}
//end: algorithm used at each iteration
function initializeSimulation() {
//begin: handle algorithm's variants
setHandleOverweighted();
//end: handle algorithm's variants
siteCount = data.length;
totalArea = Math.abs(d3Polygon.polygonArea(weightedVoronoi.clip()));
areaErrorTreshold = convergenceRatio * totalArea;
flickeringMitigation.clear().totalArea(totalArea);
iterationCount = 0;
converged = false;
polygons = initialize(data, simulation);
ended = false;
shouldInitialize = false;
}
function initialize(data, simulation) {
var maxWeight = data.reduce(function(max, d) {
return Math.max(max, weight(d));
}, -Infinity),
minAllowedWeight = maxWeight * minWeightRatio;
var weights, mapPoints;
//begin: extract weights
weights = data.map(function(d, i, arr) {
return {
index: i,
weight: Math.max(weight(d), minAllowedWeight),
initialPosition: initialPosition(d, i, arr, simulation),
initialWeight: initialWeight(d, i, arr, simulation),
originalData: d
};
});
//end: extract weights
// create map-related points
// (with targetedArea, initial position and initialWeight)
mapPoints = createMapPoints(weights, simulation);
return weightedVoronoi(mapPoints);
}
function createMapPoints(basePoints, simulation) {
var totalWeight = basePoints.reduce(function(acc, bp) {
return (acc += bp.weight);
}, 0);
var initialPosition;
return basePoints.map(function(bp, i, bps) {
initialPosition = bp.initialPosition;
if (!d3Polygon.polygonContains(weightedVoronoi.clip(), initialPosition)) {
initialPosition = DEFAULT_INITIAL_POSITION(bp, i, bps, simulation);
}
return {
index: bp.index,
targetedArea: (totalArea * bp.weight) / totalWeight,
data: bp,
x: initialPosition[0],
y: initialPosition[1],
weight: bp.initialWeight // ArlindNocaj/Voronoi-Treemap-Library uses an epsilonesque initial weight; using heavier initial weights allows faster weight adjustments, hence faster stabilization
};
});
}
function adapt(polygons, flickeringMitigationRatio) {
var adaptedMapPoints;
adaptPositions(polygons, flickeringMitigationRatio);
adaptedMapPoints = polygons.map(function(p) {
return p.site.originalObject;
});
polygons = weightedVoronoi(adaptedMapPoints);
if (polygons.length < siteCount) {
console.log('at least 1 site has no area, which is not supposed to arise');
debugger;
}
adaptWeights(polygons, flickeringMitigationRatio);
adaptedMapPoints = polygons.map(function(p) {
return p.site.originalObject;
});
polygons = weightedVoronoi(adaptedMapPoints);
if (polygons.length < siteCount) {
console.log('at least 1 site has no area, which is not supposed to arise');
debugger;
}
return polygons;
}
function adaptPositions(polygons, flickeringMitigationRatio) {
var newMapPoints = [],
flickeringInfluence = 0.5;
var flickeringMitigation, d, polygon, mapPoint, centroid, dx, dy;
flickeringMitigation = flickeringInfluence * flickeringMitigationRatio;
d = 1 - flickeringMitigation; // in [0.5, 1]
for (var i = 0; i < siteCount; i++) {
polygon = polygons[i];
mapPoint = polygon.site.originalObject;
centroid = d3Polygon.polygonCentroid(polygon);
dx = centroid[0] - mapPoint.x;
dy = centroid[1] - mapPoint.y;
//begin: handle excessive change;
dx *= d;
dy *= d;
//end: handle excessive change;
mapPoint.x += dx;
mapPoint.y += dy;
newMapPoints.push(mapPoint);
}
handleOverweighted(newMapPoints);
}
function adaptWeights(polygons, flickeringMitigationRatio) {
var newMapPoints = [],
flickeringInfluence = 0.1;
var flickeringMitigation, polygon, mapPoint, currentArea, adaptRatio, adaptedWeight;
flickeringMitigation = flickeringInfluence * flickeringMitigationRatio;
for (var i = 0; i < siteCount; i++) {
polygon = polygons[i];
mapPoint = polygon.site.originalObject;
currentArea = d3Polygon.polygonArea(polygon);
adaptRatio = mapPoint.targetedArea / currentArea;
//begin: handle excessive change;
adaptRatio = Math.max(adaptRatio, 1 - flickeringInfluence + flickeringMitigation); // in [(1-flickeringInfluence), 1]
adaptRatio = Math.min(adaptRatio, 1 + flickeringInfluence - flickeringMitigation); // in [1, (1+flickeringInfluence)]
//end: handle excessive change;
adaptedWeight = mapPoint.weight * adaptRatio;
adaptedWeight = Math.max(adaptedWeight, epsilon);
mapPoint.weight = adaptedWeight;
newMapPoints.push(mapPoint);
}
handleOverweighted(newMapPoints);
}
// heuristics: lower heavy weights
function handleOverweighted0(mapPoints) {
var fixCount = 0;
var fixApplied, tpi, tpj, weightest, lightest, sqrD, adaptedWeight;
do {
fixApplied = false;
for (var i = 0; i < siteCount; i++) {
tpi = mapPoints[i];
for (var j = i + 1; j < siteCount; j++) {
tpj = mapPoints[j];
if (tpi.weight > tpj.weight) {
weightest = tpi;
lightest = tpj;
} else {
weightest = tpj;
lightest = tpi;
}
sqrD = squaredDistance(tpi, tpj);
if (sqrD < weightest.weight - lightest.weight) {
// adaptedWeight = sqrD - epsilon; // as in ArlindNocaj/Voronoi-Treemap-Library
// adaptedWeight = sqrD + lightest.weight - epsilon; // works, but below heuristics performs better (less flickering)
adaptedWeight = sqrD + lightest.weight / 2;
adaptedWeight = Math.max(adaptedWeight, epsilon);
weightest.weight = adaptedWeight;
fixApplied = true;
fixCount++;
break;
}
}
if (fixApplied) {
break;
}
}
} while (fixApplied);
/*
if (fixCount>0) {
console.log("# fix: "+fixCount);
}
*/
}
// heuristics: increase light weights
function handleOverweighted1(mapPoints) {
var fixCount = 0;
var fixApplied, tpi, tpj, weightest, lightest, sqrD, overweight;
do {
fixApplied = false;
for (var i = 0; i < siteCount; i++) {
tpi = mapPoints[i];
for (var j = i + 1; j < siteCount; j++) {
tpj = mapPoints[j];
if (tpi.weight > tpj.weight) {
weightest = tpi;
lightest = tpj;
} else {
weightest = tpj;
lightest = tpi;
}
sqrD = squaredDistance(tpi, tpj);
if (sqrD < weightest.weight - lightest.weight) {
overweight = weightest.weight - lightest.weight - sqrD;
lightest.weight += overweight + epsilon;
fixApplied = true;
fixCount++;
break;
}
}
if (fixApplied) {
break;
}
}
} while (fixApplied);
/*
if (fixCount>0) {
console.log("# fix: "+fixCount);
}
*/
}
function computeAreaError(polygons) {
//convergence based on summation of all sites current areas
var areaErrorSum = 0;
var polygon, mapPoint, currentArea;
for (var i = 0; i < siteCount; i++) {
polygon = polygons[i];
mapPoint = polygon.site.originalObject;
currentArea = d3Polygon.polygonArea(polygon);
areaErrorSum += Math.abs(mapPoint.targetedArea - currentArea);
}
return areaErrorSum;
}
function setHandleOverweighted() {
switch (handleOverweightedVariant) {
case 0:
handleOverweighted = handleOverweighted0;
break;
case 1:
handleOverweighted = handleOverweighted1;
break;
default:
console.log("Variant of 'handleOverweighted' is unknown");
}
}
initializeSimulation();
return simulation;
}
exports.voronoiMapSimulation = voronoiMapSimulation;
exports.voronoiMapInitialPositionRandom = randomInitialPosition;
exports.voronoiMapInitialPositionPie = pie;
Object.defineProperty(exports, '__esModule', { value: true });
}));
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<title>Voronoï playground: animating the addition/removal of data on a Voronoï map</title>
<meta name="description" content="Transitioning from one Voronoï map to another with some addition and removal of data, using D3.js + d3-voronoiMap plugin + d3-weighted-voronoi plugin">
<script src="https://d3js.org/d3.v6.min.js" charset="utf-8"></script>
<script src="https://rawcdn.githack.com/Kcnarf/d3-weighted-voronoi/v1.0.1/build/d3-weighted-voronoi.js"></script>
<script src="https://rawcdn.githack.com/Kcnarf/d3-voronoi-map/v2.0.1/build/d3-voronoi-map.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/seedrandom/2.4.3/seedrandom.min.js"></script>
<style>
#wip {
display: none;
position: absolute;
top: 200px;
left: 330px;
font-size: 40px;
text-align: center;
}
.control {
position: absolute;
}
.control#control-0 {
left: 5px;
top: 5px;
text-align: center;
}
.control#control-1 {
left: 5px;
bottom: 5px;
}
.control span {
width: 100px;
}
.control input[type="range"] {
width: 210px;
}
svg {
position: absolute;
top: 25px;
left: 15px;
margin: 1px;
border-radius: 1000px;
box-shadow: 2px 2px 6px grey;
}
.seed {
fill: steelblue;
}
.seed.group-enter {
fill: lightgreen;
}
.seed.group-exit {
fill: pink;
}
.seed.hide {
display: none;
}
.cell {
fill-opacity: 0.1;
fill: lightsteelBlue;
stroke: lightsteelBlue;
}
.cell.group-enter {
fill: lightgreen;
stroke: lightgreen;
}
.cell.group-exit {
fill: pink;
stroke: pink;
}
</style>
</head>
<body>
<svg>
<g id="drawing-area">
<g id="cell-container"></g>
<g id="site-container"></g>
</g>
</svg>
<div id="control-0" class="control">
<span>Starting Voronoï map</span>
<input id="weight" type="range" name="points" min="0" max="1" value="0" step="0.01" oninput="interpolationValueUpdated()">
<span>Ending Voronoï map</span>
</div>
<div id="control-1" class="control">
<input id="weight" type="checkbox" name="showSites" onchange="siteVisibilityUpdated()">
<span>Show internals</span>
</div>
<div id="wip">
Work in progress ...
</div>
</body>
<script>
// the line below seeds the PRNG to get repeatable results; comment it out for different results on each run
Math.seedrandom('seed1');
var WITH_TRANSITION = true;
var WITHOUT_TRANSITION = false;
var duration = 250;
var _2PI = 2*Math.PI;
var cos = Math.cos;
var sin = Math.sin;
var sqrt = Math.sqrt;
var random = Math.random;
var floor = Math.floor;
//begin: layout conf.
var totalHeight = 500,
controlsHeight = 30,
svgRadius = (totalHeight-controlsHeight)/2,
svgbw = 1, //svg border width
svgHeight = 2*svgRadius,
svgWidth = 2*svgRadius,
radius = svgRadius-svgbw,
width = 2*svgRadius,
height = 2*svgRadius,
halfRadius = radius/2,
halfWidth = halfRadius,
halfHeight = halfRadius,
quarterRadius = radius/4,
quarterWidth = quarterRadius,
quarterHeight = quarterRadius,
siteOpacity = 0;
//end: layout conf.
//begin: data definition
var baseDataCount = 12,
bigDataCount = 3,
exitingDataCount = 2,
enteringDataCount = 2;
var baseValue = 10,
bigValue = 5*baseValue;
var startingData = [], // store data for the starting Voronoï map
endingData = []; // store data for the ending Voronoï map
var startingValue, endingValue;
//create the starting data set and the ending data set
for (var i=0; i<baseDataCount; i++) {
if (i < bigDataCount) {
startingValue = bigValue;
endingValue = bigValue;
} else {
startingValue = (0.5+random())*baseValue;
endingValue = (0.5+random())*baseValue;
}
startingData.push({
index: i,
value: startingValue
});
endingData.push({
index: i,
value: endingValue
})
}
//add new data to the ending data set
for (i=baseDataCount; i<baseDataCount+enteringDataCount; i++) {
endingValue = (0.5+random())*baseValue;
endingData.push({
index: i,
value: endingValue
})
}
//delete data from the ending data set
endingData = endingData.slice(exitingDataCount);
//end: data definition
//begin: utilities
var key = (d)=>d.index; //mapping between starting and ending data
var X_ACCESSOR = (d) => d.interpolatedX; // x accessor of interpolated data
var Y_ACCESSOR = (d) => d.interpolatedY; // y accessor of interpolated data
var WEIGHT_ACCESSOR = (d) => d.interpolatedWeight; // weight accessor of interpolated data
var cellLiner = d3.line()
.x(function(d){ return d[0]; })
.y(function(d){ return d[1]; });
var siteRadiusScale = d3.scaleSqrt()
.domain([0, bigValue])
.range([0,10]);
//end: utilities
//begin: reusable d3-selections
var svg = d3.select("svg"),
drawingArea = d3.select("#drawing-area"),
cellContainer = d3.select("#cell-container"),
siteContainer = d3.select("#site-container");
//end: reusable d3-selections
initLayout();
//begin: user interaction handlers
function interpolationValueUpdated() {
var interpolationValue = +d3.select("#control-0 input").node().value;
var interpolationEasing = d3.easeLinear;
// just for fun, choose the easing effect by uncommenting the adequate loc
// interpolationEasing = d3.easeSinInOut;
// interpolationEasing = d3.easeElasticInOut;
//interpolationEasing = d3.easeBounceInOut;
interpolationValue = interpolationEasing(interpolationValue);
interpolatedCells = voronoiMapInterpolator(interpolationValue);
interpolatedSites = interpolatedCells.map(function(c) {return c.site.originalObject; });
redrawCells(WITHOUT_TRANSITION);
redrawSites(WITHOUT_TRANSITION);
}
function siteVisibilityUpdated() {
siteOpacity = d3.select("#control-1 input").node().checked? 1:0;
redrawSites();
}
//end: user interaction handlers
//begin: voronoi stuff definitions
var clippingPolygon = computeClippingPolygon();
var startingSimulation = computeVoronoiPolygons(startingData, null);
var startingPolygons = startingSimulation.state().polygons;
var endingSimulation = computeVoronoiPolygons(endingData, startingPolygons)
var endingPolygons = endingSimulation.state().polygons;
var voronoiMapInterpolator = buildVoronoiMapInterpolator();
var weightedVoronoi = d3.weightedVoronoi()
.x(X_ACCESSOR)
.y(Y_ACCESSOR)
.weight(WEIGHT_ACCESSOR)
.clip(clippingPolygon);
var interpolatedSites = []; // store interpolated sites
var interpolatedCells = []; // store cells
//end: voronoi stuff definitions
interpolatedCells = voronoiMapInterpolator(0);
interpolatedSites = interpolatedCells.map(function(c) {return c.site.originalObject; });
redrawCells(WITHOUT_TRANSITION);
redrawSites(WITHOUT_TRANSITION);
/***************/
/* Computation */
/***************/
// returns a function that is the interpolator, taking starting and ending tessellations as inputs
function buildVoronoiMapInterpolator() {
var startingSites = startingPolygons.map((p)=>p.site),
endingSites = endingPolygons.map((p)=>p.site);
var startingSiteByKey = {},
endingSiteByKey = {},
allSiteKeys = new Set();
var k;
startingSites.forEach((s)=>{
k = key(s.originalObject.data.originalData);
startingSiteByKey[k]=s;
allSiteKeys.add(k)
});
endingSites.forEach((s)=>{
k = key(s.originalObject.data.originalData);
endingSiteByKey[k]=s;
allSiteKeys.add(k);
});
var siteTweeningData = [];
var startingSite, startingData, startingX, startingY, startingWeight, startingValue,
endingSite, endingData, endingX, endingY, endingWeight, endingValue,
tweenType;
//find correspondance between starting and ending cells/sites/data; handle entering and exiting cells
allSiteKeys.forEach((k)=>{
startingSite = startingSiteByKey[k];
endingSite = endingSiteByKey[k];
if (startingSite && endingSite) {
// a startingSite and an endingSite corresponding to the same datum
startingData = startingSite.originalObject.data.originalData;
endingData = endingSite.originalObject.data.originalData;
startingX = startingSite.x;
endingX = endingSite.x;
startingY = startingSite.y;
endingY = endingSite.y;
startingWeight = startingSite.weight;
endingWeight = endingSite.weight;
startingValue = startingData.value;
endingValue = endingData.value;
tweenType = 'update';
} else if (endingSite) {
// no startingSite, i.e. datum not in starting sites
// no coords interpolation (site fixed to ending position), weight interpolated FROM underweighted weight, and value interpolated FROM 0
startingData = null;
endingData = endingSite.originalObject.data.originalData;
startingX = endingSite.x;
endingX = endingSite.x;
startingY = endingSite.y;
endingY = endingSite.y;
startingWeight = computeUnderweight(endingSite, startingPolygons);
endingWeight = endingSite.weight;
startingValue = 0;
endingValue = endingData.value;
tweenType = 'enter';
} else {
//no endingSite, i.e. datum not in ending sites
//no coords interpolation (site fixed to starting position), weight interpolated TO underweighted weight, and data interpolated TO 0
startingData = startingSite.originalObject.data.originalData;
endingData = null;
startingX = startingSite.x;
endingX = startingSite.x;
startingY = startingSite.y;
endingY = startingSite.y;
startingWeight = startingSite.weight;
endingWeight = computeUnderweight(startingSite, endingPolygons);
startingValue = startingData.value;
endingValue = 0;
tweenType = 'exit';
}
siteTweeningData.push({
startingData: startingData,
endingData: endingData,
key: k,
startingX: startingX,
endingX: endingX,
startingY: startingY,
endingY: endingY,
startingWeight: startingWeight,
endingWeight: endingWeight,
startingValue: startingValue,
endingValue: endingValue,
tweenType: tweenType,
})
});
// Produces a Voronoï tessellation inbetween a starting tessellation and an ending tessellation.
// Currently uses a LERP interpolation. Param 'interpolationValue' gives the interpolation amount: 0 -> starting tessellation, 1 -> ending tessellation
return function voronoiTesselationInterpolator(interpolationValue) {
var iv = interpolationValue; // just a smaller identifier
// [STEP 1] interpolate each coords and weights
interpolatedSites = siteTweeningData.map((std)=>{
return {
key: std.key,
interpolatedX: lerp(std.startingX, std.endingX, iv),
interpolatedY: lerp(std.startingY, std.endingY, iv),
interpolatedWeight: lerp(std.startingWeight, std.endingWeight, iv),
interpolatedValue: lerp(std.startingValue, std.endingValue, iv),
tweenType: std.tweenType
};
});
// [STEP 2] use d3-weighted-voronoi to compute the interpolated tessellation
return weightedVoronoi(interpolatedSites);
}
}
// linear interpolation between a starting value and an ending value
function lerp(startingValue, endingValue, interpolationValue) {
var iv = interpolationValue, // just a smaller identifier
sv = startingValue,
ev = endingValue;
return (1-iv)*sv + iv*ev;
}
// When interpolating, all sites/data (entering, updated, exiting) are mapped to an interpolated site in order to produce an interpolated Voronoï tessellation; when the interpolation value is 0, we want the interpolated tessellation to look like the starting tessellation (even with the added entering sites/data), and we don't want the entering sites/data to produce a cell; in the same way, when the interpolation value is 1, we want the interpolated tessellation to look like the ending tessellation (even with the exiting sites/data), and we don't want the exiting sites/data to produce a cell.
// Using a default value (such as 0) doesn't ensure this desired behavior (a site with weight 0 can still produce a cell).
// Using a very low default value (such as -1000) does the trick for the first/starting and last/ending tessellations, BUT the interpolated weights during the animation may look weird because entering and exiting sites/data appear/disappear too quickly.
// So the function below returns an underweighted weight so that the entering (or exiting) site/data is completely overweighted by the starting sites (or ending sites).
// Algo:
// [STEP 1] find the starting cell where the entering/exiting site/data comes in/out
// [STEP 2] compute the underweighted weight (depending on the farthest vertex from the polygon's site and the polygon's site's weight)
function computeUnderweight(site, polygons) {
var polygon = null;
// [STEP 1] find the starting cell where the entering site/data comes in
polygons.forEach(p => {
if (!polygon) {
if (d3.polygonContains(p, [site.x, site.y])) {
polygon = p;
}
}
})
// [STEP 2] compute the underweighted weight (depending on the farthest vertex from the polygon's site and the polygon's site's weight)
var pSite = polygon.site,
squaredFarestDistance = -Infinity;
var squaredD;
polygon.forEach(v=> {
squaredD = (pSite.x-v[0])**2 + (pSite.y-v[1])**2;
if (squaredD > squaredFarestDistance) {
squaredFarestDistance = squaredD;
}
})
var underweight = - squaredFarestDistance + pSite.weight ;
return underweight;
}
//uses d3-voronoi-map to compute a Voronoï map where each cell's area encodes a particular datum's value.
//Param 'previousPolygons' allows reusing the coords and weights of a previously computed Voronoï tessellation, so that updated data produce cells in the same region.
function computeVoronoiPolygons(data, previousPolygons) {
var simulation, k;
if (previousPolygons) {
var previousSites = previousPolygons.map(d=>d.site),
previousSiteByKey = {},
previousTotalWeight = 0;
previousSites.forEach((s)=>{
k = key(s.originalObject.data.originalData);
previousSiteByKey[k]=s;
previousTotalWeight += s.weight;
});
var previousAverageWeight = previousTotalWeight/previousSites.length;
var initialPositioner = function(d) {
var previousSite = previousSiteByKey[key(d)];
if (previousSite) {
return [previousSite.x, previousSite.y];
} else {
//return nearClippingCirclePerimeter();
return nearAnyPreviousPartitioningVertex(previousPolygons);
}
}
var initialWeighter = function(d) {
var previousSite = previousSiteByKey[key(d)];
if (previousSite) {
return previousSite.weight;
} else {
return previousAverageWeight;
}
}
simulation = d3.voronoiMapSimulation(data)
.clip(clippingPolygon)
.weight((d)=>d.value)
.initialPosition(initialPositioner)
.initialWeight(initialWeighter)
.stop();
} else {
simulation = d3.voronoiMapSimulation(data)
.clip(clippingPolygon)
.weight((d)=>d.value)
.stop();
}
var state = simulation.state(); // retrieve the simulation's state, i.e. {ended, polygons, iterationCount, convergenceRatio}
//begin: manually launch each iteration until the simulation ends
while (!state.ended) {
simulation.tick();
state = simulation.state();
}
//end:manually launch each iteration until the simulation ends
return simulation;
}
// return a position near circle's perimeter, at random angle
// /!\ DON'T USE IT; not that good a policy, because the returned coords can land inside a big cell (i.e. inside a cell corresponding to a big value to encode), so that the resulting site may be overweighted, which leads to it not producing any cell (error 'Cannot read property 'site' of undefined' in the 'adaptPositions' function of d3-voronoi-map)
// prefer the next policy
function nearClippingCirclePerimeter() {
var angle = _2PI*random(),
d = (radius-10); // -10 ensure the new point is inside the clipping circle
var x = radius + d*cos(angle),
y = radius + d*sin(angle);
var xRandomness = (random()-0.5), // -0.5 for a central distribution
yRandomness = (random()-0.5);
var coords = [x + xRandomness, y + yRandomness]; // raise an error without randomness, don't understand why ...
// begin: debug: display position of added sites (i.e. added data)
// siteContainer.append('circle').attr('r', 3).attr('cx', coords[1]).attr('cy', coords[1]).attr('fill', 'red');
// end: debug
return coords;
}
// return a position corresponding to a vertex separating 2 cells (not a vertex of a border cell due to the clipping polygon)
function nearAnyPreviousPartitioningVertex(previousPolygons) {
var vertexNearClippingPolygon = true;
var i, previouscell, previousVertex;
// begin: redo until the chosen vertex is not a vertex of the clipping polygon
while (vertexNearClippingPolygon) {
// pick a random previous cell
i = floor(previousPolygons.length*random());
previouscell = previousPolygons[i];
// pick a random vertex
i = floor(previouscell.length*random());
previousVertex = previouscell[i];
vertexNearClippingPolygon = nearAClippingPolygonVertex(previousVertex);
}
// end: redo until the chosen vertex is not a vertex of the clipping polygon
// add some randomness in case the chosen vertex is picked several times due to several additions of data, checking that the coords are still in the clipping polygon
var coordsInClippingPolygon = false;
var xRandomness, yRandomness, coords;
while (!coordsInClippingPolygon) {
xRandomness = random()-0.5; // -0.5 for a central distribution
yRandomness = random()-0.5;
coords = [previousVertex[0]+xRandomness, previousVertex[1]+yRandomness];
coordsInClippingPolygon = d3.polygonContains(clippingPolygon, coords);
}
// begin: debug: display position of added sites (i.e. added data)
// siteContainer.append('circle').attr('r', 3).attr('cx', coords[0]).attr('cy', coords[1]).attr('fill', 'red');
// end: debug
return coords;
}
function nearAClippingPolygonVertex (v) {
var near = 1;
var dx, dy, d;
var isVertexOfClippingPolygon = false;
clippingPolygon.forEach(cv=>{
if (!isVertexOfClippingPolygon) {
dx = v[0] - cv[0];
dy = v[1] - cv[1];
d = sqrt(dx**2+dy**2);
isVertexOfClippingPolygon = d<near;
}
})
return isVertexOfClippingPolygon;
}
function computeClippingPolygon() {
var circlingPolygon = [];
for (var a=0; a<_2PI; a+=_2PI/60) {
circlingPolygon.push(
[radius + (radius)*cos(a), radius + (radius)*sin(a)]
)
}
return circlingPolygon;
};
/***********/
/* Drawing */
/***********/
function initLayout () {
svg.attr("width", svgWidth)
.attr("height", svgHeight);
drawingArea.attr("width", width)
.attr("height", height)
.attr("transform", "translate("+[svgbw, svgbw]+")");
}
function redrawSites() {
var siteSelection = siteContainer.selectAll(".seed")
.data(interpolatedSites, function(s){ return s.key; });
siteSelection
.enter()
.append("circle")
.attr("class", function(d){ return "group-"+d.tweenType; })
.classed("seed", true)
.merge(siteSelection)
.attr("r", (d)=> siteRadiusScale(d.interpolatedValue))
.attr("opacity", siteOpacity)
.attr("transform", (d)=>{ return "translate("+[d.interpolatedX,d.interpolatedY]+")"; });
siteSelection.exit().remove();
}
function redrawCells(withTransition) {
var cellSelection = cellContainer.selectAll(".cell")
.data(interpolatedCells, function(c){ return c.site.originalObject.key; });
cellSelection.enter()
.append("path")
.attr("class", function(d){ return "group-"+d.site.originalObject.tweenType; })
.classed("cell", true)
.attr("id", function(d,i){ return "cell-"+d.site.originalObject.key; })
.merge(cellSelection)
.transition()
.duration(withTransition? duration : 0)
.attr("d", function(d){ return cellLiner(d)+"z"; });
cellSelection.exit().remove();
}
</script>
</html>