Two prompt specs for an interactive React app that turns images into music: one is a detailed engineering spec, the other is a concise, user-level feature brief.
Build an Audio–Visual Sonification App (replicate this spec exactly)
Create a single React component (functional, hooks) named SonificationApp that I can import and render. It must match the behavior and structure below precisely.
Tech + Structure
Use React with hooks (useState, useEffect, useRef, useMemo).
Export the component as the default export.
Use Tailwind-style utility classes for styling exactly as referenced below (no CSS files).
No external state managers or audio libs; use Web Audio API directly.
Component layout: Header, Main (Visualizer + Control Panel), Footer.
UI Layout (must match)
Header: title “Audio–Visual Sonification”, subtle subtitle “Blocks scan • Rich synthesis”, and 3 controls:
Play/Stop button.
Upload Image control: an <input type="file" accept="image/*">.
Show/Hide Diagnostics toggle button.
Main area → grid with two sections:
Visualizer card: waveform canvas (top) + Image Map (bottom) with a Height slider that changes the image-map height in pixels.
Control Panel card: controls in grouped sections (Core, Tone, Modulation, Filter & Reverb, Blocks) plus a Presets dropdown at the top.
Footer: two short helper lines (“Upload an image…”, “Tip: Larger blocks…”).
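A minimal JSX skeleton of this layout, as a sketch only: the Tailwind class names, handler names, and state names here are placeholders rather than the exact required markup, and the card and footer contents are stubbed with comments.

```jsx
import React, { useState } from "react";

// Layout sketch only: class names, handlers, and state are illustrative
// placeholders, not the exact required markup.
export default function SonificationApp() {
  const [playing, setPlaying] = useState(false);
  const [showDiag, setShowDiag] = useState(false);
  return (
    <div className="min-h-screen bg-slate-950 text-slate-100 flex flex-col">
      <header className="flex items-center justify-between p-4">
        <div>
          <h1 className="text-xl font-semibold">Audio–Visual Sonification</h1>
          <p className="text-sm text-slate-400">Blocks scan • Rich synthesis</p>
        </div>
        <div className="flex gap-2">
          <button onClick={() => setPlaying((p) => !p)}>
            {playing ? "Stop" : "Play"}
          </button>
          <label className="cursor-pointer">
            Upload Image
            <input type="file" accept="image/*" className="hidden" />
          </label>
          <button onClick={() => setShowDiag((d) => !d)}>
            {showDiag ? "Hide" : "Show"} Diagnostics
          </button>
        </div>
      </header>
      <main className="grid flex-1 gap-4 p-4 md:grid-cols-2">
        <section>{/* Visualizer card: waveform canvas, Image Map, Height slider */}</section>
        <section>{/* Control Panel card: Presets dropdown + grouped controls */}</section>
      </main>
      <footer className="p-4 text-xs text-slate-500">{/* two helper lines */}</footer>
    </div>
  );
}
```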
Visualizer
Waveform canvas rendering from an AnalyserNode time-domain buffer (line plot).
Background gradient (#0f172a → #111827), thin grid lines every 24px.
Runs on requestAnimationFrame.
Image Map canvas directly below: draws the processed (downsampled) image aspect-fit with image smoothing disabled, plus a cyan stroke rectangle indicating the current block (step) being sonified.
The marker MUST draw even when paused (based on the last stepRef index).
The Image Map height slider (120–520px) must update the backing store and repaint immediately even when not playing.
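A minimal sketch of the waveform draw loop described above. The ref names (waveCanvasRef, analyserRef, rafRef) are illustrative, and drawImageMap is the function specified further down.

```js
// Sketch of the visualizer loop; ref names are illustrative, not mandated.
const drawWaveform = () => {
  const canvas = waveCanvasRef.current;
  const analyser = analyserRef.current;
  if (canvas && analyser) {
    const ctx = canvas.getContext("2d");
    const { width, height } = canvas;

    // Background gradient #0f172a -> #111827.
    const grad = ctx.createLinearGradient(0, 0, 0, height);
    grad.addColorStop(0, "#0f172a");
    grad.addColorStop(1, "#111827");
    ctx.fillStyle = grad;
    ctx.fillRect(0, 0, width, height);

    // Thin grid lines every 24px.
    ctx.strokeStyle = "rgba(148, 163, 184, 0.15)";
    ctx.lineWidth = 1;
    for (let x = 0; x <= width; x += 24) {
      ctx.beginPath(); ctx.moveTo(x, 0); ctx.lineTo(x, height); ctx.stroke();
    }
    for (let y = 0; y <= height; y += 24) {
      ctx.beginPath(); ctx.moveTo(0, y); ctx.lineTo(width, y); ctx.stroke();
    }

    // Time-domain line plot from the AnalyserNode.
    const data = new Uint8Array(analyser.fftSize);
    analyser.getByteTimeDomainData(data);
    ctx.strokeStyle = "#22d3ee";
    ctx.beginPath();
    for (let i = 0; i < data.length; i++) {
      const x = (i / data.length) * width;
      const y = (data[i] / 255) * height;
      i === 0 ? ctx.moveTo(x, y) : ctx.lineTo(x, y);
    }
    ctx.stroke();

    drawImageMap(); // keep the block marker repainting during the loop
  }
  rafRef.current = requestAnimationFrame(drawWaveform);
};
```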
Image Handling & Block Grid
On upload, load the image into an offscreen canvas and downsample it, preserving aspect ratio, to a working size of at most 256×256.
Read pixels via getImageData(0,0,W,H) (no external libs).
Build a row-major grid of blocks using two sliders:
Block Width (px): 1–32
Block Height (px): 1–32
For each block, compute average RGB across its pixels, then derive:
Hue (h) from HSV (0–360) for scale-degree mapping.
Lightness (l) from HSL (0–100) for brightness-driven variations.
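A sketch of the downsample and block-averaging pass described above. The ref names (offCanvasRef, imgDataRef, blocksRef, colsRef, rowsRef) are illustrative, and it assumes the spec's rgbToHsv/rgbToHsl helpers return [h, s, v] and [h, s, l] arrays.

```js
// Downsample to at most 256×256, read pixels, and average each block.
const processImage = (img, blockW, blockH) => {
  const scale = Math.min(1, 256 / Math.max(img.width, img.height));
  const W = Math.max(1, Math.round(img.width * scale));
  const H = Math.max(1, Math.round(img.height * scale));

  const off = document.createElement("canvas");
  off.width = W;
  off.height = H;
  const octx = off.getContext("2d");
  octx.drawImage(img, 0, 0, W, H);
  const { data } = octx.getImageData(0, 0, W, H);

  const cols = Math.ceil(W / blockW);
  const rows = Math.ceil(H / blockH);
  const blocks = [];
  for (let by = 0; by < rows; by++) {
    for (let bx = 0; bx < cols; bx++) {
      let r = 0, g = 0, b = 0, n = 0;
      for (let y = by * blockH; y < Math.min((by + 1) * blockH, H); y++) {
        for (let x = bx * blockW; x < Math.min((bx + 1) * blockW, W); x++) {
          const i = (y * W + x) * 4;
          r += data[i]; g += data[i + 1]; b += data[i + 2]; n++;
        }
      }
      r /= n; g /= n; b /= n;
      const [h] = rgbToHsv(r, g, b);     // hue 0–360 for scale-degree mapping
      const [, , l] = rgbToHsl(r, g, b); // lightness 0–100 for brightness
      blocks.push({ bx, by, r, g, b, h, l });
    }
  }

  offCanvasRef.current = off;            // kept for drawImageMap()
  imgDataRef.current = { data, W, H };   // pixel buffer + dimensions
  blocksRef.current = blocks;
  colsRef.current = cols;
  rowsRef.current = rows;
};
```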
Wet/Dry paths with a ConvolverNode for reverb. No IR fetch: build a generated impulse response (stereo noise with exponential decay; duration and decay tied to reverbTime), as sketched below.
Bass boost via lowshelf at 200 Hz (dB from slider).
Live parameter updates during playback (no restart) using a params ref pattern to avoid stale closures.
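A sketch of the generated impulse response and the lowshelf bass boost. The node setup and example values here are assumptions for illustration, not the spec's exact wiring.

```js
// Assumed setup: a single AudioContext plus wet/dry/master gain nodes.
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
const masterGain = audioCtx.createGain();
const dryGain = audioCtx.createGain();
const wetGain = audioCtx.createGain();
masterGain.connect(audioCtx.destination);

// Stereo noise impulse response with an exponential decay;
// buffer length and decay curve both follow reverbTime (seconds).
const buildImpulse = (ctx, reverbTime) => {
  const length = Math.max(1, Math.floor(ctx.sampleRate * reverbTime));
  const impulse = ctx.createBuffer(2, length, ctx.sampleRate);
  for (let ch = 0; ch < 2; ch++) {
    const channel = impulse.getChannelData(ch);
    for (let i = 0; i < length; i++) {
      channel[i] = (Math.random() * 2 - 1) * Math.pow(1 - i / length, reverbTime);
    }
  }
  return impulse;
};

// Wet path through the convolver, dry path straight to the master gain.
const convolver = audioCtx.createConvolver();
convolver.buffer = buildImpulse(audioCtx, 2.5); // 2.5 s reverbTime, for example
dryGain.connect(masterGain);
wetGain.connect(convolver);
convolver.connect(masterGain);

// Bass boost: lowshelf at 200 Hz, gain in dB from the slider.
const bass = audioCtx.createBiquadFilter();
bass.type = "lowshelf";
bass.frequency.value = 200;
bass.gain.value = 4; // dB; driven by the Bass Boost slider in the real component
```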
Sequencer timing:
Steps per beat = speed (0.25–8).
Compute step ms = 60000 / bpm / clamp(speed, 0.25, 8), with a floor of 20 ms.
Recompute the interval timing immediately if BPM or speed changes while playing, as sketched below.
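A sketch of the retiming step, reading bpm and speed from the ref-backed params; stepTimerRef and onStep are illustrative names.

```js
// clamp is one of the spec's helpers.
const clamp = (v, lo, hi) => Math.min(hi, Math.max(lo, v));

const retime = () => {
  const { bpm, speed } = paramsRef.current;
  const stepMs = Math.max(20, 60000 / bpm / clamp(speed, 0.25, 8));
  clearInterval(stepTimerRef.current);
  stepTimerRef.current = setInterval(onStep, stepMs); // onStep sonifies one block
};
```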
Note Scheduling (per step)
Map from block:
Use scaleIntervals from the selected Scale and the calculated degree and oct.
The base note comes from the base-frequency (Hz) slider; convert with the freqToMidi/midiToFreq helpers to add the scale intervals.
Brightness (from HSL l) drives both velocity and filter cutoff tracking.
Real-time: Changing ANY control updates the sound immediately on the next scheduled step; BPM/Speed changes must retime the interval instantly.
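A sketch of the per-step pitch and dynamics mapping. The pitch helpers are the standard MIDI conversions the spec names; the degree/oct derivation and the velocity/cutoff scaling shown here are assumptions, not mandated formulas.

```js
// Standard MIDI/frequency conversions (A4 = MIDI 69 = 440 Hz).
const midiToFreq = (m) => 440 * Math.pow(2, (m - 69) / 12);
const freqToMidi = (f) => 69 + 12 * Math.log2(f / 440);

// Example inputs (illustrative values and names).
const block = { h: 210, l: 62 };               // averaged block: hue, lightness
const params = { baseHz: 220 };                // base-frequency slider value
const scaleIntervals = [0, 2, 4, 5, 7, 9, 11]; // e.g. a major scale

// Hue picks the scale degree; oct would also be derived from the block.
const degree =
  Math.floor((block.h / 360) * scaleIntervals.length) % scaleIntervals.length;
const oct = 0; // placeholder
const midi = freqToMidi(params.baseHz) + scaleIntervals[degree] + 12 * oct;
const freq = midiToFreq(midi);

// Brightness (HSL lightness 0–100) drives velocity and filter cutoff tracking.
const velocity = 0.2 + 0.8 * (block.l / 100);
const cutoffHz = 300 + (block.l / 100) * 4000; // illustrative tracking range
```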
drawImageMap(): must exist (see the sketch below) and be called:
After processing an image
On window resize
When the image-map height changes
During the visualizer loop
When paused and the grid or scale changes
When paused, changing Block Width/Height or Scale must repaint the Image Map and keep the marker index stable (no forced reset).
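A sketch of drawImageMap() consistent with the requirements above; the ref names match the earlier sketches and remain illustrative.

```js
// Aspect-fit the downsampled image, smoothing off, then stroke a cyan
// rectangle over the block indexed by stepRef (works while paused too).
const drawImageMap = () => {
  const canvas = mapCanvasRef.current;
  if (!canvas) return;
  const ctx = canvas.getContext("2d");
  ctx.imageSmoothingEnabled = false;
  ctx.fillStyle = "#0f172a";
  ctx.fillRect(0, 0, canvas.width, canvas.height);

  const img = imgDataRef.current;
  const off = offCanvasRef.current;
  if (!img || !off) return;

  // Aspect-fit the W×H working image into the canvas.
  const s = Math.min(canvas.width / img.W, canvas.height / img.H);
  const dw = img.W * s;
  const dh = img.H * s;
  const dx = (canvas.width - dw) / 2;
  const dy = (canvas.height - dh) / 2;
  ctx.drawImage(off, dx, dy, dw, dh);

  // Current-block marker, based on the last stepRef index (no reset on pause).
  const cols = colsRef.current;
  const rows = rowsRef.current;
  if (!cols || !rows) return;
  const i = stepRef.current % (cols * rows);
  const bx = i % cols;
  const by = Math.floor(i / cols);
  ctx.strokeStyle = "#22d3ee";
  ctx.lineWidth = 2;
  ctx.strokeRect(dx + (bx * dw) / cols, dy + (by * dh) / rows, dw / cols, dh / rows);
};
```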
Diagnostics panel with simple runtime checks (e.g., midiToFreq(69) ≈ 440, existence of convolver/analyser/wet-dry, blocks computed, drawImageMap defined).
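A sketch of the checks the Diagnostics panel could render; the object shape and ref names are illustrative.

```js
const diagnostics = [
  { label: "midiToFreq(69) ≈ 440", pass: Math.abs(midiToFreq(69) - 440) < 0.01 },
  {
    label: "convolver / analyser / wet-dry nodes exist",
    pass: !!(convolverRef.current && analyserRef.current &&
             wetGainRef.current && dryGainRef.current),
  },
  { label: "blocks computed", pass: blocksRef.current.length > 0 },
  { label: "drawImageMap defined", pass: typeof drawImageMap === "function" },
];
```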
Implementation Notes
Use helper functions: midiToFreq, freqToMidi, clamp, rgbToHsv, rgbToHsl.
Store image pixel buffer + dimensions in refs; compute blocks into a blocksRef array and keep colsRef/rowsRef.
Sequencer step index: stepRef, wraps around blocksRef.length (or 64 fallback when no image).
Use a ref-backed params object (paramsRef.current = {...} in an effect) to prevent stale closures.
On teardown (Stop): clear requestAnimationFrame and the step interval.
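A sketch of the ref-backed params pattern and the Stop teardown, with illustrative state and ref names.

```js
// Mirror the latest control values into a ref so the sequencer and
// visualizer callbacks never read stale state.
const paramsRef = useRef({});
useEffect(() => {
  paramsRef.current = { bpm, speed, baseHz, reverbTime, bassDb };
}, [bpm, speed, baseHz, reverbTime, bassDb]);

// Stop: cancel the draw loop and the step interval.
const stop = () => {
  cancelAnimationFrame(rafRef.current);
  clearInterval(stepTimerRef.current);
  setPlaying(false);
};
```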
Deliver only the React component source (ES module with import React, { ... } from "react" and export default function SonificationApp() { ... }). Do not include any build config, HTML shell, or external assets.