A hobbyist project exploring generative art while learning Rust and nannou. Stuff like this:
You can see more screenshots on GitHub by browsing the auto-generated markdown index, or check out audio-visual compositions on Instagram.
Whether you're curious about generative art, interested in audio-visual performance, or just learning Rust like I am, this project might be worth exploring. Originally started as a port of my p5.js project, it's grown into a surprisingly capable framework that handles the tedious parts of creative coding - like DAW synchronization, hot-reloading configurations, and multi-channel audio processing. While I'm still learning Rust best practices, the project offers some useful features for anyone wanting to experiment with algorithmic art, especially if you're interested in synchronizing visuals with music. It's set up to work with MIDI controllers and clock, OSC, audio input, and even shader programming, making it a fun playground for creative coding experiments.
- Export images and capture mp4 videos with the press of a button
- Declarative animation interface with times specified in musical beats, e.g. `3.25` represents a duration of 3 beats and one 16th note; `4` means 4 beats or 1 bar (see the sketch after this list)
- Sync animations to BPM and frame count, MIDI clock, MIDI Time Code, or OSC
- Automate parameters with MIDI CC, OSC, CV, or audio, with peak, RMS, and multiband mechanisms all available through a dead simple API
- Write animations in code or configure your sketch to use an external yaml file that can be hot-reloaded (similar to live coding - see Control Scripting)
- Declarative per-sketch UI control definitions with framework agnostic design
- Automatic store/recall of per-sketch GUI controls/parameters that can be source controlled
- Hot reloadable WGSL shaders with various templates to simplify setup
- Store snapshots of all GUI, MIDI, and OSC controls by pressing `Shift+Number` to save and `Cmd+Number` to recall. Snapshots are interpolated to/from at a configurable musical length, from a 1/16th note up to 4 bars. Great for live performance!
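For a concrete sense of the beat-based timing mentioned above, here is a minimal sketch of the underlying arithmetic (not Lattice's actual animation API, just how a beat count like `3.25` maps to wall-clock time and frames at a given BPM and frame rate):

```rust
/// Convert a duration in musical beats to seconds at the given BPM.
/// One beat lasts 60.0 / bpm seconds, so 3.25 beats at 120 BPM = 1.625s.
fn beats_to_seconds(beats: f32, bpm: f32) -> f32 {
    beats * 60.0 / bpm
}

/// Convert a duration in beats to a frame count at the given frame rate.
fn beats_to_frames(beats: f32, bpm: f32, fps: f32) -> f32 {
    beats_to_seconds(beats, bpm) * fps
}

fn main() {
    // 3 beats plus a 16th note, as in the feature list above.
    println!("{}", beats_to_seconds(3.25, 120.0)); // 1.625 seconds
    println!("{}", beats_to_frames(4.0, 120.0, 60.0)); // 120 frames = 1 bar at 120 BPM
}
```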
To help myself debug I made an on-screen "virtual midi controller" app. This should really be its own project but since I had all of my own MIDI handling code along with nannou and egui at my fingertips, why not?
Run with `cargo run --release --bin virtual_midi_controller`.
TODO: hi-res MIDI
This project has been developed on macOS; I have no idea how it runs on other platforms. You will need:
- Rust
- (optional) just for running commands
- (optional) ffmpeg available on your path for video exports
```sh
cargo run --release -- <sketch>
# or alternatively
just start <sketch>
```
Where `sketch` is a file in the src/sketches folder (without the extension), registered in src/sketches/mod.rs as well as [src/runtime/app.rs][app].
Optionally you can pass a `timing` argument after the required `sketch` argument to specify what kind of timing system will be used to run animations on sketches that support it (this is a new feature, so not all sketches use this run-time `Timing` mechanism yet). Available options include:
Uses Lattice's internal frame system. This is the default and doesn't require any external devices to run.
Requires assets/L.OscTransport.amxd to be running in Ableton Live. This provides the most reliable syncing mechanism as Ableton does not properly send MIDI SPP messages and doesn't support MTC. See the OSC section for more details.
Uses MIDI clock and MIDI Song Position Pointers (SPP) to stay in sync (e.g. when a MIDI source loops or you jump to somewhere else in a timeline, your animations will jump or loop accordingly). Bitwig properly sends SPP; Ableton does not.
Uses a combination of MIDI clock (for precision) and MIDI Time Code (MTC) to stay in sync. This is useful for DAWs that don't support sending SPP but do support MTC. Ableton, for example, does not support MTC but you can work around that with https://support.showsync.com/sync-tools/livemtc/introduction
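As a rough illustration of how MIDI-based timing resolves to beat positions (this is general MIDI arithmetic, not Lattice's implementation): MIDI clock runs at 24 pulses per quarter note, and Song Position Pointer counts 16th notes (6 clocks each), so both can be converted to beats, which is what lets animations jump or loop along with the transport:

```rust
/// MIDI clock sends 24 pulses per quarter note.
const CLOCKS_PER_QUARTER: f64 = 24.0;

/// Convert a running MIDI clock tick count to a position in beats.
fn clock_ticks_to_beats(ticks: u64) -> f64 {
    ticks as f64 / CLOCKS_PER_QUARTER
}

/// Song Position Pointer is expressed in "MIDI beats" (16th notes),
/// so 4 SPP units equal one quarter note.
fn spp_to_beats(spp: u16) -> f64 {
    spp as f64 / 4.0
}

fn main() {
    assert_eq!(clock_ticks_to_beats(96), 4.0); // one bar of 4/4
    assert_eq!(spp_to_beats(16), 4.0);         // a jump to the start of bar 2
}
```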
- Copy the template sketch into a new file in the sketches folder.
- Rename at a minimum the `SKETCH_CONFIG.name` field at the top of the file:

  ```rust
  pub const SKETCH_CONFIG: SketchConfig = SketchConfig {
      name: "template", // <-- RENAME THIS!
      // ...
  };
  ```
- Add that filename to src/sketches/mod.rs
- Add that sketch module to the `register_sketches` call in [src/runtime/app.rs][app]
- Run that sketch via the command line with `cargo run --release -- <name>` or `just start <name>`, where `name` is what you put in your file's `SKETCH_CONFIG.name` field
The Audio struct in Lattice is hardcoded to read audio from the first input (index 0) on a device named "Lattice" (this can be changed by editing the `AUDIO_DEVICE_NAME` constant in src/config.rs). I am currently doing this via an Aggregate Device on my Mac using Blackhole 2ch to capture output from my DAW. Here are some screenshots of the setup:
Note that Blackhole automatically routes its output channels to its own inputs, so audio sent out to Blackhole 3/4 will automatically appear on inputs 1/2 in this setup. You don't even need to configure the inputs in Ableton at all for this to work, as long as you have the output config set to "Lattice" and enable the appropriate outputs under Live's audio preferences.
See audio_dev.rs for an example sketch.
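For a rough idea of what locating a named input device and reading its first channel involves, here is a standalone sketch using the cpal crate (version 0.15 API assumed). This is not Lattice's actual Audio struct; only the device name follows the description above:

```rust
use cpal::traits::{DeviceTrait, HostTrait, StreamTrait};

const AUDIO_DEVICE_NAME: &str = "Lattice";

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let host = cpal::default_host();

    // Find the input device whose name matches the configured constant.
    let device = host
        .input_devices()?
        .find(|d| d.name().map(|n| n == AUDIO_DEVICE_NAME).unwrap_or(false))
        .ok_or("audio device not found")?;

    // Assumes the device's default input format is f32.
    let config = device.default_input_config()?.config();
    let channels = config.channels as usize;

    // Samples arrive interleaved; keep only the first channel (index 0).
    let stream = device.build_input_stream(
        &config,
        move |data: &[f32], _: &cpal::InputCallbackInfo| {
            let peak = data
                .iter()
                .step_by(channels)
                .fold(0.0f32, |acc, s| acc.max(s.abs()));
            println!("channel 0 peak: {peak}");
        },
        |err| eprintln!("stream error: {err}"),
        None, // no timeout
    )?;

    stream.play()?;
    std::thread::sleep(std::time::Duration::from_secs(5));
    Ok(())
}
```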
Similar to the above, except each audio channel is treated as an individual control signal with optional slew limiting, suitable for audio-rate or control-rate signals. Lattice is configured to use an audio device named "Lattice16"; on my computer I'm using the 16-channel version of Blackhole.
In the above setup I use 1-2 as the main outs and send the multichannel data out to channels 3-18, which then appear on Blackhole channels 1-16.
See audio_controls_dev.rs or cv_dev.rs for an example that uses CV.
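Slew limiting here just means capping how fast a control signal can change from one sample to the next, which smooths steppy CV or jittery audio-derived values. A minimal per-sample version (not Lattice's implementation) might look like:

```rust
/// Limits how quickly a control signal can rise or fall per sample.
struct SlewLimiter {
    current: f32,
    /// Maximum allowed change per sample; smaller values glide more slowly.
    max_delta: f32,
}

impl SlewLimiter {
    fn new(max_delta: f32) -> Self {
        Self { current: 0.0, max_delta }
    }

    /// Move toward `target`, but never by more than `max_delta` per call.
    fn process(&mut self, target: f32) -> f32 {
        let delta = (target - self.current).clamp(-self.max_delta, self.max_delta);
        self.current += delta;
        self.current
    }
}

fn main() {
    let mut slew = SlewLimiter::new(0.25);
    // A hard jump from 0.0 to 1.0 arrives gradually: 0.25, 0.5, 0.75, 1.0
    for _ in 0..4 {
        println!("{}", slew.process(1.0));
    }
}
```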
Lattice is hardcoded to accept MIDI on a device named `IAC Driver Lattice In`. You can change this by editing the `MIDI_INPUT_PORT` constant in src/config.rs.
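For a rough idea of what listening on that port involves, here is a standalone sketch using the midir crate (not Lattice's own MIDI handling; only the port name matches the default described above):

```rust
use midir::MidiInput;

const MIDI_INPUT_PORT: &str = "IAC Driver Lattice In";

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let midi_in = MidiInput::new("lattice-example")?;

    // Find the input port whose name matches the configured constant.
    let port = midi_in
        .ports()
        .into_iter()
        .find(|p| {
            midi_in
                .port_name(p)
                .map(|name| name.contains(MIDI_INPUT_PORT))
                .unwrap_or(false)
        })
        .ok_or("MIDI input port not found")?;

    // Print incoming CC messages; CC status bytes are 0xB0-0xBF.
    let _connection = midi_in.connect(
        &port,
        "lattice-example-in",
        |_timestamp, message, _| {
            if message.len() == 3 && (message[0] & 0xF0) == 0xB0 {
                println!("CC {} = {}", message[1], message[2]);
            }
        },
        (),
    )?;

    std::thread::sleep(std::time::Duration::from_secs(30));
    Ok(())
}
```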
To automate synth parameters in Ableton and Lattice parameters simultaneously from the same UI CC control in Live (as opposed to a physical control, in which case you can skip this section), you need to enable MIDI loopback by sending MIDI to `Lattice In` and also routing `Lattice In` back into Live to control parameters. Here's the routing:
To use Ableton automation lanes to control Lattice params, follow these steps:
- Create a MIDI track and clip and add CC automation to it.
- In the track's MIDI To router, select `IAC Driver Lattice In` and `Ch. 1`
Those steps are all you need to send MIDI to Lattice to control parameters. To control a Live parameter with that same CC, follow these steps:
- Play your clip containing the CC data
- Stop the transport (this is important!)
- Enter MIDI Mapping mode.
- Locate the parameter you want to map and select it (make sure it's the last thing you've clicked)
- Press the Space bar to start the transport. This should do it!
See the midi_test.rs sketch for an example of how to map a control to something.
Note: the above instructions are for working without a MIDI controller. When working with a MIDI controller you can just map the MIDI control to an Ableton device knob that can send CC out to Lattice and also map the controller to an Ableton parameter. In this case you do not want Lattice enabled in Ableton's MIDI Input ports at all as that just complicates things.
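Whichever way the CC arrives, mapping a 7-bit value onto a parameter range in a sketch is just a linear scale. A minimal helper (hypothetical, not Lattice's API) looks like:

```rust
/// Map a 7-bit MIDI CC value (0-127) onto an arbitrary parameter range.
fn cc_to_range(value: u8, min: f32, max: f32) -> f32 {
    min + (value as f32 / 127.0) * (max - min)
}

fn main() {
    // e.g. drive a radius parameter between 10.0 and 300.0 from CC data
    assert_eq!(cc_to_range(0, 10.0, 300.0), 10.0);
    assert_eq!(cc_to_range(127, 10.0, 300.0), 300.0);
}
```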
With MIDI ports configured in your DAW to send clock to Lattice, you can already perfectly sync video recordings with audio from your DAW. Below are steps to set up Ableton Live so that audio and video are recorded simultaneously when you press Play in the DAW (if you only want to record video, you can just do steps 2 and 4):
- In Ableton > Preferences > Record, make sure Start Transport With Record is set to Off
- Hit Q Rec in Lattice.
- Arm tracks in Ableton, arm the transport (Record button)
- Now, pressing Play in Ableton will also initiate recording in Lattice; likewise, pressing Stop in Ableton will stop recording in Lattice.
While MIDI is great for controlling parameters when a controller can send 14-bit high-resolution MIDI, it sucks otherwise (128 values just isn't enough precision for smooth parameter automation). For this reason Lattice supports OSC and comes with two MaxForLive devices designed to make integration with Ableton Live simpler.
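For context on that resolution gap: 14-bit MIDI spreads a value across two CCs (an MSB on CC 0-31 and its paired LSB on CC 32-63), giving 16,384 steps instead of 128. Combining the pair is simple arithmetic (general MIDI math, not Lattice code):

```rust
/// Combine a 14-bit MIDI CC pair (MSB plus its paired LSB) into one value.
/// A single 7-bit CC gives 128 steps; the 14-bit pair gives 16,384.
fn combine_14bit(msb: u8, lsb: u8) -> u16 {
    ((msb as u16 & 0x7F) << 7) | (lsb as u16 & 0x7F)
}

/// Normalize the combined value to 0.0..=1.0 for parameter automation.
fn normalize_14bit(msb: u8, lsb: u8) -> f32 {
    combine_14bit(msb, lsb) as f32 / 16383.0
}

fn main() {
    assert_eq!(combine_14bit(0x7F, 0x7F), 16383);
    println!("{}", normalize_14bit(64, 0)); // ~0.5
}
```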
Place this on any track in Ableton and it will send high-precision clock and exact transport location to Lattice. This should be preferred over MIDI timing; however, you should still make sure MIDI ports between Ableton and Lattice are configured properly, as Lattice still depends on MIDI clock for starting, stopping, and syncing video recordings. The default host and port align with what Lattice expects and can be left alone, though you can configure this in src/config.rs.
A super basic OSC value sender. While there are much fancier MaxForLive devices that can send OSC, the "official" OSC Send device that comes with Ableton's Connection Kit does not send high resolution data, which defeats the entire purpose!
Lattice provides various interfaces for controlling parameters, including `Controls` for UI (sliders, checkboxes, and selects), `MidiControls` and `OscControls` for controlling parameters from an external source, `AudioControls` for controlling parameters with audio or CV, and a comprehensive `Animation` module that can tween or generate random values and ramp to/from them at musical intervals. While these parameters are simple to set up, it's a bit of a pain to have to restart the Rust sketch every time you want to change an animation or control configuration. For this reason Lattice provides a `ControlScript` mechanism that uses YAML for configuration, adds these controls dynamically, and self-updates at runtime when the YAML file is changed. You still have to take care to set up the routings in your sketch (e.g. `let radius = model.controls.get("radius")`), but once these routings are in place you are free to edit their ranges, values, timing, etc. See Control Script Test for a working example or docs/control_script_reference.md for comprehensive documentation.
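To make the routing pattern concrete, here is a toy stand-in built around the `model.controls.get("radius")` call quoted above. The types are illustrative only, not Lattice's actual ControlScript API; the point is that the sketch only wires names to parameters, while values, ranges, and timing live in the hot-reloaded YAML file:

```rust
use std::collections::HashMap;

/// Toy stand-in for a hot-reloadable control source keyed by name.
struct Controls {
    values: HashMap<String, f32>,
}

impl Controls {
    fn get(&self, name: &str) -> f32 {
        *self.values.get(name).unwrap_or(&0.0)
    }
}

struct Model {
    controls: Controls,
    circle_radius: f32,
}

/// The sketch references controls by the names defined in the YAML file,
/// so their behavior can change at runtime without touching this code.
fn update(model: &mut Model) {
    model.circle_radius = model.controls.get("radius");
}

fn main() {
    let mut model = Model {
        controls: Controls {
            values: HashMap::from([("radius".to_string(), 150.0)]),
        },
        circle_radius: 0.0,
    };
    update(&mut model);
    println!("radius routed to sketch: {}", model.circle_radius);
}
```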
- https://sotrh.github.io/learn-wgpu
- https://inconvergent.net/generative/
- http://www.complexification.net/
- https://n-e-r-v-o-u-s.com/projects/albums/floraform-system/
- https://www.andylomas.com/cellularFormImages.html
- http://www.complexification.net/gallery/machines/sandstroke/
- https://thebookofshaders.com/
- https://github.com/jasonwebb/2d-space-colonization-experiments
- https://paulbourke.net/geometry/
- https://easings.net/