Excellent framing on transparency vs. black-box generation. The distinction between generating outputs and generating behavior is key. I've built similar rule-based systems for visual art in p5.js, and the editability factor changes everything: the artifact becomes a living system, not a frozen asset. The only downside is explaining to stakeholders why the "AI" isn't doing the work, when really the AI just shifted from generator to co-author.
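To make that "living system" point concrete, here's a toy rule-based walker. It's plain JavaScript so the rule logic stands on its own (in an actual p5.js sketch you'd call `step()` from `draw()` and render each point); the names and rule values are just illustrative, not from any particular sketch:

```javascript
// The artifact is the rule set; any image is just one run of it.
// Editing `rules` edits the system's behavior, not a frozen output.
const rules = {
  stepSize: 4,      // editable: how far the walker moves each step
  turnChance: 0.3,  // editable: probability of a 90-degree turn
};

function makeWalker(x, y) {
  return { x, y, angle: 0 };
}

// One tick of behavior. `rng` is injectable so runs can be reproduced.
function step(walker, rng = Math.random) {
  if (rng() < rules.turnChance) {
    walker.angle += Math.PI / 2; // turn 90 degrees
  }
  walker.x += rules.stepSize * Math.cos(walker.angle);
  walker.y += rules.stepSize * Math.sin(walker.angle);
  return walker;
}

// An "output" is just a trace of the behavior over time.
function trace(steps, rng = Math.random) {
  const walker = makeWalker(0, 0);
  const points = [];
  for (let i = 0; i < steps; i++) {
    step(walker, rng);
    points.push([walker.x, walker.y]);
  }
  return points;
}
```

The point is that a stakeholder (or the AI) can change one number in `rules` and get a genuinely different system, which is exactly what a frozen generated image can't offer.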
Love that point—and totally agree. 🙌
Once you frame the artifact as a "living system" rather than a frozen output, the value shifts from "what did the AI make?" to "what kind of machine did we design?"
For stakeholders, I've found it helps to say the AI moved "upstream": it's no longer the performer, it's the "instrument builder" and "thinking partner". The work isn't less AI-driven, it's just more legible, editable, and durable. That usually clicks.
We've built hundreds of creative sketches (art, science, games, and more) with p5.js; you can explore them here:
https://buildingcreativemachines.substack.com/p/explore-and-play