KLMC2's step size and friction terms each doing their own independent random walk, while two pairs of style prompts cycle at different rates
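In case "cycling two pairs of style prompts at different rates" is opaque, here's a rough sketch of the scheduling half (the random-walk half is sketched under the next note). The prompts, periods, and the sampling call below are placeholders I made up for illustration, not the actual settings.

```python
# hypothetical prompt pairs; each pair flips between its two members on its own period
pair_a = ("art deco poster", "ukiyo-e woodblock print")
pair_b = ("heavy impasto oil painting", "soft pastel wash")

PERIOD_A, PERIOD_B = 30, 50   # frames between flips, i.e. the "different rates"

for frame in range(300):
    style_a = pair_a[(frame // PERIOD_A) % 2]   # flips every 30 frames
    style_b = pair_b[(frame // PERIOD_B) % 2]   # flips every 50 frames
    # the KLMC2 sampling call would go here, conditioned on [style_a, style_b]
```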
while cycling style prompts, I decided to let a truncated random walk control KLMC2's friction term (if KLMC2 explores by rolling a ball around the latent space, this randomly increases or decreases the ball's stickiness as it rolls). for sure gonna have to play with this more
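Rough sketch of what "truncated random walk" means here: every frame the friction gets a small gaussian nudge and is then clamped to a fixed range so it can't wander off. The starting value, sigma, bounds, and the `sample_frame` call are all invented for illustration, not the real pipeline.

```python
import random

def truncated_walk_step(x, sigma, lo, hi):
    """One gaussian random-walk step, clamped to [lo, hi] (the "truncated" part)."""
    return min(hi, max(lo, x + random.gauss(0.0, sigma)))

friction = 1.0   # the ball's starting "stickiness" (made-up value)
for frame in range(240):
    friction = truncated_walk_step(friction, sigma=0.05, lo=0.2, hi=3.0)
    # higher friction: the ball sheds velocity and lingers near where it is;
    # lower friction: it coasts further each step
    # sample_frame(friction=friction, ...) stands in for the actual KLMC2 call
```

Presumably the point of the clamp is to keep friction from drifting into a regime where the ball either barely moves or never slows down.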
Just one prompt here and a bunch of scalar settings. No keyframes, prompt interpolation, etc. Just a sampler taking a meandering stroll through the neighborhood of a fixed prompt with a weak conditioning weight
got ChatGPT to generate an image prompt based on a "dreamed" ayahuasca hallucination experienced by a synesthete. Process described in thread 1/n