Pulsating Mirror

Idea:

For this week’s Computational Media assignment I iterated through multiple ideas before deciding to create a mirror that pulsates and changes color with the bass from a song. I wanted to combine both video and sound elements into this project and thought that it would be interesting to see how the live video would respond to sound. I had watched more than a few Coding Train videos about video and sound and began working off of the projects created in the videos.

A lot of the inspiration for the program came from Daniel Shiffman and his coding train videos. The main part being the use of pixel arrays, using vScale, and mapping the pixels. Initially I had planned on using the audio input function of p5’s sound library but decided to switch it to a preloaded sound. Below is a short video of the program.

After doing this initial program I feel like this would work well modified for a live music input and being projected in a club setting. As for an initial piece I think it was a success and would like to expand on it to see what other visually appealing modifications I can make to it.

Click here to try it for yourself(turn your sound on!).

And click here to the full song.

Code:

/*
ICM Homework 10/29/18, Morgan Mueller

This project takes a live video and performs image manipulations 
on it to give a few different results depending on the music being
played.

The first result is that a grayscaled video appears with minorly 
pulsating rectangles. The second is that a colored video appears
with pulsating circles.

The slider at the bottom of the screen increases the size of the 
pixels in the video

*/
let video;

let vScaleSlider1;
let vScaleSlider2;
let vScale = 16;

//load sound info
// FYI for ICM class I called this boring because the DJ's name 
//is DJ Boring
let boring;
let fft;

let slideText;

function preload() {

  // import the song
  boring = loadSound('assets/icmFinal.mp3');

}

function setup() {

  createCanvas(640, 480);
  pixelDensity(1);

  //instantiate sliders
  
  vScaleSlider1 = createSlider(0, 50, 0);
  vScaleSlider1.position(10, 500);
  
  stroke(255);
  let tempText = createElement('p','Pulsation Intensity');
    tempText.position(150, 480);

  video = createCapture(VIDEO);
  video.size(width / vScale, height / vScale);
  video.hide();

  //begin the FFT operations
  fft = new p5.FFT();
  boring.amp(0.7);
  boring.play();

  frameRate(30);

}

function draw() {
  background(51);

  //load the video's pixels
  video.loadPixels();
  loadPixels();

  //analyze the sound 
  let fftSpectrum = fft.analyze();
    //get the energy from the bass
  let boringBass = fft.getEnergy("bass");
    //map the energy from the bass 
  let bassMapped = map(boringBass, 0, 255, 0, vScale + vScaleSlider1.value());
 // console.log(boringMid);

  //iterate through the video in both x and y 
  for (let y = 0; y < video.height; y++) {
    for (let x = 0; x < video.width; x++) {
      //calculation to index through the video's size 
      let index = (video.width - x - 1 + (y * video.width)) * 4;

      //create variables to store pixel values in the video
      let r = video.pixels[index + 0];
      let g = video.pixels[index + 1];
      let b = video.pixels[index + 2];
    
      //average grayscale value of the video
      let bright = (r + g + b) / 3

      //map the brightness 
      let w = map(bright, 0, 255, 0, vScale);
            noStroke();
      
      //if the bass from the song is less than 125 use grayscale
      if (boringBass < 125) {
        fill(bright);
        rectMode(CENTER);
        rect(x * vScale, y * vScale, bassMapped , bassMapped);

        //otherwise set color
      } else {
        fill(r, g, b, random(100, 255));
        ellipseMode(CENTER);
        ellipse(x * vScale, y * vScale, bassMapped , bassMapped );

        slideText;
      }
    }
  }
  
}