Body Politics: Financial Pressure Monitor

For our Body Politics assignment, Brent and I agreed from the beginning that we were interested in embodied capitalism: how the structures of capital shape and influence the physical body. We did a fair amount of research on this, and found that one of the clearest correlations between the two shows up in blood pressure. There’s a litany of studies showing that socioeconomic status is intimately linked to blood pressure — those born into lower socioeconomic strata, with fewer economic, social, and educational advantages, are significantly more likely to have high blood pressure. Capital doesn’t just influence the phenotype of our lives: it literally goes as deep as our blood.

We wanted to draw attention to this relationship, and to the ways in which our culture seems to encourage it rather than work on any systemic level to improve it. To do that, we created a machine that measures the viewer’s blood pressure and then, using data from this study, recommends the increase in household income that would bring their blood pressure down to the mean (roughly $50,000 in household income per 0.61 mmHg decrease in systolic blood pressure). It invites the viewer to envision a world in which medicine treats the cause rather than the symptom, and to ask what that world ought to look like. Are we all to bootstrap ourselves to success, to find our own way to the income bracket that will make us healthy? Can we make a world where income is distributed more evenly, and, more importantly, can we make a world where income no longer has a causal relationship to health?

Model:

Topic: Embodied Capitalism

Device: Inversion

Mood: Clinical

Attribute: Interactive

The tech:

We used a Withings blood pressure monitor with the Withings HealthMate app. We then built a Node server with Express that gets the data from the Withings API and sends it to a mounted Arduino MKR1010, which parses the data, recommends a household income change, and displays it on an LCD screen. The code is here.
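As a rough sketch (not our full MKR1010 code), the recommendation step on the Arduino side works out like this; the reference mean of 120 mmHg and the function name are placeholders for illustration:

// Recommendation logic only: roughly $50,000 of household income
// per 0.61 mmHg of systolic blood pressure, per the study cited above.
// MEAN_SYSTOLIC is an assumed reference value, not the study's figure.
const float MEAN_SYSTOLIC = 120.0;
const float DOLLARS_PER_MMHG = 50000.0 / 0.61;

long recommendedIncomeChange(float systolic) {
  float excess = systolic - MEAN_SYSTOLIC;
  if (excess <= 0) {
    return 0;                                  // already at or below the mean: no raise prescribed
  }
  return (long)(excess * DOLLARS_PER_MMHG);    // dollars of household income to "prescribe"
}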

Ideation Process:

We went through a long ideation process before getting to this stage. Our first idea was an invasive health monitor: something like a FitBit that monitored, say, your bowel movements, or your sexual activity, to draw attention to the ways in which we give up our personal data — ownership of our own bodies — for capitalistic purposes: as documented, for example, in this TechCrunch article. After discussing this for a while, however, we came to the conclusion that this was a) too gimmicky and b) too broad. Were we critiquing self-tracking culture? Capitalism? Health insurance?

We went back to the drawing board, and decided that what we wanted to communicate, above all, was the influence of capital specifically on the physical body. From our research, one of the clearest ways the two are connected is through blood pressure: it’s a widely documented and studied phenomenon that SES influences hypertension. Our initial idea, as shown in the sketch above, was to use a blood pressure monitor to guess people’s socioeconomic status. This again, however, was too vague a critique. The causal nature of the relationship wasn’t shown clearly enough — in this model, you could prescribe almost anything that correlates with blood pressure and it would have roughly the same effect. This is how we settled on prescribing an income change: it’s our hope that this makes clear the relationship we want to communicate, and the model we want to raise awareness of. Ultimately, the way to build a healthy body under capitalism is to earn more capital: the same incentive structures that motivate everything else have infected the very blood that runs through our veins. A doctor could prescribe an income increase, and it might be as effective as, or more effective than, taking actual medication. And on a systemic level, redistribution of income could resolve a documented public health crisis.

The viewer is invited to draw their own conclusions: what model should we be using to address public health? Capitalism certainly offers a solution, but is it the best one?

Building Process:

We wanted to create an object that wouldn’t seem out of place in a doctor’s office, an alternate version of a normal clinical interaction, so for our design we went for clean forms and simple colors: white and metal. We repurposed a number of objects from Home Depot (socket covers, a pencil holder, and a scientific object stand), spray painted them, and assembled them.

Our biggest struggle here was with the tech. The Withings API, we discovered, has gone through many restructurings as the company has been acquired multiple times — the monitor we used is actually from 2011, so it’s barely supported by the current version of their app. After much trial and error, we were able to authenticate our server with their API and get some simple data, but we were still hamstrung: their API is extremely buggy and doesn’t update very often, so getting live data from the app onto our server was functionally impossible. On top of this, the app is buggy enough that getting it to measure blood pressure is basically a matter of being lucky enough for it to detect the monitor and not throw an error. While all of our tech works, getting through the whole process without having to restart the app succeeds about one time in five. If we were to revisit the project, we’d either have to read the data directly from the monitor or try a different device. Ultimately, it was a fun and difficult experience worth having, but I think we plan on recommending that the program get a newer, better monitor.

Critical Objects: Final Project

Background:

My final piece for Critical Objects deals with the environmental impact of rapidly expanding travel and leisure industries, and with the short-sighted way people view aesthetically beautiful things while harming them. More specifically, it deals with these companies’ persistent attempts to develop Northern Lake Tahoe, and with the people who help perpetuate this attitude.

Local organizations such as Sierra Watch and Keep Tahoe Blue have made it their mission to protect the Sierra Nevada and Lake Tahoe from the various threats to the area. One such threat, and the one I am focusing on, is KSL Capital Partners. KSL purchased Squaw Valley in 2010 thanks to a series of mishaps in Placer County’s board approval of real estate development in the area. KSL then began to plan a massive development that would drastically change the region in terms of congestion, environmental pollution, and degradation of the natural landscape for leisure development.

Along with large companies, individuals cause these environmental issues as well. One example is the recent poppy super bloom in California, which many people are visiting to photograph. In doing so, many of them are destroying the poppies by stepping on them.

The piece I have created focuses on the immediate aesthetic appeal at the center of many of these situations. For example, the people who take photos of themselves in the poppies often do so knowing they are harming the flowers. To them, the immediate aesthetic enjoyment is more important than the long-term preservation of the environment. I believe this concept translates to what is going on in Lake Tahoe with KSL Partners: many of the people who will benefit from this expansion are not local to Tahoe and only go for ski vacations and other leisure activities.

Framework:

Topic: Expansion of Lake Tahoe/disrupting nature and the short-sightedness of humans.

Attribute: disruptive

Device: Metaphor

Mood: confusion

Constraints:

PDLC Film

No Acrylic

Execution:

This piece takes the form of a body of water, in this case meant to be representative of Lake Tahoe. When a person comes to view the tank, they are detected and a dark liquid is poured into the tank; the longer they look, the longer the liquid pours. The viewer is made aware that the liquid is not good for the water, but it creates an interesting dispersion effect that is enjoyable to watch. The criticality conveyed is that people can choose to enjoy the immediately interesting and attractive aspects of things, but need to accept that in some instances doing so is more destructive than leaving those processes or environments alone.
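As a rough illustration of that interaction (not the piece’s actual code), a presence sensor could gate the pour so the liquid only flows while someone is watching; the PIR sensor and pin numbers here are hypothetical:

const int viewerPin = 2;   // hypothetical PIR/presence sensor
const int pumpPin   = 3;   // hypothetical relay driving the pump of dark liquid

void setup() {
  pinMode(viewerPin, INPUT);
  pinMode(pumpPin, OUTPUT);
}

void loop() {
  // pour only while a viewer is detected in front of the tank:
  // the longer they look, the longer the liquid runs
  digitalWrite(pumpPin, digitalRead(viewerPin) == HIGH ? HIGH : LOW);
}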

00100lPORTRAIT_00100_BURST20190513135908011_COVER.jpg
IMG_20190513_125918.jpg

Big LEDs: Final Installation Design

My proposal for the LED installation for an event stairwell is as follows.

1. Have LED tape run along the length of each cut in the handrail (33.9175’ x 2).

a. These lights will have linear movement in the direction people walk when they go down the stairs.

b. Color would be a variation of this palette:

2. There will also be lights along the base of each stair, directed towards the bottom of the staircase (7’-9’’ x 24 strips).

a. These lights will have various color gradient patterns running, and the gradient will react depending on whether there is a person in their proximity (a rough sketch of this reactive logic follows the list).

3. Along the middle landing I propose two additional strips, on the floor, running parallel to the direction the person will be walking.

a. These strips will match the color patterns of the lights at the bottom of the stairs, but they will react differently when a person steps on the landing.
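The installation itself would be driven over DMX from MadMapper, but as a sketch of the reactive-gradient idea from item 2a, here is roughly how it could look if a strip were driven directly from a microcontroller with FastLED; the strip length, pins, presence sensor, and colors are all placeholders:

#include <FastLED.h>

const int NUM_LEDS   = 60;   // placeholder strip length
const int DATA_PIN   = 6;    // placeholder data pin
const int SENSOR_PIN = 2;    // placeholder presence sensor
CRGB leds[NUM_LEDS];

void setup() {
  FastLED.addLeds<WS2812B, DATA_PIN, GRB>(leds, NUM_LEDS);
  pinMode(SENSOR_PIN, INPUT);
}

void loop() {
  bool personNearby = (digitalRead(SENSOR_PIN) == HIGH);
  // shift the gradient toward warmer colors when someone is near the stair
  CRGB from = personNearby ? CRGB::OrangeRed : CRGB::DeepSkyBlue;
  CRGB to   = personNearby ? CRGB::Gold      : CRGB::Purple;
  fill_gradient_RGB(leds, NUM_LEDS, from, to);
  FastLED.show();
}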

Plan View:

IMG_20190508_132823.jpg

Section View:

IMG_20190508_132838_1.jpg

Riser Diagram:

Final Project Proposal: Lake Tahoe Expansion and Environmental Impact Critique

Background:

For my Critical Objects final I am proposing a piece that deals with the environmental impact of rapidly expanding travel and leisure industries. More specifically, it deals with these companies’ persistent attempts to develop Northern Lake Tahoe.

map-lake-tahoe-area.jpg

Local organizations such as Sierra Watch and Keep Tahoe Blue have made it their mission to protect the Sierra Nevada and Lake Tahoe from the various threats to the area. One such threat, and the one I will be focusing on, is KSL Capital Partners. KSL purchased Squaw Valley in 2010 thanks to a series of mishaps in Placer County’s board approval of real estate development in the area. KSL then began to plan a massive development that would drastically change the region in terms of congestion, environmental pollution, and degradation of the natural landscape for leisure development.

With this in mind, the critical object I want to create will focus on the negative implications the potential development will bring in terms of accessibility, exclusivity, and water clarity. By accessibility I mean the way the development will make the area even more difficult to reach because of traffic congestion and tourism. By exclusivity I mean that these developments only add to rising prices in Tahoe, which have been pricing local residents out of the area, and to the rising cost of access to the various ski resorts. Clarity refers to the clarity of the lake: over the past 30 years Tahoe has been losing its water clarity to pollution. Much of that pollution comes from inorganic sediment entering the lake through various means, but the added car traffic and habitation from KSL’s planned development would be a major factor as well.

SecchiDiskChart2017.jpg

Inspiration:

Inspiration for this piece comes from a few different sources. Philipp Schmitt’s Camera Restricta, for one, made me think about possibilities for working with the exclusivity concept.

restricta-camera.jpg

Other devices inspiring this piece are those built for on-the-ground protesting, whether lock-on devices or pieces built for group protests.

I am still not completely sure what I want the physicality of the piece to entail, but I know that I want to push it towards the disobedient-object end of the critical spectrum. It is important for me to take a step back from the type of objects I have been making this semester and make something that is less about a system and more of a purposeful device or object.

Framework:

I will be working alone on this project.

Topic: Over-expansion of North Lake Tahoe and Squaw Valley

Attribute: disruptive

Device: Metaphor

Mood: annoyance

Constraints:

Must use PDLC film, and must not use acrylic or other environmentally unfriendly materials. The reasoning behind these constraints is that PDLC film works with the idea of clarity and forces me down that route, and since the piece is about environmental damage and expansion, I don’t want to use materials that harm the environment (to a reasonable extent), so I won’t be using any acrylic for this piece.

DMX Driven LEDs

For this Big LEDs assignment we were put in groups of three and given instructions to use the leDMX4 PRO, a Mean Well power supply, and MadMapper to drive a roll of LED tape in whatever configuration we wanted. Kemi, Adrian, and I worked together to complete this.

Screen Shot 2019-04-24 at 5.32.23 PM.png

We decided to arrange the LED strip into a bowl and diffuse it with an interesting material to see what type of effect we would get.

We ran into a few different issues during the initial setup of the DMX controller and connecting it to MadMapper. The overall issue was getting it to connect to the computer’s network. After trying a few different dongles and converters, we were able to get it working with an Ethernet-to-Thunderbolt adapter. The rest of the process was fairly easy and straightforward.

Overall I felt like this assignment gave me a good intro to DMX and its versatility in controlling lighting elements.

IMG_20190422_131713.jpg
IMG_20190422_141935.jpg

Live Image Processing Final Proposal

For the final LIPP show, Anna and I will be working together to create a performance that takes on the idea of the digitization of humanity and how this could look in an alternative and/or futuristic universe. Techniques used in the performance will include GL, face meshes, pre-recorded and live microscope feeds, and audio manipulation.

The concept combines ideas ranging from organic matter metamorphosing into a digital presence to artificial intelligence and the way these systems are becoming increasingly humanlike.

Below are a few examples of visual styles we are going to try to incorporate into the piece.

The Performance is broken up into four acts:

Act I:

1. Morgan and Anna both at table in middle.

2. Computer Voice: “Baseline test has now begun” (something similar signifying test on organic life has taken place).

3. Morgan taking samples from both himself and Anna. Anna helping hand him petri dishes, etc.

4. Max visuals of organic life in petri dishes on projector.

Act II:

1. Start of distortion in cells (shown through visuals on screen).

2. Computer Voice: “Anomaly Detected”.

3. Cell distortion continues & increases.

4. Narrative Baseline test of repeating words, answering questions between Morgan and Anna with both of them saying different answers. Audience is unsure of which one is the anomaly (or which answer is the right one to the test).

Act III:

1. Computer says one of them did not pass test (reword later).

2. Computer Voice: “Warning, infected subject is highly contagious”.

3. Fade out of microscope (cell) visuals.

4. Anna and Morgan’s faces side by side on each screen. No distortions yet.

5. Anna and Morgan’s faces both start to distort (openGL face mesh, and other manipulation techniques).

Act IV:

1. Anna and Morgan’s faces start to merge together and distort into one abstract entity.

2. Abstract form/shapes show up on the screen (shaders).

3. Computer Voice: saying something about test being resolved. (the abnormal/non human won).

4. END.

Deep Web: Reflections on the functions and quality of light

For this assignment I decided to analyze the functions and qualities of light in the kinetic audiovisual installation and performance Deep Web, by the design studio WhiteVoid.

Composition:

The composition of the installation can be broken down into a few different structures, the main two being the large LED bulb system hanging above the viewers and the laser arrays at the sides of the room. The bulb system moves in various synchronous patterns while being pointed at by the lasers, which direct their beams at the LED bulbs and change color and direction in reaction to the score.

Visibility:

Because the installation hangs above the crowd, it allows for unique viewing perspectives from different locations throughout the large venue. I imagine that being underneath the piece while it moves, focusing on the bulbs, would be hypnotizing in its own way, and if you move to the edge of the piece there is an entirely different perspective.

Direction:

Deep Web is modeled in a way that seems as though it would make the most sense to stand immediately below it, but moving around and seeing other perspectives of the piece allows the viewer a fuller view and appreciation of it.

Focus:

The audience’s attention is focused above them the entire time, constantly redirected from the moving LED array to the flashing, moving lasers. I can imagine this having the added effect of letting people immerse themselves in the piece more easily than if they were looking horizontally in front of them.

Mood:

For me, the mood of the piece, from the light’s perspective, follows its narrative: it begins very minimally and progresses into more extravagant displays over the course of the performance. From this, the mood is able to change in intensity and lightness. The audio element adds to the overall mood and atmosphere of the piece.

Critical Objects Midterm: Instagram Accountability System

Guidelines:

For the Critical Objects Midterm assignment students were given the option to choose any topic to make a project from. I decided to do the following:

Topic: Voluntourism (unsustainable volunteering)

Device: Juxtaposition

Attribute: Interactive

Mood: Surprising

IMG_20190325_165420.jpg

The piece I decided to make was based on statistics showing how unsustainable volunteering, or voluntourism, has long-term negative impacts on the communities affected. Some of these impacts include economic losses and high turnover at orphanages.

I wanted to highlight how many people who practice this type of tourism post their activity on social media; in doing so, they help perpetuate this kind of behavior and thoughtlessness. When people like posts that show this activity, they encourage the poster to continue down that path.

Taking the idea that social media has an impact on this behavior, I created the Instagram Accountability System. I created an Instagram page that aggregated a few photos of unsustainable volunteer activity, and when any of the photos in the account was liked enough times, a brick would fall and break a clay plate. In this case the brick, traditionally thought of as an object that helps build, is used to destroy an object.

Screenshot_20190401-024548.png

Technical Elements:

The first part of the project was writing the client that connects to the Instagram API. Initially I planned to do the entire project in Arduino to minimize any unnecessary coding or data sending. After a week of attempting to use the Arduino on its own, I decided to write a separate client to stream the Instagram data to a Node server I had written; from there, the data would be sent to the Arduino, where it would be parsed and used to control a pulley system.

The biggest challenges in the programming portion were the initial attempts at doing everything on the Arduino, which was difficult and at times frustrating. Another challenge was sending the Instagram data back to the Arduino: the number of likes had to be sent as a string, but couldn’t be uploaded to my server that way, and figuring out a solution took a good amount of time. The final challenge was dealing with the string once it reached the Arduino. Converting a string that is an array of numbers was massively annoying, and in the end I wasn’t able to do it completely; I had to take the hex values of each character and compare them to control the motor pulley system I had built. (A sketch of one way the conversion could be done is below.)
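This isn’t the code I ended up using, but one way to turn a response like "[12,3,45]" into integers with the Arduino String class could look roughly like this; the function and parameter names are made up for illustration:

// Turn a response like "[12,3,45]" into an array of ints.
int parseLikes(String response, int likesOut[], int maxLikes) {
  response.replace("[", "");
  response.replace("]", "");
  int count = 0;
  while (response.length() > 0 && count < maxLikes) {
    int comma = response.indexOf(',');
    String token = (comma == -1) ? response : response.substring(0, comma);
    likesOut[count++] = token.toInt();                      // String::toInt() does the conversion
    response = (comma == -1) ? "" : response.substring(comma + 1);
  }
  return count;                                             // number of like counts parsed
}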

The initial pulley system I built used a servo with a blade attached to cut a rope, but it took too long to cut the rope and was unreliable, so I switched to a single-throw pin-pull system, which worked much better.

Fabrication:

The fabrication portion took the least amount of time. Most of it involved ensuring that the brick would be able to hang safely and not break the system. A good amount of laser cutting, metal cutting, and wood gluing went into the final product.

Final Thoughts:

Overall, I am happy with how the piece turned out. I think the level of abstractness works for it and the concept is solid. If I were to iterate on it I would definitely redesign the look of the enclosure and possibly what was being crushed. I learned a lot from this project and I think I hit a spot pretty close to the middle of the Critical Triangle.

Here’s a video of the piece in action:

Arduino Code:

/*
 ITP Critical Objects Midterm Spring 2019
 Morgan Mueller
 Get request from Node Server and parse the string sent over, then control motor.
*/
#include <ArduinoHttpClient.h>
#include <WiFiNINA.h>
#include "arduino_secrets.h"
#include <Servo.h>

///////please enter your sensitive data in the Secret tab/arduino_secrets.h
/////// Wifi Settings ///////
char ssid[] = SECRET_SSID;
char pass[] = SECRET_PASS;

char serverAddress[] = "10.17.79.241";  // server address
int port = 8000;

WiFiClient wifi;
HttpClient client = HttpClient(wifi, serverAddress, port);
int status = WL_IDLE_STATUS;
boolean forward = true;

Servo myServo;
int angle;
int increment;

int ControlPin = 3;   //give your arduino pin a name


unsigned long ts = millis () ;   // time accounting.
#define DELAY 20

void setup() {
  Serial.begin(9600);
  while ( status != WL_CONNECTED) {
    Serial.print("Attempting to connect to Network named: ");
    Serial.println(ssid);                   // print the network name (SSID)

    // Connect to WPA/WPA2 network:
    status = WiFi.begin(ssid, pass);
  }

  ///////// Servo code /////////////
  myServo.attach(2);             // servo signal pin

  pinMode(ControlPin, OUTPUT);   // initialize the motor control pin as an output

  // print the SSID of the network you're attached to:
  Serial.print("SSID: ");
  Serial.println(WiFi.SSID());

  // print your WiFi shield's IP address:
  IPAddress ip = WiFi.localIP();
  Serial.print("IP Address: ");
  Serial.println(ip);
}

void loop() {


  Serial.println("making GET request");
  client.get("/");

  // read the status code and body of the response
  int statusCode = client.responseStatusCode();
  String response = client.responseBody();

  // strip the JSON array brackets from the response
  int opening = response.indexOf('[');
  int closing = response.indexOf(']');
  response.remove(opening, 1);
  response.remove(closing);

  // strip out the commas so only the digit characters remain
  for (int i = 0; i < response.length(); i++) {
    int comma = response.indexOf(',');
    if (comma == -1) break;
    response.remove(comma, 1);
  }



  Serial.print("this is the first position of response: ");
  Serial.println(response[0]);

  int testRes2 = response[1];   // second character of the response, kept for the debug output below
  // 51 is the ASCII value of '3': trigger the drop when the first character of the response matches
  if (response[0] == 51 ) {
    Serial.println("SPIN TO WIN");

    digitalWrite(ControlPin, HIGH); // turn the motor on by making the voltage HIGH
  }


  Serial.println(testRes2);
  Serial.print("Status code: ");
  Serial.println(statusCode);
  Serial.print("Response: ");
  Serial.println(response);
  Serial.println("Wait five seconds");
  delay(5000);
}

Javascript Code:

var express = require('express');       // include express.js
var server = express();           // a local instance of it

var likes = [];   // array of like counts, one per recent post

var newBody;

var url = 'https://api.instagram.com/v1/users/self/media/recent/?access_token=12161955647.a06c9e9.3d22233be15b4b72822cba3386950c3b&likes';


const request = require("request");


//----------------------------------------------------------- Handles Instagram

// this checks the API and populate the # of likes
function checkLikes(){

  request.get(url, (error, response, body) => {


    try {
      let json = JSON.parse(body);
      if (json && json.data) {
        likes = [];
        json.data.forEach(function(element) {
          likes.push(element.likes.count)
        });
      }
    } catch (error) {
      console.log("OMG ERROR", error)
    }

  });
}
// Check the API every **** ms



//----------------------------------------------------------- Handles Arduino

// this runs after the server successfully starts:
function serverStart() {
  var port = this.address().port;
  setInterval(checkLikes, 30000);
  console.log('Server listening on port '+ port);
}

// this is the handler for the root of the site:
function getRoot(request, response) {
    response.json(likes);                   // send # of likes back to the client
    // response.end();                  // close the connection
}

// start the server:
server.listen(8000, serverStart);
server.get('/', getRoot);                           // GET the root of the site

Digital Meiosis, First Performance

Last week was our first performance for Live Image Processing. When I was thinking about the piece I wanted to perform I wasn’t really sure what direction I wanted to go in. For previous assignments I had made things that were visually interesting to me but not really meaningful.

After talking with Matt about my concerns with the performance, he gave me some good advice about experimenting and reminded me that it didn’t necessarily have to be meaningful. I began to experiment with a few different concepts, the first using stock political footage and voiceover from various politicians. This idea didn’t really take me anywhere interesting, so I decided to switch to old home video footage, which eventually led me to footage of cells splitting under a microscope.

From here I started working on a piece that began as a very organic process and, over time, took the form of something extremely inorganic and geometric. Overall I was pretty happy with how the piece turned out given the short performance window. I enjoyed the element of performing the piece, and it made me more excited for the next two performances. Below is a test run-through of the piece. It varies from the final performance, but it follows most of the same timing and structure.

Below are some screenshots of the patch and the presentation view. The patch was broken into four different sections, and I crossfaded between each section except for one.

Screen Shot 2019-03-04 at 3.38.38 PM.png
Screen Shot 2019-03-04 at 3.38.50 PM.png

Predictive Policing Awareness Machine

For this assignment we were tasked with choosing a topic to design and build a critical object around. Teams were also given sets of requirements for the piece. The piece Gilad and I made was required to use an organic element, convey surrealism, and evoke anxiety.

After a lot of brainstorming we decided to work around the topic of systemic bias, more specifically predictive policing and its use of artificial intelligence. This piece from ProPublica was eye-opening to us and helped shape the framework of the awareness machine we created.

The critical object we created was based on the idea of a feedback cycle in which data with an innate bias is continually fed into a system whose output seems uncontaminated but contains remnants of the biased data.

The system we created is an organic representation of those black-box systems. A dropper of durian essence (an extremely pungent fruit smell that can fill a room quickly) falls onto a flower, which acts as the output of the system. The liquid creates an odd sensation: a flower usually associated with beauty and a pleasant smell is now revolting. The final piece of the system is that the flower has a chance of falling into a blender and being chopped up. This is partially to make a statement about how the American criminal justice system can be a crapshoot, and to say that it can have permanent effects on those who fall into it for one reason or another.

We wanted the final design of the piece to be minimalistic, with an overall sterile look, the thinking being that viewers would associate this look with systems of various kinds.

The technology behind the piece was fairly simple: using a finite state machine written in Arduino code, we were able to asynchronously control a high-torque servo, and we used a PowerSwitch Tail to control the blender. Everything was controlled by the Arduino. (A rough sketch of that kind of state machine is below.)
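This is only a minimal sketch of the millis()-based state machine pattern, not our actual piece code; the pins, timings, and the 30% chance of blending are placeholders rather than the values we tuned:

#include <Servo.h>

enum State { WAITING, DROPPING, BLENDING };
State state = WAITING;
unsigned long stateStart = 0;

Servo dropperServo;
const int blenderPin = 4;          // placeholder pin driving the PowerSwitch Tail

void setup() {
  dropperServo.attach(9);          // placeholder servo pin
  pinMode(blenderPin, OUTPUT);
  stateStart = millis();
}

void loop() {
  unsigned long elapsed = millis() - stateStart;
  switch (state) {
    case WAITING:                                  // idle until the next cycle
      if (elapsed > 10000) {
        dropperServo.write(90);                    // tip the dropper of durian essence
        state = DROPPING;
        stateStart = millis();
      }
      break;
    case DROPPING:                                 // let the drop fall, then maybe blend
      if (elapsed > 2000) {
        dropperServo.write(0);                     // return the dropper
        if (random(100) < 30) {                    // the flower has a chance of being blended
          digitalWrite(blenderPin, HIGH);
          state = BLENDING;
        } else {
          state = WAITING;
        }
        stateStart = millis();
      }
      break;
    case BLENDING:                                 // run the blender briefly, then reset
      if (elapsed > 3000) {
        digitalWrite(blenderPin, LOW);
        state = WAITING;
        stateStart = millis();
      }
      break;
  }
}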

Some pieces were laser cut to help add to the design.

A final video of the working piece can be found below.

Arduino Code:

#include <ArduinoHttpClient.h>
#include <WiFiNINA.h>
#include <SPI.h>


#include "arduino_secrets.h"
///////please enter your sensitive data in the Secret tab/arduino_secrets.h
/////// Wifi Settings ///////
char ssid[] = SECRET_SSID;
char pass[] = SECRET_PASS;
char sessionKey[] = SECRET_KEY;
String MAC = SECRET_MAC;

const char serverAddress[] = "tigoe.io";  // server address
String route = "/data";
// set the content type and fill in the POST data:
String contentType = "application/json";
int port = 443;
int sensorPin = A1;

//temperature values
String tempString, newPost, tempData;


byte mac[6];
WiFiSSLClient wifi;
HttpClient client = HttpClient(wifi, serverAddress, port);
int status = WL_IDLE_STATUS;

void setup() {
  Serial.begin(9600);              // initialize serial communication

  // while you're not connected to a WiFi AP, attempt to connect:
  while ( WiFi.status() != WL_CONNECTED) {
    Serial.print("Attempting to connect to Network named: ");
    Serial.println(ssid);           // print the network name (SSID)
    status = WiFi.begin(ssid, pass);  // try to connect

    }

  WiFi.macAddress(mac);

  // print your WiFi shield's IP address:
  IPAddress ip = WiFi.localIP();
  Serial.print("IP Address: ");
  Serial.println(ip);
  Serial.print("Mac Address: ");
  Serial.println(macToString(mac));

}

void loop() {


    //String tempData;

  int sensorReading = analogRead(sensorPin);

  float voltageReading = sensorReading * 3.3;
  voltageReading /= 1024.0;

  //temperature in Celcius
  float tempC = (voltageReading - 0.5) * 100;

  Serial.print(tempC); Serial.println(" degrees C");    tempData = String(tempC);

  //
  //  Serial.println("making POST request");
  //
  //  // send the POST request
  //  //client.post(path, contentType, postData);

  postData(tempData);



  // read the status code and body of the response
  int statusCode = client.responseStatusCode();
  String response = client.responseBody();

  Serial.print("Status code: ");
  Serial.println(statusCode);
  Serial.print("Response: ");
  Serial.println(response);
  client.stop();    // close the request

  Serial.println("Wait ten seconds\n");
  delay(10000);     // wait before taking and posting the next reading
}



void postData(String newData) {
  // build the nested JSON object for the "data" field
  tempString = "{\"temperature\": \"";
  tempString += newData;
  tempString += "\"}";

  // assemble the full request body and POST it
  newPost = "{\"macAddress\":\"";
  newPost += MAC;
  newPost += "\", \"sessionKey\":\"";
  newPost += sessionKey;
  newPost += "\", \"data\": ";
  newPost += tempString;
  newPost += "}";
  Serial.println(newPost);
  client.post(route, contentType, newPost);
}

void getData() {
  // set the content type and fill in the body:
  String contentType = "application/json";
  // the template for the body of the POST request:
  String body = " {\"macAddress\":\"MAC\",\"sessionKey\":\"KEY\"}";
  // replace the template placeholders with actual values:
  body.replace("MAC", macToString(mac));
  body.replace("KEY", sessionKey);

  // make the request:
  client.beginRequest();
  client.get(route);
  client.sendHeader("Content-Type", "application/json");
  client.sendHeader("Content-Length", body.length());
  client.beginBody();
  client.print(body);
  client.endRequest();
}


String macToString(byte mac[]) {
  String result;
  for (int i = 5; i >= 0; i--) {
    if (mac[i] < 16) {
      result += "0";
    }
    result += String(mac[i], HEX);
    if (i > 0)  result += ":";
  }
  return result;
}

Network Connected Thermostat, trials and accomplishments

The final project for Connected Devices was to create an IoT thermostat that would record the temperature of whatever room it was placed in and send that temperature to a remote server. The thermostat was required to have a working UI and to post the temperature to the server once every hour for a week straight.

After a lot of trial and error I was able to achieve this using the MKR 1010 microcontroller, much assistance from Tom Igoe and Koji, and a few repos listed below.

https://github.com/tigoe/Wifi101_examples/blob/master/ConnDevClient/ConnDevClient.ino

https://github.com/tigoe/MakingThingsTalk2

The initial setup of the MKR1010 took quite a while: making sure the most up-to-date firmware was on the device, adding the MAC address to itpsandbox, and then receiving a static IP address for the device. Once all of this was done and I was assigned a specific session key, I could really begin working.

I decided to use dweet to test that I was sending data correctly. First I tested the server with a static value; once that sent properly, I set up the temperature-reading circuit and sent the temperature data to dweet. Both worked correctly, so I moved on to the real thing. (A minimal version of that test request is sketched below.)
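The test request itself is tiny; a rough sketch of it, with a placeholder thing name, looks roughly like this:

#include <ArduinoHttpClient.h>
#include <WiFiNINA.h>

WiFiClient wifi;
HttpClient dweet = HttpClient(wifi, "dweet.io", 80);

void postTestReading(float tempC) {
  // dweet.io accepts a simple GET with key=value pairs; "my-thermostat" is a placeholder name
  dweet.get("/dweet/for/my-thermostat?temperature=" + String(tempC));
}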

Screen Shot 2019-02-22 at 3.11.27 PM.png
Screen Shot 2019-02-23 at 11.02.48 AM.png

This is the part of the assignment where I wish I could say everything worked perfectly on the first try, but I ended up spending more time than I care to admit trying to fix an error that made little sense to me. Multiple people attempted to help, and it wasn’t until Tom took a look at my code that we realized I had added “https://” to the server address in my call. This created a -2 HTTP status in my code even when everything looked like it was sending to the server properly.
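The fix was a one-line change; ArduinoHttpClient wants the bare hostname, with the connection handled by the SSL client and port:

// const char serverAddress[] = "https://tigoe.io";   // what I had: returns a -2 status
const char serverAddress[] = "tigoe.io";              // what works: WiFiSSLClient on port 443 handles the TLS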

Once this issue was fixed I got my first 400 status. This wasn’t ideal, but I was still elated, because at least I knew it was a JSON formatting error rather than some unknown.

Screen Shot 2019-02-24 at 3.07.31 PM.png

After a lot of tweaking to the JSON I was sending I finally got a 201 status.

Screen Shot 2019-02-24 at 6.57.42 PM.png

Curling the server proved that I was successful and could focus on a bit of fine-tuning and on the UI. I decided not to invest as much time in that aspect, so I stuck to a simple LCD screen and potentiometer combination: the LCD displays the current room temperature in Celsius, and the potentiometer simply turns the temperature display on or off depending on its rotation. (A rough sketch of that behavior is below.)
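Here’s a rough sketch of that on/off display logic, with placeholder pins and the standard LiquidCrystal hookup (not the exact code on the device):

#include <LiquidCrystal.h>

LiquidCrystal lcd(12, 11, 5, 4, 3, 2);   // placeholder pin assignments
const int potPin = A0;

void setup() {
  lcd.begin(16, 2);
}

void showTemperature(float tempC) {
  // treat the upper half of the potentiometer's travel as "display on"
  if (analogRead(potPin) > 512) {
    lcd.setCursor(0, 0);
    lcd.print(tempC);
    lcd.print(" C");
  } else {
    lcd.clear();
  }
}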

IMG_20190225_192227.jpg

Here is a video of the working device:

A link to the code repo can be found here: Link

Additions to Playback System

This week I wanted to get deeper into using newer objects and to see if I could create glitchy distortion effects. At this point I want to push myself to understand why certain effects and objects create such alarming visuals, and to work backwards so that I can bring these types of visuals into my performances subtly rather than in an immediately jarring way.

At this point, the thing I am struggling with most is keeping track of why objects have certain effects when I implement them farther down the patch. The more complex my patch becomes, the less I understand why adding a certain effect, which I think should manipulate the video one way, causes something unexpected and confusing. That, along with keeping track of the data well enough that I can be sure I’m not missing planes, and so on. These things seem like they will take time, so I’m just enjoying the learning process.

For my performance I’ve been thinking about the phrase "I am” and how it relates to the way I have been feeling since starting ITP. The way I have defined myself or just thought about who I am over the past few years seems to be changing constantly and over the past few months even more so. I’m not sure how I would convey that through the performance or even if it would be right for the performance but it’s something I’ve been thinking about. I also like the idea of just manipulating a bunch of videos/photo headshots I’ve taken of people.

My patch was broken up into three main parts. The middle and right parts were a modified version of what I had done last week, updated by adding MSP signals and cleaning up a few of the less optimized areas.

The leftmost section was new. The idea was to make a sort of jumpy glitch effect that I could overlay between recorded video and screen caps of myself. I found a page online that showed how to make a nice glitch effect, but I honestly didn’t understand most of what was going on, so I broke it down and spent some time figuring it out piece by piece. I’m still not totally sure what is happening in it, but I figured out enough to make my own Frankenstein version. Finally, I took all the patches and made a presentation-mode overlay so I could work with them and blend them.

Video Playback System

The assignment for this week was to begin building a video playback system. My goals were to get more comfortable with a lot of the objects and processes we had gone over in class, but also to experiment and see what else was possible.

Most of my issues came from still not being totally sure how a lot of objects work, such as chromakey and xfade, so understanding why the output acts the way it does is a bit difficult. I find the visual programming quite intuitive and fun, though. The major issue I had was that, even using window instead of pwindow, when I loaded videos into the program the framerate dropped significantly and there was a lot of lag. This is why I used live webcam footage for my documentation.

The patch that I created is broken up to four different parts.

The first part of the patch is responsible for unpacking the video’s values and sending the RGB planes out to the second portion. Once they return, they are repacked and sent into a gswitch for toggling between color and black & white. The second portion of the patch manipulates the zoom levels and anchor points of the first live video before sending it back; for this portion, Matt Romein’s sample patches from week 2 were used. Part 3 uses chromakey and rota to manipulate a second video feed. Finally, the fourth part uses xfade to fade the two videos together as the user likes.

Screen Shot 2019-02-14 at 8.18.22 AM.png

Part 1:

Screen Shot 2019-02-14 at 8.26.58 AM.png

Part 2:

Screen Shot 2019-02-14 at 8.25.54 AM.png

Part 3:

Screen Shot 2019-02-14 at 8.27.05 AM.png

Part 4:

Creating a basic Web Interface

This week’s assignment was to create a web interface for an existing web-connected consumer device. In class we specifically went over the Philips Hue bulb system, and that is the device I created the web interface for. I began the assignment by going over the documentation provided to us regarding the Hue system, its assigned IP, and the CLIP API. After getting my login credentials from the CLIP API, I began making basic calls to the Hue bulb, i.e. turning it on and off and changing the colors manually.

Screen Shot 2019-02-11 at 7.01.31 PM.png

After playing around with this system for a bit, I began to focus my attention on creating my own interface. I referenced the hue-control repo from Tom Igoe and the node-hue-api repo from Peter Murray. I had difficulty understanding portions of getting the API to work, so it took a good amount of time to really wrap my head around what was happening. Timothy Lobiak was also extremely helpful, giving me guidance and sharing code snippets on how to create the colorMode sliders and store their values.

I ended up deciding to use p5.js due to my familiarity with its DOM elements, though even then I was a bit rusty with the language. My goal for functionality was to be able to turn the light on and off with a button, and to have three sliders controlling hue, saturation, and brightness, with the bulb changing state in real time depending on the position of each slider.

This is the basic interface functioning.

Thoughts on Chelsea Gallery Viewing

Of all the galleries visited this week, Paul Stephen Benjamin’s Pure, Very, New struck me the most, more specifically the portraits he shot of people and then printed on an extremely black paper. At first glance the photos look like textured paper, but upon further inspection you begin to notice the shape of the subject. As I approached the pieces I was able to see how detailed the subjects were, from the textures of their skin to the detail of their facial expressions. It took me a few minutes to take in the pieces, and it really made me question the process he went through to make them.


It really took being there in person to be able to appreciate those pieces; something about them is lost without seeing them in that form. I felt that way about a majority of this exhibition, though. The mixture of black lights, single-channel TVs, and less technological art gave it a flow that I found hard to describe.

I found myself comparing this exhibit to the Borders exhibit at the James Cohan Gallery. Something about the contrast between artworks dealing with violence, humanity, and social issues and those large-scale black paintings, which made me think of black holes, gave off a feeling of endless trouble and problems for society.

MTA Service Disruptor

For Critical Objects assignment 2 we were required to create a disobedient object, meaning an object that serves as a form of social or political protest. The focus for this project was more on the critique and technology elements.

When thinking about a subject to create a disobedient piece towards, I tried to think locally, about something that directly impacts me. For all of the things I love about NYC, there are plenty of things I can’t stand about it, and after living here for three years my tolerance for the MTA has decreased exponentially.

Screen Shot 2019-02-10 at 6.10.50 PM.png

This being so, Adi and I decided to create an object that would be a disruption to the MTA. We wanted the piece to only be noticeable and cause a disturbance when trains do not run on schedule. Out of this came the MTA Service Disruptor.

The idea behind the device is that someone protesting the MTA would take the piece and plant it in a secure location facing the train tracks. The device would then stay silent as long as trains were running on the correct schedule. As soon as the trains began to run off schedule, the device would start making an irritating beeping sound: the longer the wait until the next train, the louder and more obnoxious it gets, until the trains return to their normal schedule.

The concept is that the Service Disruptor will force people to begin to discuss what’s going on and keep the issue relevant for as long as necessary. The goal of the piece is not to solve the issues with the NYC subway system but to keep the topic relevant and make New Yorkers more hostile towards the heads of the MTA.

IMG_20190210_163409.jpg

The Service Disruptor uses a basic circuit composed of an ultrasonic sensor, a basic speaker, and a small Arduino microcontroller. The small form factor and simplicity of the piece make it ideal for producing many of them and planting them at different stations all over the city.

IMG_20190210_181757.jpg

Code

#include <toneAC.h>

int trigPin = 5;
int echoPin = 6;

void setup() {
    Serial.begin(9600);
    pinMode(trigPin, OUTPUT);
    pinMode(echoPin, INPUT);
}

int rangeThreshold = 5000;
unsigned long lastSubwayTime = 0;
int expectedTrainSchedule = 5000; // ms

void loop() {
    float rangeDuration;
    unsigned long now = millis();

    // send a short trigger pulse (2 µs low, then 10 µs high) to start the rangefinder
    digitalWrite(trigPin, LOW);
    delayMicroseconds(2);
    digitalWrite(trigPin, HIGH);
    delayMicroseconds(10);
    digitalWrite(trigPin, LOW);

    // read from the rangefinder
    rangeDuration = pulseIn(echoPin, HIGH);

    // occasionally print sensor value
    if (now % 3 == 0) {
        Serial.print("d: ");
        Serial.println(rangeDuration);
    }


    if (rangeDuration < rangeThreshold) {
        lastSubwayTime = now;
    } else if (now - lastSubwayTime > expectedTrainSchedule) {
        playNotes(10, 500);
    }
}

int notes[2] = { 800, 400 };
unsigned long lastNoteTime = 0;

void playNotes(int volume, int duration) {
    int i = 0;
    while (i < 2) {
        unsigned long now = millis();
        if (now - lastNoteTime > duration) {
            lastNoteTime = now;
            toneAC(notes[i], volume, duration);
            i++;
        }
    }
}

Creating an Image out of physical pixels

For the first assignment we were required to create an image using “physical pixels," preferably not using a computer or other digital technologies.

When I began thinking about this assignment I started by thinking of Jackson Pollock-like painting and using a material that would allow for a more abstract image. From this idea I decided to use spices from my kitchen to make the image.

Below are the spices and final image I created.

edited.jpg

Using a teaspoon and glue I spread each spice out and then shook the paper out. After doing this a few times I realized that I didn’t really like the outcome of the piece and knew that I wanted to try something else.

I decided to try using Cheerios to trace over a preexisting image, with each single Cheerio acting as an individual pixel. I chose a vector image of Andy Warhol to trace over.

I honestly liked the way this image came out even less than the one I made with spices. Working with the Cheerios wasn’t easy, and I had to cut a lot of them to get a somewhat correct representation of the image. The fact that I had to cut the Cheerios takes away a bit from the idea that each represented a pixel, but oh well.

If nothing else this assignment reinforced the idea that it is difficult to make work one “pixel” at a time.

Assignment 1: Short Video Clip Repository

Process

For the first assignment we had to record 5-10 minutes of short video clips to use as a sample bank. Factors for us to think about were less about narrative and theme and more about visual elements like light, color, and shape. I come from a film photography background, and in that type of work I usually try to think about the visuals of an image I’m composing as well as theme and developing a narrative. Thinking in a different context proved challenging but interesting.

The videos I ended up taking were mostly clips from walking around my neighborhood in Bushwick, to work in Manhattan, and to and from ITP. During these walks I tried to pay attention to light, texture, and juxtaposition, both within a single video and between the other clips I had taken. Once I found a rhythm with the video-taking process, I tried to vary the clips in terms of scale of focus.

Clips

Below are a few of the clips I recorded:

Assignment 1: Creating a Web Server

Concept

The first assignment for Connected Devices required us to make a basic server using Node.js and Express. For the server I drew heavily on example code used in class last Tuesday and also used Tom Igoe’s four-line server example. Finally, the Express routing guide was extremely helpful.

For my server I decided to make a basic remote thermostat that lets a user control the temperature in their apartment as well as see or change the state of the thermostat, i.e. whether it is in the heat, cold, or off state.

The four REST endpoints I used are as follows:

1. /state lists the current state of the thermostat (heat, cold, off)
2. /state/(off,heat,cold) inputting any of the three states will change the current thermostat state
3. /temperature lists the current temperature inside the apartment
4. /temperature/(increase,decrease) will either increase or decrease the current temperature

Issues

My biggest issues with completing the assignment came from my lack of experience with HTML and from getting a better interface and more functionality out of the server. For example, I would like the user to be prompted to manually set the temperature if they like. As I learn more about Node.js/Express and interfacing, I know these types of issues will subside.

Code

Below is the code broken down into sections:

Here is a link to the Github Repo

//four line server example created by Tom Igoe
var express = require('express');
var server = express();
server.use('/',express.static('public'));
server.get('/temperature',temperature);
server.get('/temperature/:changeTemp',modifyTemp);
server.get('/state',checkState);
server.get('/state/:newState',changeThermState);
server.listen(8080);

//initial state of the thermostat
var thermostatState = 'off';
var thermTemp = 64;

/*
checkState is responsible for checking the current state of the thermostat (either heat/off/cold) and the current
temperature in Fahrenheit
*/
function checkState(request, response){
  response.send('the current state is: ' + thermostatState + ', at a temperature of ' + thermTemp + ' degrees Fahrenheit');
  response.end;
}

/*
changeThermState allows the user to manually change the state of the thermostat when they enter
one of the three preset states
*/
function changeThermState(request, response){
  var newThermState = request.params.newState;
  if(newThermState == 'off' || newThermState == 'cold' || newThermState == 'heat'){
    thermostatState = newThermState;
    response.send('the state has been set to ' + thermostatState);
    }
    else{
      response.send('please only input "off/cold/heat"');
    }
    response.end;
}
/*
temperature sets a random temperature value for the output depending on the current state.
*/
function temperature(request, response){
  var temp;
  if(thermostatState == 'heat'){
    temp = Math.floor(Math.random()*(80-70+1)+70);   // random reading in the heating range
  }
  else if(thermostatState == 'cold'){
    temp = Math.floor(Math.random()*(65-55+1)+55);   // random reading in the cooling range
  }
  else{
    temp = 64;                                       // default reading when the thermostat is off
  }
  thermTemp = temp;
  response.send('the current temperature inside the apartment is: ' + temp + ' degrees Fahrenheit');
  response.end;

}
/*
modifyTemp allows the user to either increase or decrease the temperature depending
on their preferences
*/
function modifyTemp(request, response){
  var tempChange = request.params.changeTemp;

    if(tempChange == 'increase'){
    thermTemp++;
    response.send('the temperature has been increased to ' + thermTemp);
    }

    else if(tempChange == 'decrease'){
      thermTemp--;
      response.send('the temperature has been decreased to ' + thermTemp);
    }
    else{
      response.send('Please input either "increase/decrease"');

    }
    response.end;

}