Final Pcomp Project

For my final Physical Computing assignment, I created gloves that generate different light, sound, and animation effects according to the gestures of the performer who embodies the Fire Bird to tell their story. For more details about the storytelling part of the project, click here.

Video by Tundi Szasz
Different LED effects coordinated with hand gestures

Find my Arduino code below:

//The Fire Bird

//This code was created during Fall 2019 at ITP NYU by Fernando Gregório


//p5.js code: https://editor.p5js.org/fernandogregor-io/sketches/2wKt5wr64
//Main sources used to create the code:
//For the Serial Communication with the Arduino IMU:
//https://itp.nyu.edu/physcomp/labs/lab-serial-imu-output-to-p5-js/
//For the Sound Frequency Modulation 
//https://p5js.org/examples/sound-frequency-modulation.html
//For the LED's: Maxwell da Silva (link github)
//Coding assistance: Maxwell da Silva


#include <Arduino_LSM6DS3.h>
#include <MadgwickAHRS.h>

// initialize a Madgwick filter:
Madgwick filter;
// sensor's sample rate is fixed at 104 Hz:
const float sensorRate = 104.00;

// values for orientation:
float roll = 0.0;
float pitch = 0.0;
float heading = 0.0;



// A basic everyday NeoPixel strip test program.

// NEOPIXEL BEST PRACTICES for most reliable operation:
// - Add 1000 uF CAPACITOR between NeoPixel strip's + and - connections.
// - MINIMIZE WIRING LENGTH between microcontroller board and first pixel.
// - NeoPixel strip's DATA-IN should pass through a 300-500 OHM RESISTOR.
// - AVOID connecting NeoPixels on a LIVE CIRCUIT. If you must, ALWAYS
//   connect GROUND (-) first, then +, then data.
// - When using a 3.3V microcontroller with a 5V-powered NeoPixel strip,
//   a LOGIC-LEVEL CONVERTER on the data line is STRONGLY RECOMMENDED.
// (Skipping these may work OK on your workbench but can fail in the field)
#define TIMECTL_MAXTICKS  4294967295L
#define TIMECTL_INIT      0
long time;
unsigned long flashTimeMark = 0;
unsigned long flashTimeMark2 = 0;
long interval = 2000;
int periode = 2000;
long previousMillis = 0;
int alpha;
#include <Adafruit_NeoPixel.h>
#ifdef __AVR__
#include <avr/power.h> // Required for 16 MHz Adafruit Trinket
#endif

// Which pin on the Arduino is connected to the NeoPixels?
// On a Trinket or Gemma we suggest changing this to 1:
#define LED_PIN    6

// How many NeoPixels are attached to the Arduino?
#define LED_COUNT 4

// Declare our NeoPixel strip object:
Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);
// Argument 1 = Number of pixels in NeoPixel strip
// Argument 2 = Arduino pin number (most are valid)
// Argument 3 = Pixel type flags, add together as needed:
//   NEO_KHZ800  800 KHz bitstream (most NeoPixel products w/WS2812 LEDs)
//   NEO_KHZ400  400 KHz (classic 'v1' (not v2) FLORA pixels, WS2811 drivers)
//   NEO_GRB     Pixels are wired for GRB bitstream (most NeoPixel products)
//   NEO_RGB     Pixels are wired for RGB bitstream (v1 FLORA pixels, not v2)
//   NEO_RGBW    Pixels are wired for RGBW bitstream (NeoPixel RGBW products)


// setup() function -- runs once at startup --------------------------------
const int indexSwitch = 2; //define the normally opened switch activated by the index finger in the PIN 2
const int thumbSwitch = 3; //define the normally opened switch activated by the thumb finger in the PIN 3






void setup() {
  Serial.begin(9600);

  pinMode(indexSwitch, INPUT); //define the PIN as an INPUT
  pinMode(thumbSwitch, INPUT); //define the PIN as an INPUT

  // These lines are specifically to support the Adafruit Trinket 5V 16 MHz.
  // Any other board, you can remove this part (but no harm leaving it):
#if defined(__AVR_ATtiny85__) && (F_CPU == 16000000)
  clock_prescale_set(clock_div_1);
#endif
  // END of Trinket-specific code.

  strip.begin();           // INITIALIZE NeoPixel strip object (REQUIRED)
  strip.show();            // Turn OFF all pixels ASAP
  strip.setBrightness(20); // Set BRIGHTNESS to about 1/5 (max = 255)


  
  // attempt to start the IMU:
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU");
    // stop here if you can't access the IMU:
    while (true);
  }
  // start the filter to run at the sample rate:
  filter.begin(sensorRate);
}


void loop() {
  // values for acceleration & rotation:
  float xAcc, yAcc, zAcc;
  float xGyro, yGyro, zGyro;

  // check if the IMU is ready to read:
  if (IMU.accelerationAvailable() &&
      IMU.gyroscopeAvailable()) {
    // read accelerometer & gyrometer:
    IMU.readAcceleration(xAcc, yAcc, zAcc);
    IMU.readGyroscope(xGyro, yGyro, zGyro);

    // update the filter, which computes orientation:
    filter.updateIMU(xGyro, yGyro, zGyro, xAcc, yAcc, zAcc);

    // print the heading, pitch and roll
    roll = filter.getRoll();
    pitch = filter.getPitch();
    heading = filter.getYaw();
  }

  // if you get a byte in the serial port,
  // send the latest heading, pitch, and roll:
  if (Serial.available()) {
    char input = Serial.read();   
    Serial.print(heading);
    Serial.print(",");
    Serial.print(pitch);
    Serial.print(",");
    Serial.print(roll);
    Serial.print(",");
  
    if (digitalRead(indexSwitch) == HIGH && digitalRead(thumbSwitch) == HIGH) { //if both switches are pressed
      fire();
      Serial.print(4); //write the number 4 in the serial monitor
  
    } else if (digitalRead(indexSwitch) == HIGH) { //if the index finger button is pressed
      tree();
     Serial.print(2); //write the number two in the serial monitor
  
    } else if (digitalRead(thumbSwitch) == HIGH) {
      bird();
      Serial.print(3); //write the number 3 in the serial monitor
    } else {
      blink();//CHANGE FOR RAINBOW EFFECT;
      Serial.print(0); //write zero in the serial monitor
    }
    Serial.println();
  }
  //delay(10);
}

void blink() {
  // Default state: fills the strip with white (placeholder until the rainbow effect is added).
  uint32_t white = strip.Color(255, 255, 255);
  strip.fill(white);
  strip.show();
}

void tree() {
  uint32_t green = strip.Color(0, 200, 0);
  strip.fill(green);
  strip.show();
}

void bird() {
  uint32_t blue = strip.Color(0, 0, 200);
  strip.fill(blue);
  strip.show();
}



void fire() {
  uint32_t red = strip.Color(200, 0, 0); // renamed from "blue": this is the red fire color
  strip.fill(red);
  strip.show();
}

  
// Colors pixels start..(_end - 1) one at a time, pacing the writes with waitTime(), then shows the strip once at the end.
void setPixelColorByRange(int start, int _end, int r, int g, int b) {
  while (start < _end) {
    if (waitTime(&flashTimeMark, 3)) {
      strip.setPixelColor(start, strip.Color(r, g, b));
      
      start++;
    }
  }
  strip.show();
}




void fadePixels(int time, int r, int g, int b) {
  alpha = 128 + 127 * cos(2 * PI / periode * time);
  tintPixels(r, g, b, alpha);
}

void tintPixels(int r, int g, int b, int a) {
  strip.setBrightness(a);
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    uint32_t c = strip.Color(r, g, b);
    strip.setPixelColor(i, c);
  }
  strip.show();
}

float linearTween (float t, float b, float c, float d) {
  return c * t / d + b;
}

float easeInOutSine (float t, float b, float c, float d) {
  return -c / 2 * (cos(M_PI * t / d) - 1) + b;
}

float easeInBounce (float t, float b, float c, float d) {
  return c - easeOutBounce (d - t, 0, c, d) + b;
}

float easeOutBounce (float t, float b, float c, float d) {
  if ((t /= d) < (1 / 2.75)) {
    return c * (7.5625 * t * t) + b;
  } else if (t < (2 / 2.75)) {
    return c * (7.5625 * (t -= (1.5 / 2.75)) * t + .75) + b;
  } else if (t < (2.5 / 2.75)) {
    return c * (7.5625 * (t -= (2.25 / 2.75)) * t + .9375) + b;
  } else {
    return c * (7.5625 * (t -= (2.625 / 2.75)) * t + .984375) + b;
  }
}

float easeInOutBounce (float t, float b, float c, float d) {
  if (t < d / 2) {
    return easeInBounce (t * 2, 0, c, d) * .5 + b;
  }
  return easeOutBounce (t * 2 - d, 0, c, d) * .5 + c * .5 + b;
}

// Non-blocking timer: returns true once timeInterval milliseconds have passed since *timeMark
// (handling millis() rollover), and resets *timeMark when it fires.
int waitTime(unsigned long *timeMark, unsigned long timeInterval) {
  unsigned long timeCurrent;
  unsigned long timeElapsed;
  int result = false;
  timeCurrent = millis();
  if (timeCurrent < *timeMark) {
    timeElapsed = (TIMECTL_MAXTICKS - *timeMark) + timeCurrent;
  } else {
    timeElapsed = timeCurrent - *timeMark;
  }
  if (timeElapsed >= timeInterval) {
    *timeMark = timeCurrent;
    result = true;
  }
  return (result);
}

For more details about the code and the serial communication, click here.

User Interaction

User interaction indicating the possibility of creating a game with the device

After showing the project to different people who could try the glove, I realized that it has interesting potential as a cross-media narrative involving performance and game.

The LED connections all broke during the Winter Show, which indicates the need to rethink how to integrate interactive lighting into the gloves.

animation – Augmented Reality

The Flying XRiver

For this assignment, we were tasked to create an augmented reality experiment, focused on the concept rather than a final result. I worked together with Maxwell da Silva, and we created the Flying XRiver – a 3D sculpture reflecting on concepts such as the Anthropocene and the Flying Rivers of the Amazon Rainforest.

Base Concepts

Anthropocene

The current geological age viewed as the period during which human activity has been the dominant influence on climate and the environment.

Geological ages of the Earth
The Flying Rivers
Amazon’s Rainforest Flying Rivers

The flying rivers are a movement of large quantities of water vapor transported in the atmosphere from the Amazon Basin to other parts of South America. The forest trees release water vapor into the atmosphere through transpiration, and this moisture is deposited in other localities in the form of precipitation, forming a virtual river. This movement is essential for climate regulation and humidity balance in several parts of the world. Unfortunately, deforestation is increasing the number of fires in the Amazon Rainforest, which ends up making the flying rivers transport toxic smoke, especially to the southeast of Brazil.

Process of creation

First, we found a 3D hand model at Free3D.com and downloaded it. We then imported it into Photoshop as a 3D object, used the brush tool to draw the river on the palm of the hand, and exported it as a texture/material, as seen below.

Texture/Material for the hands. The blue lines were created in Photoshop.

The next step was arranging the hands in Unity in the shape of a river and then adding the river and smoke layers.

Connecting the hands in Unity

Results

AR Sculpture

As a result, we have an AR sculpture that can be installed on flat surfaces such as floors, tables, or ceilings, letting you immerse yourself in the Flying XRiver. The hands create a metaphor for our responsibility when touching natural resources. At the same time that we have the river in our hands, our hands are floating on the river, so the way we interfere in its dynamics affects the way we move, behave, and relate to other organisms.

The hands can also be seen as a prototype for future interactions with the piece, in which we can explore social AR experiments where people will be able to manipulate the position of the hands, generating new collective structures and creating new landscapes and objects on the theme of forests.

ICM Final

The first sound of the Fire Bird’s Eye

For my final assignment for Introduction to Computational Media, I added a sound layer to the gloves I designed for my Introduction to Physical Computing class. The gloves are intended for a performing arts environment, where I'm going to embody a character called the Fire Bird to tell a story. They have two switches that are activated by touching the index fingers or the thumbs of both hands together, producing different LED effects that change according to the different combinations of the switches.
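
As implemented in the Arduino code above, the switch combinations map to the LED effects (and to the gesture number sent over serial) as follows:

  • Both switches pressed: fire effect (sends 4);
  • Index switch only: tree effect (sends 2);
  • Thumb switch only: bird effect (sends 3);
  • Neither switch: the default white fill (sends 0).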

Design of the Gloves made with reflective fabric

For the ICM part of it, I'm connecting my Arduino using p5.serialcontrol to create the serial communication with the p5.js online editor. The built-in IMU (Inertial Measurement Unit, composed of an accelerometer and a gyroscope) of the Arduino Nano 33 IoT collects three kinds of movement data from the gloves: roll, pitch, and heading (see the image below to visualize the difference between them), which are the rotations that a 3D body can make in space, and sends them to my p5 sketch in a range of 0 to 360.

Data collected by the accelerometer/gyroscope of the Arduino. Yaw can also be called Heading.
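
For reference, each time the p5 sketch sends a byte over the serial port, the Arduino code above answers with one comma-separated line containing heading, pitch, roll, and the gesture code (0, 2, 3 or 4). The values below are just illustrative:

215.87,12.43,3.08,2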

This range is mapped to change the frequency and the amplitude of the carrier wave in a Frequency Modulation system in its simplest form, generating different sounds in real time. It also moves a 3D object that I designed in p5.js. Initially, it was supposed to look like an eye, but now I feel that it became a spaceship.

For the sound, we basically have two oscillators. One is the carrier, and the other is the modulator. The carrier has a Base Frequency that is modified by the modulator, which operates in a range of amplitude and frequency chosen by me.
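
As a rough sketch of that relationship (not my actual p5.js code; the function, names, and values below are illustrative, written in Arduino-style C++ to match the code above), the modulator's output is simply added to the carrier's base frequency:

// Conceptual sketch only; every constant here is illustrative rather than the value actually used.
// modFreq and modAmp would come from the mapped IMU data; t is time in seconds.
float fmCarrierFrequency(float modFreq, float modAmp, float t) {
  const float carrierBaseFreq = 220.0;                     // illustrative carrier base frequency (Hz)
  float modulator = modAmp * sin(2.0 * PI * modFreq * t);  // modulator oscillator output at time t
  return carrierBaseFreq + modulator;                      // simplest FM: the base frequency pushed up and down by the modulator
}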

Find my code here. To create it, I basically adapted and combined two pieces of code. The first, used to create the serial communication with the Arduino IMU, was written by Tom Igoe and can be found here. The second is a sound frequency modulation example from the p5.js examples library. I really feel like studying Frequency Modulation more deeply and getting a better understanding of how matrices work in code.

Still of my sketch/animation

The Fire Bird

Body experimentation wearing the interactive gloves. Video by Tundi Szász.

The Fire Bird is a first experiment in building characters that explore interactive wearable technologies. This is a work in progress. I'm imagining a performing arts environment where the performer will be able to control different elements of the space by making different gestures.

To compose the Fire Bird so far, I've created gloves that generate different light, sound, and animation effects according to the hand gestures and movements of the performer who embodies the Fire Bird to tell their story. Basically, while wearing the gloves, the performer can change the direction of a 3D object on the screen and the frequency and amplitude of its sound. When they make gestures related to the narrative of the Fire Bird (Fire, Bird, or Tree), different buttons made with conductive fabric sewn to the gloves are activated, changing the color of the gloves' LEDs and the color and size of the seed (the 3D object) on the screen. The seed is inside the belly of the Fire Bird, so whoever wears the gloves is embodying the character and flying or dreaming while carrying the seed.

The LED effects of the glove change according to gestures

Process of Creation

The work was made during Fall 2019 at ITP, mainly in three classes. For more details, click on each part of the following description. In Intro to Fabrication, I fabricated the gloves and the acrylic mask of the Fire Bird. For my Introduction to Computational Media final, I created the p5.js code with the 3D object and the frequency modulation sound system coordinated with the movements of the hands. This is made possible through serial communication with the Inertial Measurement Unit built into the gloves' Arduino Nano 33 IoT, whose code and circuit were developed during the Introduction to Physical Computing classes.

If you would like to know more about the project, please contact me: fernando.gregorio@nyu.edu

The story of the Fire Bird

The story of the Fire Bird is based on three cells of movement. The idea is to develop it further choreographically, through text, and possibly in the format of a game.

Movement 1

The Fire Bird flies away from home because their house was put on fire by someone who couldn’t understand the way the Fire Bird was living.

Movement 2

The Fire Bird flies to the city, alone, and finds a place to sleep.

Movement 3

The Fire Bird dreams for so long that the invisible seed that was inside their belly grows and turns into a tree.

Questions to keep moving

What’s the seed that is inside of the belly of the Fire Bird made of?

What does the tree that grows from the belly of the Fire Bird look like?

fabrication – week 6

The fire flower eye motor

For the motor-mounting assignment, I created a simple sculpture inspired by the character I've been developing this semester: the Fire Bird. The initial idea was to work with hand shapes that spin, creating an illusion of fire. But it turns out that the DC motor I used stalls very easily under extra weight, so I simplified the idea and used an acrylic eye that I had instead. It still creates a fire-flower effect when it spins, which turned out to be an interesting aesthetic experiment for me.

Initial Idea

The hardest part was positioning the hole in the enclosure so that it forms a perfect angle with the motor, doesn't stall because of friction, and spins without shaking the box.

Movement test

Materials

  • Zip ties;
  • 3mm Shaft / Axle;
  • Black ABS Project box;
  • Euro terminal Strip pin;
  • 9V Battery;
  • Hobby DC motor;
  • Switch;
  • Wires.

Final Result

Final Result

fabrication – week 5

two materials assignment

This week we were tasked to work with two different materials. I've been venturing into the wearables world, so I took the opportunity to learn how to use the sewing machine and understand how to connect wires with conductive thread or fabric. The goal was to create gloves with LEDs that change their color effects depending on the gesture you make while wearing them.

Materials:

  • Conductive Fabric;
  • Conductive Thread;
  • Perf board;
  • Silicon wires;
  • Conductive tape;
  • Thread;
  • Reflective Fabric;
  • Stretch Grey Fabric;
  • Neopixels;

Pcomp Final Project

The fire bird light and sound glove

Glove’s palm diagram
Back-of-the-hand diagram and details of the finger buttons

storytelling bases for choreography

Movement 1: The fire bird flies to the city.

Movement 2: The fire bird finds a place to die and dies.

Movement 3: The invisible seed that was inside of the belly of the fire bird germinates.

Movement 4: Zoom into the seed and enter it. What is the seed made of?

Movement 5: The seed turns into a tree. What does this tree look like?

Tests/Visuals

Materials

  • Arduino Nano 33 IOT;
  • Portable Battery Pack;
  • Battery for Arduino (Lithium?!);
  • Micro USB – USB Cable;
  • Portable speakers with Bluetooth connection;
  • Gloves;
    • Conductive Fabric;
    • Conductive Thread;
    • Reflective Fabric;
  • Silicon wires;
  • Computer with p5.js;

Next Steps

  • Finish Code;
    • Create each LED effect;
      • Initial;
      • Fire;
      • Bird;
      • Tree;
    • Create sound effects in p5.js connected to the accelerometer and gyroscope data, in different versions for each LED effect;
  • Build the gloves;
    • Draw the gloves in the fabric;
    • Sew Neopixels to the glove;
    • Build an enclosure to wear the Arduino Nano on my hand;
  • Work on portable version;
    • Create bluetooth connection;
    • Find the best battery to power it;

animation – after effects

Meet.me

Final Result

Meet.me was created by me together with Wen Chen. We started by imagining a future in which we'll be able to build ourselves and design body parts in the real world with advanced augmented reality technologies before going on a casual date. We also talked about how dating apps change our behavior in society and modify our subjectivity, especially as gay people, who are a big target audience for this industry.

Process

References

fabrication – week 4

Wearable acrylic enclosure

This week we were tasked to create an enclosure. I made a simple experiment for a wearable using acrylic cut on the laser cutter and standoffs. The idea is to wear the Arduino Nano 33 IoT, which has a built-in accelerometer and gyroscope, to collect movement data from my hands, which will be used to create sound and visual effects with a computer.

fabrication – week 3

Laser Cutter hand-eye mask

This week we were tasked to work with the laser cutter. I’m always thinking about the performing arts context, so I designed a mask for the character I’ve been developing this semester: the Fire Bird.

I bought two pieces of acrylic at Canal Plastic: transparent orange and teal. The most challenging part of this assignment for me was working with Illustrator for the first time. The laser cutter is a big discovery for me, and I'm pretty happy with the mask design – I got great feedback as well. I want to improve the design, make it more reliable and comfortable for the face, and maybe develop a series of masks/glasses.