Electronic Rituals, Oracles – Meditation #1

Tempo Temple

For this meditation, I am activating a perspective in which I see dancing and drawing as ways of thinking and as ways of producing and embodying knowledge.

One of the results points to the possibility of drawing masks

Tempo Temple is the first prototype for a video performance where I try to draw a temple inside a temple, which is inside another temple, which is inside another temple, and so on…

Tempo meaning: the rate or speed of motion or activity; pace.

Time.

if

the colonizing, neoliberal, capitalist project is enchanting our times in order to build a giant worldwide magic circle where there is the illusion of space for just one temple,

is it possible to play a game inside that circle that builds a temple inside

a temple, a white temple, an invisible temple, the white snake temple, a dancing and drawing temple that

creates a temporary space where an encounter modifies our perception of time enough that we can see it from the outside?

is it possible to do it alone? am I alone when doing it?

when drawing a line with a random dance inside a machine learning temple, which forces are part of the code? What am I embodying? What am I touching when I consider all its layers of creation and intention?

and which forces would never be inside this temple? If the temple were empty, what would be inside it?

Random dance resulting in a tree

The video was created by interacting with a webcam connected to p5.js using PoseNet. It was based on an example by Daniel Shiffman called “PoseNet example using p5.js to draw with your nose”. I basically replaced the nose with both wrists and drew lines instead of circles. Find the code here.
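As a rough illustration of that change, here is a minimal sketch of the idea, assuming the ml5.js PoseNet wrapper used in Shiffman's example (the variable names are my own, not the linked code):

let video, pose;
let prevLeft, prevRight;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide();
  // load PoseNet and keep the most recent pose estimation
  const poseNet = ml5.poseNet(video);
  poseNet.on('pose', (poses) => {
    if (poses.length > 0) pose = poses[0].pose;
  });
  background(0);
}

function draw() {
  if (!pose) return;
  stroke(255);
  strokeWeight(2);
  // draw a segment from each wrist's previous position to its current one
  if (prevLeft) line(prevLeft.x, prevLeft.y, pose.leftWrist.x, pose.leftWrist.y);
  if (prevRight) line(prevRight.x, prevRight.y, pose.rightWrist.x, pose.rightWrist.y);
  prevLeft = { x: pose.leftWrist.x, y: pose.leftWrist.y };
  prevRight = { x: pose.rightWrist.x, y: pose.rightWrist.y };
}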

For the future of the project, I’d like to try to draw a line that disappears after some time; think about a better setting, especially for the background; and find a way of creating it in high resolution.

Soft Robotics – Cable Experiments, Morph, Materials research

Cable Experiments

Straw, thread, and beads
Curtain experiment
Multiple straws spider experiment

References

Here is a list of interesting references that I found during my research this week, with examples of projects in the field of soft robotics.

Artist Onyx Ashanti

Soft opportunities in a Toy Store

For my soft materials research, I went to FAO Schwarz at Rockefeller Center, because I thought that there I would find the biggest variety of toys. In general, I was disappointed by the diversity of soft materials in the store. Most of them could be found in teddy bears made of the same materials. Unfortunately, I could not find any inflatable or pneumatic systems, or even floating objects, which I’m interested in.

List of materials that I found:

  • Polyester fiber was the most common material in the store, for sure, found in teddy bears, dolls’ clothes, etc. It feels really comfortable and cozy to the touch, so it can be used in projects involving human or pet interaction that need to generate those feelings.
  • Polyurethane foam, found in a sponge. It’s not that soft, but it can be used in projects that need a squishy compression effect. It can also float on water, which can be interesting.
  • Gummy, made of a mixture of syrups, water, gelatin, and other substances. It’s interesting to think about the possibility of working with edibles in the universe of robots, like moving food.
  • Magic snow? – I really don’t know what it was made of, and it was not written on the package. It felt really good to touch, but it would be hard to create anything from it because it doesn’t have enough consistency to hold shapes and forms.
  • The deer horn is made of a polymer whose name I also don’t know. It was not really soft, but it had some give when you pressed it.
On my way back home, I found softness being used in advertising, referring to it both as a texture and as a human behavior.

Nature of Code – Random Walks

random river-drawing program

As the first assignment for the Nature of Code class, I explored Random Walks. Ideally, I wanted to create a 3D random walk visualization similar to this one. However, I couldn’t find the problem in my code, even after working on it with my fellow students from the Coding Lab. I really intend to focus on creating 3D graphics in this class, although I feel I still need to keep learning the basics of coding, so I’m figuring out the best strategy to combine both intentions.

Having failed in the 3D attempt, I went back to my first experiments with 2D random walks and tried to develop them further, looking for ways to turn them into a drawing program with interesting aesthetic results. I first noticed that random walks that start at the top of the canvas and finish at the bottom create a visual effect that reminds me of water touching the sand at the beach. I also worked out a way of drawing lines instead of points, to create a more continuous effect. The results are pretty abstract – which interests me. My first version of this sketch can be found here.
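A minimal sketch of that approach in plain p5.js (a reconstruction of the idea, not the linked code): the walker starts at the top, steps with a downward bias, and draws short line segments for a continuous trace:

let x, y;

function setup() {
  createCanvas(400, 400);
  background(255);
  resetWalker();
}

function resetWalker() {
  // each walk starts somewhere along the top edge
  x = random(width);
  y = 0;
}

function draw() {
  // take a random step, biased downward so the walk crosses the canvas
  const newX = x + random(-2, 2);
  const newY = y + random(0, 2);
  stroke(0, 50);
  line(x, y, newX, newY); // a segment instead of a point
  x = newX;
  y = newY;
  if (y > height) resetWalker(); // reached the bottom: start a new walk
}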

Later, I started playing with colors using the same variables I had used to draw the line of the random walk. This made me realize that I wanted more control over the color and the shape of the line while it was being drawn.

To do so, I decided to work with four different sliders, so I can change the hue and the brightness of the lines while they are being drawn, as well as how much I want the line to vary along the X and Y axes. This final code can be found here, and some results can be seen below. I tried to meditate on contemporary ecological issues as a conceptual background while playing with the drawing program. Two main events came to mind: the recent oil spill along the northeast Brazilian coast and the fires in the Amazon Rainforest.
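A sketch of the slider setup, again as a reconstruction rather than the exact final code; it assumes p5’s HSB color mode, with four sliders for hue, brightness, and the step range on each axis:

let hueSlider, brightSlider, stepXSlider, stepYSlider;
let x, y;

function setup() {
  createCanvas(400, 400);
  colorMode(HSB, 360, 100, 100);
  background(0, 0, 100);
  hueSlider = createSlider(0, 360, 200);   // hue of the line
  brightSlider = createSlider(0, 100, 80); // brightness of the line
  stepXSlider = createSlider(0, 10, 2);    // how much the line varies in X
  stepYSlider = createSlider(0, 10, 2);    // how much the line varies in Y
  x = random(width);
  y = 0;
}

function draw() {
  const newX = x + random(-stepXSlider.value(), stepXSlider.value());
  const newY = y + random(0, stepYSlider.value());
  stroke(hueSlider.value(), 80, brightSlider.value());
  line(x, y, newX, newY);
  x = newX;
  y = newY;
  if (y > height) { x = random(width); y = 0; } // restart at the top
}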

Final Pcomp Project

For my final Physical Computing assignment, I’ve created gloves that generate different light, sound, and animation effects according to the gestures of the performer who embodies the Fire Bird to tell their story. For more details about the storytelling part of the project, click here.

Video by Tundi Szasz
Different LED effects coordinated with hand gestures

Find below my Arduino Code:

//The Fire Bird

//This code was created during Fall 2019 at ITP NYU by Fernando Gregório


//p5.js code: https://editor.p5js.org/fernandogregor-io/sketches/2wKt5wr64
//Main sources used to create the code:
//For the Serial Communication with the Arduino IMU:
//https://itp.nyu.edu/physcomp/labs/lab-serial-imu-output-to-p5-js/
//For the Sound Frequency Modulation 
//https://p5js.org/examples/sound-frequency-modulation.html
//For the LED's: Maxwell da Silva (link github)
//Coding assistance: Maxwell da Silva


#include <Arduino_LSM6DS3.h>
#include <MadgwickAHRS.h>

// initialize a Madgwick filter:
Madgwick filter;
// sensor's sample rate is fixed at 104 Hz:
const float sensorRate = 104.00;

// values for orientation:
float roll = 0.0;
float pitch = 0.0;
float heading = 0.0;



// NeoPixel code adapted from Adafruit's strandtest example.

// NEOPIXEL BEST PRACTICES for most reliable operation:
// - Add 1000 uF CAPACITOR between NeoPixel strip's + and - connections.
// - MINIMIZE WIRING LENGTH between microcontroller board and first pixel.
// - NeoPixel strip's DATA-IN should pass through a 300-500 OHM RESISTOR.
// - AVOID connecting NeoPixels on a LIVE CIRCUIT. If you must, ALWAYS
//   connect GROUND (-) first, then +, then data.
// - When using a 3.3V microcontroller with a 5V-powered NeoPixel strip,
//   a LOGIC-LEVEL CONVERTER on the data line is STRONGLY RECOMMENDED.
// (Skipping these may work OK on your workbench but can fail in the field)
#define TIMECTL_MAXTICKS  4294967295L  // maximum value of millis() before rollover
#define TIMECTL_INIT      0

unsigned long flashTimeMark = 0; // time marker for the non-blocking waitTime() timer
int periode = 2000;              // period of the brightness fade cycle, in ms
int alpha;                       // brightness value computed by the fade effect
#include <Adafruit_NeoPixel.h>
#ifdef __AVR__
#include <avr/power.h> // Required for 16 MHz Adafruit Trinket
#endif

// Which pin on the Arduino is connected to the NeoPixels?
// On a Trinket or Gemma we suggest changing this to 1:
#define LED_PIN    6

// How many NeoPixels are attached to the Arduino?
#define LED_COUNT 4

// Declare our NeoPixel strip object:
Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);
// Argument 1 = Number of pixels in NeoPixel strip
// Argument 2 = Arduino pin number (most are valid)
// Argument 3 = Pixel type flags, add together as needed:
//   NEO_KHZ800  800 KHz bitstream (most NeoPixel products w/WS2812 LEDs)
//   NEO_KHZ400  400 KHz (classic 'v1' (not v2) FLORA pixels, WS2811 drivers)
//   NEO_GRB     Pixels are wired for GRB bitstream (most NeoPixel products)
//   NEO_RGB     Pixels are wired for RGB bitstream (v1 FLORA pixels, not v2)
//   NEO_RGBW    Pixels are wired for RGBW bitstream (NeoPixel RGBW products)


// setup() function -- runs once at startup --------------------------------
const int indexSwitch = 2; // normally open switch activated by the index finger, on pin 2
const int thumbSwitch = 3; // normally open switch activated by the thumb, on pin 3






void setup() {
  Serial.begin(9600);

  pinMode(indexSwitch, INPUT); // configure the pin as an input
  pinMode(thumbSwitch, INPUT); // configure the pin as an input

  // These lines are specifically to support the Adafruit Trinket 5V 16 MHz.
  // Any other board, you can remove this part (but no harm leaving it):
#if defined(__AVR_ATtiny85__) && (F_CPU == 16000000)
  clock_prescale_set(clock_div_1);
#endif
  // END of Trinket-specific code.

  strip.begin();           // INITIALIZE NeoPixel strip object (REQUIRED)
  strip.show();            // Turn OFF all pixels ASAP
  strip.setBrightness(20); // Set BRIGHTNESS to about 1/5 (max = 255)


  
  // attempt to start the IMU:
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU");
    // stop here if you can't access the IMU:
    while (true);
  }
  // start the filter to run at the sample rate:
  filter.begin(sensorRate);
}


void loop() {
  // values for acceleration & rotation:
  float xAcc, yAcc, zAcc;
  float xGyro, yGyro, zGyro;

  // check if the IMU is ready to read:
  if (IMU.accelerationAvailable() &&
      IMU.gyroscopeAvailable()) {
    // read accelerometer & gyrometer:
    IMU.readAcceleration(xAcc, yAcc, zAcc);
    IMU.readGyroscope(xGyro, yGyro, zGyro);

    // update the filter, which computes orientation:
    filter.updateIMU(xGyro, yGyro, zGyro, xAcc, yAcc, zAcc);

    // print the heading, pitch and roll
    roll = filter.getRoll();
    pitch = filter.getPitch();
    heading = filter.getYaw();
  }

  // if you get a byte in the serial port,
  // send the latest heading, pitch, and roll:
  if (Serial.available()) {
    Serial.read(); // consume the incoming byte; its value only triggers a response
    Serial.print(heading);
    Serial.print(",");
    Serial.print(pitch);
    Serial.print(",");
    Serial.print(roll);
    Serial.print(",");
  
    if (digitalRead(indexSwitch) == HIGH && digitalRead(thumbSwitch) == HIGH) { // both switches pressed
      fire();
      Serial.print(4); // send the number 4 over serial

    } else if (digitalRead(indexSwitch) == HIGH) { // index finger button pressed
      tree();
      Serial.print(2); // send the number 2 over serial

    } else if (digitalRead(thumbSwitch) == HIGH) { // thumb button pressed
      bird();
      Serial.print(3); // send the number 3 over serial
    } else {
      blink(); // CHANGE FOR RAINBOW EFFECT
      Serial.print(0); // send zero over serial
    }
    Serial.println();
  }
  //delay(10);
}

void blink() {
  // placeholder effect: fills the strip with white (to be replaced by a rainbow effect)
  uint32_t white = strip.Color(255, 255, 255);
  strip.fill(white);
  strip.show();
}

void tree() {
  uint32_t green = strip.Color(0, 200, 0);
  strip.fill(green);
  strip.show();
}

void bird() {
  uint32_t blue = strip.Color(0, 0, 200);
  strip.fill(blue);
  strip.show();
}



void fire() {
  uint32_t red = strip.Color(200, 0, 0);
  strip.fill(red);
  strip.show();
}

  
// light the pixels from start to _end one at a time, 3 ms apart
void setPixelColorByRange(int start, int _end, int r, int g, int b) {
  while (start < _end) {
    if (waitTime(&flashTimeMark, 3)) {
      strip.setPixelColor(start, strip.Color(r, g, b));
      
      start++;
    }
  }
  strip.show();
}




// fade the whole strip in and out with a cosine curve whose period is "periode" ms
void fadePixels(int time, int r, int g, int b) {
  alpha = 128 + 127 * cos(2 * PI / periode * time);
  tintPixels(r, g, b, alpha);
}

void tintPixels(int r, int g, int b, int a) {
  strip.setBrightness(a);
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    uint32_t c = strip.Color(r, g, b);
    strip.setPixelColor(i, c);
  }
  strip.show();
}

// Penner-style easing helpers (t = time, b = start value, c = change, d = duration)
float linearTween (float t, float b, float c, float d) {
  return c * t / d + b;
}

float easeInOutSine (float t, float b, float c, float d) {
  return -c / 2 * (cos(M_PI * t / d) - 1) + b;
}

float easeInBounce (float t, float b, float c, float d) {
  return c - easeOutBounce (d - t, 0, c, d) + b;
}

float easeOutBounce (float t, float b, float c, float d) {
  if ((t /= d) < (1 / 2.75)) {
    return c * (7.5625 * t * t) + b;
  } else if (t < (2 / 2.75)) {
    return c * (7.5625 * (t -= (1.5 / 2.75)) * t + .75) + b;
  } else if (t < (2.5 / 2.75)) {
    return c * (7.5625 * (t -= (2.25 / 2.75)) * t + .9375) + b;
  } else {
    return c * (7.5625 * (t -= (2.625 / 2.75)) * t + .984375) + b;
  }
}

float easeInOutBounce (float t, float b, float c, float d) {
  if (t < d / 2) {
    return easeInBounce (t * 2, 0, c, d) * .5 + b;
  }
  return easeOutBounce (t * 2 - d, 0, c, d) * .5 + c * .5 + b;
}

// non-blocking timer: returns true once timeInterval ms have elapsed since *timeMark,
// handling millis() rollover, and updates *timeMark when it fires
int waitTime(unsigned long *timeMark, unsigned long timeInterval) {
  unsigned long timeCurrent;
  unsigned long timeElapsed;
  int result = false;
  timeCurrent = millis();
  if (timeCurrent < *timeMark) {
    timeElapsed = (TIMECTL_MAXTICKS - *timeMark) + timeCurrent;
  } else {
    timeElapsed = timeCurrent - *timeMark;
  }
  if (timeElapsed >= timeInterval) {
    *timeMark = timeCurrent;
    result = true;
  }
  return (result);
}

For more details about the code and the serial communication, click here.

User Interaction

User interaction indicating the possibility of creating a game with the device

After showing the project to different people who tried the glove, I understood that it has interesting potential as a cross-media narrative involving performance and game.

The connections of the LEDs all broke during the Winter Show, which indicates the need to rethink how to integrate interactive lighting into the gloves.

animation – Augmented Reality

The Flying XRiver

For this assignment, we were tasked with creating an augmented reality experiment, focused on the concept rather than on a final result. I worked together with Maxwell da Silva, and we created the Flying XRiver – a 3D sculpture reflecting on concepts such as the Anthropocene and the flying rivers of the Amazon Rainforest.

Base Concepts

Anthropocene

The current geological age viewed as the period during which human activity has been the dominant influence on climate and the environment.

Geological ages of the Earth
The Flying Rivers
Amazon’s Rainforest Flying Rivers

The flying rivers are a movement of large quantities of water vapor transported through the atmosphere from the Amazon Basin to other parts of South America. The forest’s trees release water vapor into the atmosphere through transpiration, and this moisture is deposited in other places in the form of precipitation, forming a virtual river. This movement is essential for climate regulation and humidity balance in several parts of the world. Unfortunately, deforestation is increasing the number of fires in the Amazon Rainforest, which is making the flying rivers carry toxic smoke, especially toward the southeast of Brazil.

Process of creation

First, we found a 3D hand model at Free3D.com and downloaded it. Then we imported it into Photoshop as a 3D object, used the brush tool to draw the river on the palm of the hand, and exported it as a texture/material, as seen below.

Texture/Material for the hands. The blue lines were created on Photoshop.

The next step was putting the hands together in Unity in the shape of a river and then adding the river and the smoke layers.

Connecting the hands in Unity

Results

AR Sculpture

As a result, we have an AR sculpture that can be installed on flat surfaces such as floors, tables, or ceilings, letting you immerse yourself in the Flying XRiver. The hands create a metaphor for our responsibility when touching natural resources: at the same time that we have the river in our hands, our hands are floating on the river, so the way we interfere in its dynamics affects the way we move, behave, and relate to other organisms.

The hands can also be seen as a prototype for future interactions with the piece, in which we can explore social AR experiments where people will be able to manipulate the position of the hands, generating new collective structures and creating new landscapes and objects on the theme of forests.

ICM Final

The first sound of the Fire Bird’s Eye

For my final assignment for Introduction to Computational Media, I’ve added a sound layer to the gloves designed in my Introduction to Physical Computing class. The gloves are intended for a performing arts environment, where I’m going to embody a character called the Fire Bird to tell a story. They have two switches that are activated by touching the index fingers or the thumbs of both hands together, producing different LED effects that change according to different combinations of the switches.

Design of the Gloves made with reflective fabric

For the ICM part of it, I’m connecting my Arduino through p5.serialcontrol to create the serial communication with the p5.js web editor. The built-in IMU (Inertial Measurement Unit, composed of an accelerometer and a gyroscope) of the Arduino Nano 33 IoT collects three movement values from the gloves: roll, pitch, and heading (see the image below to visualize the difference between them), which are the rotations a 3D body can make in space, and sends them to my p5.js sketch in a range of 0 to 360.

Data collected by the accelerometer/gyroscope of the Arduino. Yaw can also be called Heading.
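On the p5.js side, the sketch reads those comma-separated values and answers with a byte, following the call-and-response pattern from the lab linked in the Arduino code. This is a simplified sketch of that exchange, assuming the p5.serialport library (with p5.serialcontrol running) and a placeholder port name:

let serial;
let heading = 0, pitch = 0, roll = 0, gesture = 0;

function setup() {
  createCanvas(400, 400, WEBGL);
  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem14101'); // port name depends on your machine
  serial.on('open', () => serial.write('x')); // ask for the first reading
  serial.on('data', serialEvent);
}

function serialEvent() {
  // the Arduino prints: heading,pitch,roll,gesture
  const line = serial.readStringUntil('\r\n');
  if (!line) return;
  const values = split(trim(line), ',');
  if (values.length >= 4) {
    heading = float(values[0]);
    pitch = float(values[1]);
    roll = float(values[2]);
    gesture = int(values[3]);
  }
  serial.write('x'); // request the next reading (call-and-response)
}

function draw() {
  background(0);
  // orient a stand-in 3D object with the glove's rotations
  rotateX(radians(pitch));
  rotateY(radians(heading));
  rotateZ(radians(roll));
  box(100);
}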

This range is being mapped to change the frequency and the amplitude of the modulating wave in a frequency modulation system in its simplest form, generating different sounds in real time. It also moves a 3D object that I designed in p5.js. Initially, it was supposed to look like an eye, but now I feel that it became a spaceship.

For the sound, we basically have two oscillators: one is the carrier and the other is the modulator. The carrier has a base frequency that is modified by the modulator, which operates in a range of amplitude and frequency chosen by me.
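In p5.sound that looks roughly like the sketch below, following the pattern of the frequency modulation example; the mappings from pitch and roll (0–360) are illustrative values, not the exact ranges I used:

let carrier, modulator;
let pitch = 0, roll = 0; // in the real sketch these come from the serial data

function setup() {
  createCanvas(400, 400);
  carrier = new p5.Oscillator('sine');
  carrier.freq(220); // base frequency of the carrier
  carrier.amp(0.5);
  carrier.start();

  modulator = new p5.Oscillator('triangle');
  modulator.disconnect(); // the modulator should not reach the speakers directly
  modulator.start();

  carrier.freq(modulator); // the modulator's output offsets the carrier's frequency
}

function draw() {
  // map the 0-360 sensor ranges to a modulation frequency and depth
  modulator.freq(map(pitch, 0, 360, 0.1, 112));
  modulator.amp(map(roll, 0, 360, 0, 150), 0.01);
}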

Find my code here. To create it, I basically adapted and combined two pieces of code. The first, used to create the serial communication with the Arduino IMU, was written by Tom Igoe and can be found here. The second is the sound frequency modulation example found in the p5.js examples library. I really feel like studying frequency modulation more deeply and also understanding better how matrices work in code.

Still of my sketch/animation

The Fire Bird

Body experimentation wearing the interactive gloves. Video by Tundi Szász.

The Fire Bird is a first experiment in building characters that explore interactive wearable technologies. This is a work in progress. I’m imagining a performing arts environment where the performer will be able to control different elements of the space by making different gestures.

To compose the Fire Bird so far, I’ve created gloves that generate different light, sound, and animation effects according to the hand gestures and movements of the performer who embodies the Fire Bird to tell their story. Basically, while wearing the gloves, the performer can change the direction of a 3D object on the screen as well as the frequency and the amplitude of its sound. When they make gestures related to the narrative of the Fire Bird (fire, bird, or tree), different buttons made with conductive fabric sewn onto the gloves are activated, changing the color of the gloves’ LEDs and the color and size of the seed (the 3D object) on the screen. The seed is inside the belly of the Fire Bird, so whoever wears the gloves is embodying the character, flying or dreaming while carrying the seed.

The LED effects of the glove change according to gestures

Process of Creation

The work was made during Fall 2019 at ITP, mainly in three classes. For more details, click on each part of the following description. In Intro to Fabrication, I fabricated the gloves and the acrylic mask of the Fire Bird. For my Introduction to Computational Media final, I created the p5.js code with the 3D object and the frequency modulation sound system coordinated with the movements of the hands. This is made possible by serial communication with the Inertial Measurement Unit built into the gloves’ Arduino Nano 33 IoT, whose code and circuit were developed during the Introduction to Physical Computing classes.

If you would like to know more about the project, please contact me: fernando.gregorio@nyu.edu

The story of the Fire Bird

The story of the Fire Bird is based on three cells of movement. The idea is to develop it further choreographically, through text, and possibly as a game.

Movement 1

The Fire Bird flies away from home because their house was set on fire by someone who couldn’t understand the way the Fire Bird was living.

Movement 2

The Fire Bird flies to the city, alone, and finds a place to sleep.

Movement 3

The Fire Bird dreams for so long that the invisible seed inside their belly grows and turns into a tree.

Questions to keep moving

What is the seed inside the Fire Bird’s belly made of?

What does the tree that grows from the belly of the Fire Bird look like?

fabrication – week 6

The fire flower eye motor

For the motor-mounting assignment, I created a simple sculpture inspired by the character I’ve been developing this semester: the Fire Bird. The initial idea was to work with hand shapes that spin, creating an illusion of fire. But it turns out that the DC motor I used stalls very easily under extra weight, so I simplified the idea and used an acrylic eye that I had instead. It still creates a fire-flower effect when it spins, which turned out to be an interesting aesthetic experiment for me.

Initial Idea

The hardest part was positioning the hole in the enclosure so that it lines up squarely with the motor shaft, so the eye doesn’t stall from friction and spins without shaking the box.

Movement test

Materials

  • Zip ties;
  • 3mm Shaft / Axle;
  • Black ABS Project box;
  • Euro terminal Strip pin;
  • 9V Battery;
  • Hobby DC motor;
  • Switch;
  • Wires.

Final Result

Final Result

fabrication – week 5

two materials assignment

This week we were tasked with working with two different materials. I’ve been venturing into the wearables world, so I took the opportunity to learn how to use the sewing machine and to understand how to connect wires with conductive thread or fabric. The goal was to create gloves with LEDs that change their color effects depending on the gesture you make while wearing them.

Materials:

  • Conductive Fabric;
  • Conductive Thread;
  • Perf board;
  • Silicon wires;
  • Conductive tape;
  • Thread;
  • Reflective Fabric;
  • Stretch Grey Fabric;
  • Neopixels.