Nature of Code – Simulation Project

The Web Fire Bird
The Story of the Fire Bird

The Fire Bird is a mythology that I started creating during my first semester at ITP, first as a physical computing project. Now I’m creating a net-art version of the narrative. The story of the Fire Bird is divided into three movements:

1 – The Fire Bird flies away from home because their house was set on fire by someone who couldn’t understand the way the Fire Bird was living;

2 – The Fire Bird flies to the city, alone, carrying a fire seed inside their belly, and finds a place to sleep;

3 – The Fire Bird dreams for so long that the invisible seed inside their belly grows and turns into a tree.

For this sketch, I’m focusing on the part of the narrative in which they fly away from home to the city.

References

The story of the Fire Bird is based mainly on the environmental disasters caused by fires in the Amazon Rainforest in 2019, with the aim of keeping their memory alive and constantly questioning the actions that led to the growth of the fires that year, such as the considerable increase in deforestation. I’m interested here in how other species are affected by human culture and its catastrophes during the Anthropocene.

This work is also strongly influenced by things I learned working together with indigenous communities in Brazil, especially the philosophy of the indigenous Guarani people, which holds that words have souls. That’s why their word for “throat” is ahy’o, but also ñe’e raity, which literally means “nest of the word-soul”. These indigenous people are mostly based in the southeast region of Brazil and some areas of Paraguay and are constantly struggling to preserve their culture and the territory they have left.

Based on the Guarani idea of the throat as a nest for word-souls, the Brazilian psychoanalyst and philosopher Suely Rolnik, with whom I took classes in the past, describes a “familiar-strange” body-mind state in which we sometimes find ourselves, especially when we are about to germinate the seed of a “new world”, a “world-to-come”, after untying the knots that we carry in our nest-throats. She explains how the creation of these new worlds has its own time, and how the nests need to be taken care of while the colonial-capitalistic system tries to kill this process.

Screenshot of the net art work

I understand the Web Fire Bird as a simulation of this concept and psychological state as well: a visual poem, or a digital metaphor. After being affected by a colonial trauma, the bird flies away from home carrying the seed of new worlds in their throat. That’s why their landscape is composed of words that constantly change their combinations in order to tell their own story. The landscape is under transformation and construction, like the course of a river, trying to stay alive while constantly organizing itself in search of a configuration – or a place to germinate.

Code

Find the editable p5.js sketch here and the presentation mode here.

The code was created based on examples by Daniel Shiffman from the Nature of Code classes, with his help during office hours. I basically mixed three examples, changing details and the design: the “Vehicle” class seeking a target, “Particle Systems” with gravity applied, and “Flow Fields” (which change every time the mouse is clicked).
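The core of the Vehicle example is Craig Reynolds’ “seek” behavior: the desired velocity points at the target, and the steering force is desired minus current velocity. A minimal sketch of that logic in plain JavaScript, stripped of the p5.js drawing calls (the vector helpers and numbers are illustrative, not the sketch’s actual code):

```javascript
// Sketch of the "seek" steering behavior used by the Vehicle class.
// Vectors are plain {x, y} objects; p5.Vector would do this in p5.js.
const maxSpeed = 4;
const maxForce = 0.1;

function setMag(v, m) {
  const len = Math.hypot(v.x, v.y) || 1;
  return { x: (v.x / len) * m, y: (v.y / len) * m };
}

function limit(v, max) {
  return Math.hypot(v.x, v.y) > max ? setMag(v, max) : v;
}

function seek(vehicle, target) {
  // desired velocity points straight at the target, at full speed
  const desired = setMag(
    { x: target.x - vehicle.pos.x, y: target.y - vehicle.pos.y },
    maxSpeed
  );
  // steering force = desired minus current velocity, capped at maxForce
  return limit(
    { x: desired.x - vehicle.vel.x, y: desired.y - vehicle.vel.y },
    maxForce
  );
}

function step(vehicle, target) {
  const force = seek(vehicle, target);
  vehicle.vel = limit(
    { x: vehicle.vel.x + force.x, y: vehicle.vel.y + force.y },
    maxSpeed
  );
  vehicle.pos = {
    x: vehicle.pos.x + vehicle.vel.x,
    y: vehicle.pos.y + vehicle.vel.y,
  };
}

// Example: a vehicle starting at rest, chasing a fixed target
const v = { pos: { x: 0, y: 0 }, vel: { x: 0, y: 0 } };
for (let i = 0; i < 200; i++) step(v, { x: 100, y: 50 });
```

In the actual sketch the target is the mouse position, updated every frame, which is what makes the bird follow the cursor.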

Result

The result is my first net art piece! : ) It is an interactive generative poem.

During its process of creation, I was invited by the curator Guilherme Brandão to be part of the online residency Jururu Club, which is one of the Brazilian pavilions of the fourth edition of The Wrong Biennale – “a global online & offline event aiming to nurture digital culture by engaging with a vast selection of curated artworks”.

For Jururu Club, I created a Portuguese version of it.

Thanks to Dan for all the help and to Gabriel for giving me a web-space to share my work : )

For future versions of this work, I plan to enrich the interactions and walk towards a more gaming atmosphere.

Choreographic Interventions: Circular Pathways

This assignment was made in partnership with Sarah Yasmine, from Columbia University. We created a 90-second choreography interacting with circular pathways coded in p5.js, to be projected on the floor while we perform. Find the code here.

Choreography design

Nature of Code – Particles Systems

The Web Fire Bird

For this assignment, I’ve started creating a web version of the Fire Bird mythology, which I’ve been working on since last semester. I want to find an interesting and compelling way of telling the story in the web browser using text, visual effects, and simple interactivity. I’m creating this as an experimental art piece for the browser, but it can also be seen as a first step towards designing a game based on this character’s story. Find the code here.

The triangles represent fire seeds thrown by the Fire Bird

The story of the Fire Bird is based on recent environmental disasters caused by fires in the Amazon Rainforest and has three movements: the Fire Bird flies away from home because their house was set on fire by someone who couldn’t understand the way the Fire Bird was living; the Fire Bird flies to the city, alone, carrying a fire seed inside their belly, and finds a place to sleep; the Fire Bird dreams for so long that the invisible seed inside their belly grows and turns into a tree. For this sketch, I’m focusing on the part of the narrative in which they fly away from home to the city.

The Fire Bird is moved by the mouse position in the canvas

The code has an image that moves and changes direction according to the mouse position (the cursor is hidden with “noCursor();”), using a “Vehicle” class. When you click the mouse, a rotating triangle flies away from the center of the image and is pulled by a gravity force until it disappears below the bottom of the canvas. The “ParticleSystem” class pushes a new particle into an initially empty array (particles), removes it once the particle is “dead”, and applies the force to all particles. The death is caused by a lifespan variable inside the Particle class, which acts as a countdown timer: it has an initial value, and when it reaches 0, the particle is dead. Another important point is that I connect the position of the image with the position from which the triangles are thrown using the line “system.addParticle(v.position.x, v.position.y);”. The “addParticle” function lives inside the “ParticleSystem” class, and its “x” and “y” arguments feed the position vector in the Particle class constructor, which is itself driven by the Vehicle class.
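Stripped of the p5.js drawing calls, the lifecycle described above can be sketched in plain JavaScript (the class names follow the post; the velocity and lifespan numbers are illustrative):

```javascript
// Sketch of the ParticleSystem lifecycle: each particle falls under
// gravity and dies when its lifespan counts down to zero.
const gravity = { x: 0, y: 0.1 };

class Particle {
  constructor(x, y) {
    this.position = { x, y };          // starts where the vehicle is
    this.velocity = { x: Math.random() * 2 - 1, y: -2 };
    this.lifespan = 255;               // countdown timer
  }
  applyForce(f) {
    this.velocity.x += f.x;
    this.velocity.y += f.y;
  }
  update() {
    this.position.x += this.velocity.x;
    this.position.y += this.velocity.y;
    this.lifespan -= 2;                // tick down a little each frame
  }
  isDead() {
    return this.lifespan <= 0;
  }
}

class ParticleSystem {
  constructor() {
    this.particles = [];
  }
  addParticle(x, y) {
    this.particles.push(new Particle(x, y));
  }
  run() {
    // iterate backwards so removing a particle doesn't skip the next one
    for (let i = this.particles.length - 1; i >= 0; i--) {
      const p = this.particles[i];
      p.applyForce(gravity);
      p.update();
      if (p.isDead()) this.particles.splice(i, 1);
    }
  }
}

// Example: emit a particle from a fixed position for the first ten
// "frames", mimicking ten mouse clicks, then let them all die off.
const system = new ParticleSystem();
for (let frame = 0; frame < 300; frame++) {
  if (frame < 10) system.addParticle(200, 50);
  system.run();
}
```

In the real sketch the emit position is `v.position`, so the triangles always leave from wherever the bird currently is.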

The next step is creating the landscape for the sketch and finding an interesting design for writing the story of the Fire Bird as part of it. I’ve been thinking about drawing the landscape with moving text… I have started trying a different behavior for the seed when it hits the ground (which still has to be created) in this code.

Experiments in AR – Case Study

4th Wall AR App

4th wall is defined as “a free, augmented reality (AR) public art platform exploring resistance and inclusive creative expression”. It was created by the mixed media artist Nancy Baker Cahill in partnership with app developer Drive Studio.

Nancy Baker Cahill starts with her dimensional drawings on paper, finishes them in VR, and ends up translating them into AR and sharing them through 4th Wall, so the user “can create their own context and content with the works, locating them anywhere in the world”. In this version of the app, there are four of her AR artworks and another page called Coordinates, a “curatorial and site-specific AR public art project” where Baker installs geo-located artworks in public spaces all over the world, especially addressing political issues such as frontiers.

I discovered this app while searching for augmented reality examples in the contemporary art world. Although AR is becoming more common in this context, with big international artists such as Marina Abramović exploring it, I was expecting to find more cases than I actually did. In general, I feel that the most common approach for contemporary artists is to translate into AR a work that could be done in another medium, rather than reflecting on the medium’s unique potential from a more metalinguistic and critically engaged perspective.

I ended up choosing 4th Wall because its creator takes a declared political approach to the project. In Nancy’s words, “4th Wall serves as a way to collectively share an augmented experience without expensive or inaccessible VR headsets and technology”. Thus, it breaks the fourth wall of the traditional art space, such as a gallery, and allows everybody who has a cellphone to access its content from anywhere. Indeed, I’ve been thinking a lot about this power of accessing and creating AR with relatively cheap tools, especially after hearing this podcast, in which Zach Lieberman and Molmol Kuo point to that as an interesting peculiarity of the medium. On the other hand, who will learn that an app such as 4th Wall exists, and how? The app is accessible, but is the knowledge of its existence accessible enough?

I find it interesting that it is built as its own app, because that allows a unique user experience that better matches its goals. If it were on Instagram, for instance, more people would have access to it, but the user experience would be limited to the logic of another platform, not necessarily designed with the same intentions. How public or free is Instagram (considering, for example, the amount of compulsory advertising in it and the data collected from users) compared to the 4th Wall app?

Video found in the project’s website

The experience is artistic, and I found it interesting to experience her drawings in different spaces and interact with them, entering them, kind of dancing the visualization of the drawing. Somehow you embody the body of the artist when visualizing it, and you can create different compositions with the space, which kind of makes you part of the process of creation. I guess this project is more focused on the art community as a target audience, because it has a more abstract approach, although it provides an experience that can be easily enjoyed by anyone slightly interested in art and tech.

Although the interface of the app is simple and does not look highly elaborated, it has all the functions it needs and works pretty well: you can move the object with the touchscreen, take a picture, or make a video. I haven’t tried the geo-located works yet, but I feel like doing it, and the app encourages you with a simple arrow pointing the way to the closest artwork and easy access to a map with its exact location. The technical sophistication of geolocating the works and installing them anywhere in the world really makes me feel excited, although I have heard from people who have worked with it that it is not precise, so it works better at large scales, where you don’t need the location to be accurate.

“Is AR breaking down walls?” The creator of the 4th Wall app answers yes, enthusiastically. I agree, but not completely. It has great potential and can be cheap to create, especially if designed for mobile, but it guarantees access only to those who have a phone connected to the internet (which is not a global reality, considering the capitalistic catastrophes of poverty production). It also does not really open access to canonized art spaces; it creates other kinds of spaces, which have another kind of potential. At the same time, it can still interfere in interesting ways inside those spaces.

Electronic Rituals, Oracles_Meditation #1

Tempo Temple

For this meditation, I am activating a perspective where I see dancing and drawing as a way of thinking and as a way of producing and embodying knowledge.

One of the results points to the possibility of drawing masks

Tempo temple is the first prototype for a video-performance where I try to draw a temple inside a temple which is inside another temple which is inside another temple and so on…

Tempo meaning: the rate or speed of motion or activity; pace.

Time.

if

the colonizer neoliberal capitalistic project is enchanting our times in order to build a giant world-wide magic circle where there is the illusion of space for just one temple,

Is it possible to play a game inside that circle that builds a temple inside

a temple, a white temple, an invisible temple, the white snake temple, a dancing and drawing temple that

creates a temporary space where an encounter modifies our perception of time enough that we can see it from outside?

is it possible to do it alone? am I alone when doing it?

when drawing a line with a random dance inside a machine learning temple, which forces are part of the code? What am I embodying? What am I touching, considering all its layers of creation and intention?

and which forces would never enter this temple? If the temple were empty, what would be inside of it?

Random dance resulting in a tree

The video was created by interacting with a webcam connected to p5.js using PoseNet. It was based on an example by Daniel Shiffman called “PoseNet example using p5.js to draw with your nose”. I basically swapped the nose for both wrists and drew lines instead of circles. Find the code here.
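PoseNet returns each pose as an array of named keypoints, so the wrist swap comes down to filtering for the right parts and connecting successive positions with lines. A minimal sketch of that step in plain JavaScript (the pose objects below are hand-written stand-ins for ml5’s output; function names are illustrative):

```javascript
// Pick the wrist keypoints out of a PoseNet pose. The pose shape follows
// PoseNet's output: { keypoints: [{ part, position: {x, y}, score }] }.
function getWrists(pose, minScore = 0.5) {
  const wrists = {};
  for (const kp of pose.keypoints) {
    if ((kp.part === "leftWrist" || kp.part === "rightWrist") &&
        kp.score >= minScore) {
      wrists[kp.part] = { x: kp.position.x, y: kp.position.y };
    }
  }
  return wrists;
}

// Keep the previous frame's positions so each frame draws a segment
// from where the wrist was to where it is now (line() instead of circle()).
function makeSegments(prev, curr) {
  const segments = [];
  for (const part of Object.keys(curr)) {
    if (prev[part]) segments.push({ from: prev[part], to: curr[part] });
  }
  return segments;
}

// Example with two hand-written frames:
const frame1 = { keypoints: [
  { part: "leftWrist",  position: { x: 100, y: 200 }, score: 0.9 },
  { part: "rightWrist", position: { x: 300, y: 210 }, score: 0.8 },
  { part: "nose",       position: { x: 200, y: 100 }, score: 0.95 },
]};
const frame2 = { keypoints: [
  { part: "leftWrist",  position: { x: 110, y: 190 }, score: 0.9 },
  { part: "rightWrist", position: { x: 290, y: 220 }, score: 0.2 }, // low confidence
]};
const segments = makeSegments(getWrists(frame1), getWrists(frame2));
```

The score threshold keeps jittery, low-confidence detections from scribbling over the drawing.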

For the future of the project, I’d like to try drawing a line that disappears after some time; think about a better setting, especially for the background; and find a way of creating it in high resolution.

Soft Robotics – Cable Experiments, Morph, Materials research

Cable Experiments

Straw, thread and beads
curtain experiment
multiple straws spider experiment

References

Here is a list of interesting references that I found during my research this week with examples of projects in the field of soft robotics.

Artist Onyx Ashanti

Soft opportunities in a Toy Store

For my soft materials research, I went to FAO Schwarz at Rockefeller Center, because I thought I would find the biggest variety of toys there. In general, I was disappointed with the diversity of soft materials in the store; most of them could be found in teddy bears made of the same materials. Unfortunately, I could not find any inflatable or pneumatic systems, or even floating objects, in which I’m interested.

List of materials that I found:

  • Polyester fiber was the most common material in the store, for sure, found in teddy bears, dolls’ clothes, etc. It feels really comfortable and cozy to touch, so it can be used in projects involving human or pet interaction that need to generate those feelings.
  • Polyurethane foam, found in a sponge. It’s not that soft, but it can be used in projects that need a squishy compression effect. It can also float on water, which can be interesting.
  • Gummy, made of a mixture of syrups, water, gelatin, and other substances. It’s interesting to think about the possibility of working with edibles in the universe of robots, like moving food.
  • Magic snow? – I really don’t know what it was made of, and it was not written on the package. It felt really good to touch, but it would be hard to build anything from it because it doesn’t have enough consistency to hold shapes and forms.
  • The deer horn is made of a polymer whose name I also don’t know. It was not really soft, but it had some give when you pressed it.
On my way back home, I found softness being used in advertising, referring to it both as a texture and as a human behavior

Nature of Code – Random Walks

random river-drawing program

As the first assignment for the Nature of Code class, I explored Random Walks. Ideally, I wanted to create a 3D random walk visualization similar to this one. However, I couldn’t find the problem in my code, even working together with my fellows from the Coding Lab. I really intend to focus on creating 3D graphics in this class, although I feel I still need to keep learning the basics of coding, so I’m figuring out the best strategies to combine both intentions.

Having failed in the 3D attempt, I went back to my first experiments with 2D random walks and tried to develop them further, looking for ways to create a drawing program with interesting aesthetic results. I first realized that random walks that start at the top of the canvas and finish at the bottom create a visual effect that reminds me of the movement of water touching the sand at the beach. I also figured out how to write the code drawing lines instead of points, to create a more continuous effect. The results are pretty abstract, which interests me. My first version of this sketch can be found here.
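The top-to-bottom walk can be sketched like this in plain JavaScript (in the p5.js sketch each consecutive pair of points would become a `line()` call; the step sizes are illustrative):

```javascript
// Sketch of a top-to-bottom 2D random walk recorded as a list of points.
// Drawing a line between each consecutive pair, instead of plotting
// single points, gives the continuous river-like stroke.
function riverWalk(width, height, xStep, yStep) {
  let x = Math.random() * width; // start somewhere along the top edge
  let y = 0;
  const points = [{ x, y }];
  while (y < height) {
    x += (Math.random() * 2 - 1) * xStep; // drift randomly left or right
    y += Math.random() * yStep;           // always move downward
    x = Math.min(Math.max(x, 0), width);  // stay inside the canvas
    points.push({ x, y });
  }
  return points; // in p5.js: line(p[i].x, p[i].y, p[i+1].x, p[i+1].y)
}

const path = riverWalk(400, 600, 6, 4);
```

Constraining the vertical step to always be positive is what guarantees every walk eventually reaches the bottom of the canvas.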

Later, I started playing with colors using the same variables I had used to draw the line of the random walk. This made me realize that I wanted more control over the color and the shape of the line while it was being drawn.

To do so, I decided to work with four different sliders, so I can change the hue and brightness of the lines while they are being drawn, as well as how much I want the line to vary on the X and Y axes. This final code can be found here, and some results can be seen below. I tried to meditate on contemporary ecological issues as a conceptual background while playing with the drawing program. Two main events came to mind: the recent oil spill on the northeast Brazilian coast and the fires in the Amazon Rainforest.
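Hooking the sliders up comes down to reading each one every frame and mapping its raw value into the range the parameter needs. A minimal sketch of that mapping in plain JavaScript (the 0-255 slider range and the output ranges are illustrative assumptions, and `map` mirrors p5.js’s built-in of the same name):

```javascript
// Linearly map a value from one range to another, like p5.js's map().
function map(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Read hypothetical slider values (0-255) into the four drawing
// parameters: HSB hue and brightness, plus the max step on each axis.
function readControls(sliders) {
  return {
    hue:        map(sliders.hue, 0, 255, 0, 360),        // HSB hue in degrees
    brightness: map(sliders.brightness, 0, 255, 0, 100), // HSB brightness in %
    xStep:      map(sliders.xVariation, 0, 255, 0, 10),  // max horizontal drift
    yStep:      map(sliders.yVariation, 0, 255, 0, 10),  // max vertical step
  };
}

// Example: all sliders near their midpoints
const controls = readControls({
  hue: 128, brightness: 128, xVariation: 128, yVariation: 128,
});
```

Because the mapping runs every frame, nudging a slider changes the stroke mid-drawing, which is what makes the color and the wobble of the line performable.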

Final Pcomp Project

For my final Physical Computing assignment, I created gloves that generate different light, sound, and animation effects according to the gestures of the performer who embodies the Fire Bird to tell their story. For more details about the storytelling part of the project, click here.

Video by Tundi Szasz
Different LED effects coordinated with hand gestures

Find below my Arduino Code:

//The Fire Bird

//This code was created during Fall 2019 at ITP NYU by Fernando Gregório


//p5.js code: https://editor.p5js.org/fernandogregor-io/sketches/2wKt5wr64
//Main sources used to create the code:
//For the Serial Communication with the Arduino IMU:
//https://itp.nyu.edu/physcomp/labs/lab-serial-imu-output-to-p5-js/
//For the Sound Frequency Modulation 
//https://p5js.org/examples/sound-frequency-modulation.html
//For the LED's: Maxwell da Silva (link github)
//Coding assistance: Maxwell da Silva


#include <Arduino_LSM6DS3.h>
#include <MadgwickAHRS.h>

// initialize a Madgwick filter:
Madgwick filter;
// sensor's sample rate is fixed at 104 Hz:
const float sensorRate = 104.00;

// values for orientation:
float roll = 0.0;
float pitch = 0.0;
float heading = 0.0;



// A basic everyday NeoPixel strip test program.

// NEOPIXEL BEST PRACTICES for most reliable operation:
// - Add 1000 uF CAPACITOR between NeoPixel strip's + and - connections.
// - MINIMIZE WIRING LENGTH between microcontroller board and first pixel.
// - NeoPixel strip's DATA-IN should pass through a 300-500 OHM RESISTOR.
// - AVOID connecting NeoPixels on a LIVE CIRCUIT. If you must, ALWAYS
//   connect GROUND (-) first, then +, then data.
// - When using a 3.3V microcontroller with a 5V-powered NeoPixel strip,
//   a LOGIC-LEVEL CONVERTER on the data line is STRONGLY RECOMMENDED.
// (Skipping these may work OK on your workbench but can fail in the field)
#define TIMECTL_MAXTICKS  4294967295L
#define TIMECTL_INIT      0
long time;
unsigned long flashTimeMark = 0;
unsigned long flashTimeMark2 = 0;
long interval = 2000;
int periode = 2000;
long previousMillis = 0;
int alpha;
#include <Adafruit_NeoPixel.h>
#ifdef __AVR__
#include <avr/power.h> // Required for 16 MHz Adafruit Trinket
#endif

// Which pin on the Arduino is connected to the NeoPixels?
// On a Trinket or Gemma we suggest changing this to 1:
#define LED_PIN    6

// How many NeoPixels are attached to the Arduino?
#define LED_COUNT 4

// Declare our NeoPixel strip object:
Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);
// Argument 1 = Number of pixels in NeoPixel strip
// Argument 2 = Arduino pin number (most are valid)
// Argument 3 = Pixel type flags, add together as needed:
//   NEO_KHZ800  800 KHz bitstream (most NeoPixel products w/WS2812 LEDs)
//   NEO_KHZ400  400 KHz (classic 'v1' (not v2) FLORA pixels, WS2811 drivers)
//   NEO_GRB     Pixels are wired for GRB bitstream (most NeoPixel products)
//   NEO_RGB     Pixels are wired for RGB bitstream (v1 FLORA pixels, not v2)
//   NEO_RGBW    Pixels are wired for RGBW bitstream (NeoPixel RGBW products)


// setup() function -- runs once at startup --------------------------------
const int indexSwitch = 2; //define the normally opened switch activated by the index finger in the PIN 2
const int thumbSwitch = 3; //define the normally opened switch activated by the thumb finger in the PIN 3






void setup() {
  Serial.begin(9600);

  pinMode(indexSwitch, INPUT); //define the PIN as an INPUT
  pinMode(thumbSwitch, INPUT); //define the PIN as an INPUT

  // These lines are specifically to support the Adafruit Trinket 5V 16 MHz.
  // Any other board, you can remove this part (but no harm leaving it):
#if defined(__AVR_ATtiny85__) && (F_CPU == 16000000)
  clock_prescale_set(clock_div_1);
#endif
  // END of Trinket-specific code.

  strip.begin();           // INITIALIZE NeoPixel strip object (REQUIRED)
  strip.show();            // Turn OFF all pixels ASAP
  strip.setBrightness(20); // Set BRIGHTNESS to about 1/5 (max = 255)


  
//  // attempt to start the IMU:
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU");
    // stop here if you can't access the IMU:
    while (true);
  }
  // start the filter to run at the sample rate:
  filter.begin(sensorRate);
}


void loop() {
  // values for acceleration & rotation:
  float xAcc, yAcc, zAcc;
  float xGyro, yGyro, zGyro;

  // check if the IMU is ready to read:
  if (IMU.accelerationAvailable() &&
      IMU.gyroscopeAvailable()) {
    // read accelerometer & gyrometer:
    IMU.readAcceleration(xAcc, yAcc, zAcc);
    IMU.readGyroscope(xGyro, yGyro, zGyro);

    // update the filter, which computes orientation:
    filter.updateIMU(xGyro, yGyro, zGyro, xAcc, yAcc, zAcc);

    // print the heading, pitch and roll
    roll = filter.getRoll();
    pitch = filter.getPitch();
    heading = filter.getYaw();
  }

  // if you get a byte in the serial port,
  // send the latest heading, pitch, and roll:
  if (Serial.available()) {
    Serial.read();   // consume the incoming byte; its value isn't used
    Serial.print(heading);
    Serial.print(",");
    Serial.print(pitch);
    Serial.print(",");
    Serial.print(roll);
    Serial.print(",");
  
    if (digitalRead(indexSwitch) == HIGH && digitalRead(thumbSwitch) == HIGH) { //if both switches are pressed
      fire();
      Serial.print(4); //write the number 4 in the serial monitor
  
    } else if (digitalRead(indexSwitch) == HIGH) { //if the index finger button is pressed
      tree();
     Serial.print(2); //write the number two in the serial monitor
  
    } else if (digitalRead(thumbSwitch) == HIGH) {
      bird();
      Serial.print(3); //write the number 3 in the serial monitor
    } else {
      blink();//CHANGE FOR RAINBOW EFFECT;
      Serial.print(0); //write zero in the serial monitor
    }
    Serial.println();
  }
  //delay(10);
}

void blink() {
  // default effect: solid white (to be replaced by a rainbow effect)
  uint32_t white = strip.Color(255, 255, 255);
  strip.fill(white);
  strip.show();
}

void tree() {
  // tree gesture: solid green
  uint32_t green = strip.Color(0, 200, 0);
  strip.fill(green);
  strip.show();
}

void bird() {
  // bird gesture: solid blue
  uint32_t blue = strip.Color(0, 0, 200);
  strip.fill(blue);
  strip.show();
}

void fire() {
  // fire gesture: solid red
  uint32_t red = strip.Color(200, 0, 0);
  strip.fill(red);
  strip.show();
}

  
// Light the pixels from start to _end one at a time, waiting 3 ms
// between each (blocks until the whole range has been filled).
void setPixelColorByRange(int start, int _end, int r, int g, int b) {
  while (start < _end) {
    if (waitTime(&flashTimeMark, 3)) {
      strip.setPixelColor(start, strip.Color(r, g, b));
      start++;
    }
  }
  strip.show();
}




void fadePixels(int time, int r, int g, int b) {
  alpha = 128 + 127 * cos(2 * PI / periode * time);
  tintPixels(r, g, b, alpha);
}

void tintPixels(int r, int g, int b, int a) {
  strip.setBrightness(a);
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    uint32_t c = strip.Color(r, g, b);
    strip.setPixelColor(i, c);
  }
  strip.show();
}

float linearTween (float t, float b, float c, float d) {
  return c * t / d + b;
}

float easeInOutSine (float t, float b, float c, float d) {
  return -c / 2 * (cos(M_PI * t / d) - 1) + b;
}

float easeInBounce (float t, float b, float c, float d) {
  return c - easeOutBounce (d - t, 0, c, d) + b;
}

float easeOutBounce (float t, float b, float c, float d) {
  if ((t /= d) < (1 / 2.75)) {
    return c * (7.5625 * t * t) + b;
  } else if (t < (2 / 2.75)) {
    return c * (7.5625 * (t -= (1.5 / 2.75)) * t + .75) + b;
  } else if (t < (2.5 / 2.75)) {
    return c * (7.5625 * (t -= (2.25 / 2.75)) * t + .9375) + b;
  } else {
    return c * (7.5625 * (t -= (2.625 / 2.75)) * t + .984375) + b;
  }
}

float easeInOutBounce (float t, float b, float c, float d) {
  if (t < d / 2) {
    return easeInBounce (t * 2, 0, c, d) * .5 + b;
  }
  return easeOutBounce (t * 2 - d, 0, c, d) * .5 + c * .5 + b;
}

int waitTime(unsigned long *timeMark, unsigned long timeInterval) {
  unsigned long timeCurrent;
  unsigned long timeElapsed;
  int result = false;
  timeCurrent = millis();
  if (timeCurrent < *timeMark) {
    timeElapsed = (TIMECTL_MAXTICKS - *timeMark) + timeCurrent;
  } else {
    timeElapsed = timeCurrent - *timeMark;
  }
  if (timeElapsed >= timeInterval) {
    *timeMark = timeCurrent;
    result = true;
  }
  return (result);
}

For more details about the code and the serial communication, click here.
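On the p5.js side, each serial line from the Arduino sketch above arrives as comma-separated text, “heading,pitch,roll,gesture”. A minimal sketch of the parsing step in plain JavaScript (the function name is illustrative, not the project’s actual code):

```javascript
// Parse one serial line from the glove. The Arduino sends
// "heading,pitch,roll,gesture", where gesture is 0 (no button),
// 2 (index finger), 3 (thumb), or 4 (both pressed).
function parseGloveLine(line) {
  const parts = line.trim().split(",");
  if (parts.length < 4) return null; // incomplete line: ignore it
  const [heading, pitch, roll] = parts.slice(0, 3).map(Number);
  const gesture = parseInt(parts[3], 10);
  if ([heading, pitch, roll].some(Number.isNaN) || Number.isNaN(gesture)) {
    return null; // garbled line: ignore it
  }
  return { heading, pitch, roll, gesture };
}

// Example line, shaped like the Arduino sketch's output:
const reading = parseGloveLine("181.23,-12.50,3.75,4");
```

Returning `null` for short or garbled lines matters in practice, because serial reads often deliver a partial line right after the connection opens.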

User Interaction

User interaction indicating the possibility of creating a game with the device

After showing the project to different people who could try the gloves, I understood that it has interesting potential as a cross-media narrative involving performance and game.

The connections of the LEDs all broke during the Winter Show, which indicates the need to rethink how to include interactive light in the gloves.

Animation – Augmented Reality

The Flying XRiver

For this assignment, we were tasked with creating an augmented reality experiment focused on the concept rather than on a final result. I worked together with Maxwell da Silva, and we created the Flying XRiver: a 3D sculpture reflecting on concepts such as the Anthropocene and the flying rivers of the Amazon Rainforest.

Base Concepts

Anthropocene

The current geological age viewed as the period during which human activity has been the dominant influence on climate and the environment.

Geological ages of the Earth
The Flying Rivers
Amazon’s Rainforest Flying Rivers

The flying rivers are movements of large quantities of water vapor transported through the atmosphere from the Amazon Basin to other parts of South America. The forest’s trees release water vapor into the atmosphere through transpiration, and this moisture is deposited in other localities in the form of precipitation, forming a virtual river. This movement is essential for climate regulation and humidity balance in several parts of the world. Unfortunately, deforestation is increasing the number of fires in the Amazon Rainforest, which is making the flying rivers carry toxic smoke, especially to the southeast of Brazil.

Process of creation

First, we found a 3D hand model at Free3D.com and downloaded it. Then we imported it into Photoshop as a 3D object, used the brush tool to draw the river on the palm of the hand, and exported it as a texture/material, as seen below.

Texture/Material for the hands. The blue lines were created on Photoshop.

The next step was putting together the hands in Unity in the shape of a river and then adding the river and the smoke layers.

Connecting the hands in Unity

Results

AR Sculpture

As a result, we have an AR sculpture that can be installed on flat surfaces such as floors, tables, or ceilings, letting you immerse yourself in the Flying XRiver. The hands create a metaphor for our responsibility when touching natural resources. At the same time that we have the river in our hands, our hands are floating on the river, so the way we interfere in its dynamics affects the way we move, behave, and relate to other organisms.

The hands can also be seen as a prototype for future interactions with the piece, in which we can explore social AR experiments where people will be able to manipulate the position of the hands, generating new collective structures and creating new landscapes and objects on the theme of forests.