Nature of Code – Particle Systems

The web Fire Bird

For this assignment, I’ve started creating a web version of the Fire Bird mythology that I’ve been working on since last semester. I want to find an interesting and compelling way of telling the story in the web browser using text, visual effects, and simple interactivity. I’m creating this as an experimental art piece for the browser, but it can also be seen as a first step towards designing a game based on this character’s story. Find the code here.

The Triangles represent fire seeds that are thrown by the Fire Bird

The story of the Fire Bird is based on recent environmental disasters caused by fires in the Amazon Rainforest and has three movements: the Fire Bird flies away from home because their house was put on fire by someone who couldn’t understand the way the Fire Bird was living; the Fire Bird flies to the city, alone, carrying a fire seed inside their belly and finds a place to sleep; the Fire Bird dreams for so long, that the invisible seed that was inside of their belly grows and turns into a tree. For this sketch, I’m focusing on the part of the narrative in which they fly away from home to the city.

The Fire Bird is moved by the mouse position in the canvas

The code has an image that moves and changes its direction according to the mouse position (the cursor is hidden by the function “noCursor();”) using a “Vehicle” class. When you click the mouse, a rotating triangle flies away from the center of the image and is affected by a gravity force until it disappears below the bottom of the canvas. The “ParticleSystem” class pushes a new particle into an empty array (particles), removes it after the particle is “dead”, and applies the force to all particles. The death is caused by a lifespan variable inside the Particle class, which acts as a countdown timer: it has an initial value, and when it reaches 0, the particle is dead. Another important point to remember is that I’m connecting the position of the image and the position from where the triangles are thrown with the line “system.addParticle(v.position.x, v.position.y);”. The “addParticle” function is inside the “ParticleSystem” class, and its “x” and “y” variables are connected to the position vector of the constructor of the Particle class, which itself is being affected by the Vehicle class.

The next step is creating the landscape for the sketch and finding an interesting design to write the story of the Fire Bird as part of it. I’ve been thinking about drawing the landscape with moving text… I have started to try a different behavior for the seed when it hits the ground (which has yet to be created) in this code.

Experiments in AR – Case Study

4th Wall AR App

4th Wall is defined as “a free, augmented reality (AR) public art platform exploring resistance and inclusive creative expression”. It was created by the mixed-media artist Nancy Baker Cahill in partnership with app developer Drive Studio.

Nancy Baker Cahill starts with her dimensional drawings on paper, finishes them in VR, and then translates them into AR, sharing them through 4th Wall so the user “can create their own context and content with the works, locating them anywhere in the world”. In this version of the app, there are 4 different versions of her AR artworks and another page called Coordinates, a “curatorial and site-specific AR public art project” in which Baker Cahill installs geo-located artworks in public spaces all over the world, especially addressing political issues such as borders.

I discovered this app while searching for augmented reality examples in the contemporary art world. Although AR is becoming more common in this context, with big international artists such as Marina Abramovic exploring it, I was expecting to find more cases than I actually did. In general, I feel that the most common move for contemporary artists is to translate into AR a work that could be done in another medium, rather than reflecting on what is unique to the medium with a more metalinguistic and critically engaged perspective.

I ended up choosing 4th Wall because its creator declares a political approach to the project. In Nancy’s words, “4th Wall serves as a way to collectively share an augmented experience without expensive or inaccessible VR headsets and technology”. Thus, it breaks the 4th wall of the traditional art space, such as a gallery, and allows everybody who has a cellphone to access its content from anywhere. Indeed, I’ve been thinking a lot about this power of accessing and creating AR with relatively cheap tools, especially after hearing this podcast, in which Zach Lieberman and Molmol Kuo point that out as an interesting peculiarity of this medium. On the other hand, who will learn that an app such as 4th Wall exists, and how? The app is accessible, but is the information that it exists accessible enough?

I find it interesting that it is built as its own app, because that can create a unique user experience that dialogues better with its own goals. If it were on Instagram, for instance, more people would have access to it, but the user experience would be limited to the logic of another platform, not necessarily designed with the same intentions. How public or free is Instagram (considering, for example, the amount of compulsory advertising in it and the data collected from its users) compared to the 4th Wall app?

Video found in the project’s website

The experience is artistic, and I found it interesting to experience her drawings in different spaces and to interact with them, entering them, kind of dancing the visualization of the drawing. Somehow you embody the body of the artist when visualizing it and can create different compositions with the space, which kind of makes you part of the process of creation. I guess this project is more focused on the art community as a target audience, because it has a more abstract approach, although it provides an experience that can be easily enjoyed by anyone slightly interested in art and tech.

Although the interface of the app is simple and does not look very elaborate, it has all the functions it needs and works pretty well – you can move the object with the touchscreen, take a picture, or record a video. I haven’t tried the geo-located artworks, but I feel like doing so, and the app kind of invites you to with a simple arrow pointing the way to the closest artwork and giving you easy access to a map with the exact location. This technical sophistication of geolocating the works and installing them anywhere in the world really makes me feel excited, although I have heard from people who have worked with it that it is not precise, so it works better at large scales, where you don’t need to be accurate with the location itself.

“Is AR breaking down walls?” The creator of the 4th Wall app answers yes, enthusiastically. I agree, but not completely. It has great potential and can be cheap to create, but even when designed for mobile, it guarantees access only to those who have a phone connected to the internet (which is not a global reality, considering the capitalist catastrophes of poverty production). It also does not really open access to canonized art spaces – it creates other kinds of spaces, which have another kind of potential. At the same time, it can still interfere in interesting ways inside those spaces.

Electronic Rituals, Oracles – Meditation #1

Tempo Temple

For this meditation, I am activating a perspective where I see dancing and drawing as a way of thinking and as a way of producing and embodying knowledge.

One of the results points to the possibility of drawing masks

Tempo temple is the first prototype for a video-performance where I try to draw a temple inside a temple which is inside another temple which is inside another temple and so on…

Tempo meaning: the rate or speed of motion or activity; pace.



the colonizer neoliberal capitalistic project is enchanting our times in order to build a giant world-wide magic circle where there is the illusion of space for just one temple,

Is it possible to play a game inside that circle that builds a temple inside a

a temple, a white temple, an invisible temple, the white snake temple, a dancing and drawing temple that

creates a temporary space where an encounter makes our perception of time enough modified, so that we can see it from outside?

is it possible to do it alone? am I alone when doing it?

when drawing a line with a random dance inside a machine learning temple, which forces are part of the code? What am I embodying? What am I touching in considering all its layers of creation and intention?

and which forces would never be into this temple? If the temple were empty, what would be inside of it?

Random dance resulting in a tree

The video was created by interacting with a webcam connected to p5.js using PoseNet. It was based on an example by Daniel Shiffman called “PoseNet example using p5.js to draw with your nose”. I basically swapped the nose for both wrists and drew lines instead of circles. Find the code here.
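The core of that swap can be sketched as follows. The keypoint names follow the ml5.js PoseNet output format (“leftWrist”, “rightWrist”); the helper function names are mine, and the drawing call itself is omitted:

```javascript
// Find a named keypoint in a PoseNet pose object.
function getKeypoint(pose, partName) {
  const kp = pose.keypoints.find((k) => k.part === partName);
  return kp ? kp.position : null;
}

// On each frame, remember the last wrist positions and return the
// line segments connecting them to the current ones. In the p5 sketch,
// each segment would be drawn with line(from.x, from.y, to.x, to.y).
function wristSegments(pose, previous) {
  const segments = [];
  for (const part of ["leftWrist", "rightWrist"]) {
    const current = getKeypoint(pose, part);
    const last = previous[part];
    if (current && last) {
      segments.push({ from: last, to: current });
    }
    previous[part] = current;
  }
  return segments;
}
```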

For the future of the project, I’d like to try to draw a line that disappears after some time; think about a better setting, especially for the background; and find a way of creating it in high resolution.

Soft Robotics – Cable Experiments, Morph, Materials research

Cable Experiments

Straw, thread and beads
curtain experiment
multiple straws spider experiment


Here is a list of interesting references that I found during my research this week with examples of projects in the field of soft robotics.

Artist Onyx Ashanti

Soft opportunities in a Toy Store

For my soft materials research, I went to FAO Schwarz at Rockefeller Center, because I thought that there I would find the biggest variety of toys. In general, I felt disappointed with the diversity of soft materials in the store. Most of them could be found in teddy bears made of the same materials. Unfortunately, I could not find any inflatable or pneumatic systems, or even floating objects, which I’m interested in.

List of materials that I found:

  • Polyester fiber was the most common material in the store, for sure, found in teddy bears, dolls’ clothes, etc. It feels really comfortable and cozy to touch, so it can be used in projects with human or pet interaction that need to generate those feelings.
  • Polyurethane foam, found in a sponge. It’s not that soft, but it can be used in projects that need a squishy compression effect. It can also float on water, which can be interesting.
  • Gummy, made of a mixture of syrups, water, gelatin, and other substances. It’s interesting to think about the possibility of working with edibles in the universe of robots, like moving food.
  • Magic snow? – I really don’t know what it was made of, and it was not written on the package. It felt really good to touch, but it would be hard to create anything from it because it doesn’t have enough consistency to build shapes and forms.
  • The deer horn is made of a polymer whose name I also don’t know. It was not really soft, but it had some movement when you pressed it.
On my way back home, I found softness being used in advertising, referring to it both as a texture and as a human behavior.

Nature of Code – Random Walks

random river-drawing program

As the first assignment for the Nature of Code class, I explored random walks. Ideally, I wanted to create a 3D random walk visualization similar to this one. However, I couldn’t find the problem in my code, even working together with my fellows from the Coding Lab. I really intend to focus on creating 3D graphics in this class, although I feel I still need to continue learning the basics of coding. Thus, I’m figuring out the best strategies to combine both intentions.

Having failed in the 3D attempt, I went back to my first experiments with 2D random walks and tried to develop them further to find ways of creating a drawing program with interesting aesthetic results. I first realized that random walks that start at the top of the canvas and finish at the bottom could create a visual effect that reminds me of the movement of water touching the sand at the beach. I also found a way of writing the code to draw lines instead of points, creating a more continuous effect. The results are pretty abstract – which interests me. My first version of this sketch can be found here.

Later, I started playing with colors using the same variables as I had used to draw the line of the random walk. This made me realize that I wanted to have more control over the color and the shape of the line while it was being drawn.

To do so, I decided to work with 4 different sliders, so I can change the hue and the brightness of the lines while they are being drawn, as well as how much I want the line to vary on the X and Y axes. The final code can be found here, and some results can be seen below. I tried to meditate on contemporary ecology issues as a conceptual background while playing with the drawing program. Two main events came to my mind: the recent oil spill on the northeast Brazilian coast and the fires in the Amazon Rainforest.
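The walk itself can be sketched like this. The function and parameter names are mine, not from the project; `stepX` and `stepY` stand in for the two sliders that control how much the line varies on each axis:

```javascript
// One downward "river" walk: start at the top of the canvas and step
// randomly until the walk crosses the bottom edge.
function randomWalkDown(startX, height, stepX, stepY) {
  const points = [{ x: startX, y: 0 }];
  let { x, y } = points[0];
  while (y < height) {
    // horizontal drift can go either way; vertical drift is biased downward
    x += (Math.random() * 2 - 1) * stepX;
    y += Math.random() * stepY;
    points.push({ x, y });
  }
  return points;
}
```

In the sketch, each consecutive pair of points is joined with `line()` rather than plotted with `point()`, which is what gives the continuous effect.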

Final Pcomp Project

For my final Physical Computing assignment, I’ve created gloves that generate different light, sound, and animation effects according to the gestures of the performer who embodies the Fire Bird to tell their story. For more details about the storytelling part of the project, click here.

Video by Tundi Szasz
Different LED effects coordinated with hand gestures

Find below my Arduino Code:

//The Fire Bird

//This code was created during Fall 2019 at ITP NYU by Fernando Gregório

//p5.js code:
//Main sources used to create the code:
//For the Serial Communication with the Arduino IMU:
//For the Sound Frequency Modulation 
//For the LED's: Maxwell da Silva (link github)
//Coding assistance: Maxwell da Silva

#include <Arduino_LSM6DS3.h>
#include <MadgwickAHRS.h>

// initialize a Madgwick filter:
Madgwick filter;
// sensor's sample rate is fixed at 104 Hz:
const float sensorRate = 104.00;

// values for orientation:
float roll = 0.0;
float pitch = 0.0;
float heading = 0.0;

// A basic everyday NeoPixel strip test program.

// NEOPIXEL BEST PRACTICES for most reliable operation:
// - Add 1000 uF CAPACITOR between NeoPixel strip's + and - connections.
// - MINIMIZE WIRING LENGTH between microcontroller board and first pixel.
// - NeoPixel strip's DATA-IN should pass through a 300-500 OHM RESISTOR.
// - AVOID connecting NeoPixels on a LIVE CIRCUIT. If you must, ALWAYS
//   connect GROUND (-) first, then +, then data.
// - When using a 3.3V microcontroller with a 5V-powered NeoPixel strip,
//   a LOGIC-LEVEL CONVERTER on the data line is STRONGLY RECOMMENDED.
// (Skipping these may work OK on your workbench but can fail in the field)
#define TIMECTL_MAXTICKS  4294967295L
#define TIMECTL_INIT      0
long time;
unsigned long flashTimeMark = 0;
unsigned long flashTimeMark2 = 0;
long interval = 2000;
int periode = 2000;
long previousMillis = 0;
int alpha;
#include <Adafruit_NeoPixel.h>
#ifdef __AVR__
#include <avr/power.h> // Required for 16 MHz Adafruit Trinket
#endif

// Which pin on the Arduino is connected to the NeoPixels?
// On a Trinket or Gemma we suggest changing this to 1:
#define LED_PIN    6

// How many NeoPixels are attached to the Arduino?
#define LED_COUNT 4

// Declare our NeoPixel strip object:
Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);
// Argument 1 = Number of pixels in NeoPixel strip
// Argument 2 = Arduino pin number (most are valid)
// Argument 3 = Pixel type flags, add together as needed:
//   NEO_KHZ800  800 KHz bitstream (most NeoPixel products w/WS2812 LEDs)
//   NEO_KHZ400  400 KHz (classic 'v1' (not v2) FLORA pixels, WS2811 drivers)
//   NEO_GRB     Pixels are wired for GRB bitstream (most NeoPixel products)
//   NEO_RGB     Pixels are wired for RGB bitstream (v1 FLORA pixels, not v2)
//   NEO_RGBW    Pixels are wired for RGBW bitstream (NeoPixel RGBW products)

// setup() function -- runs once at startup --------------------------------
const int indexSwitch = 2; //define the normally opened switch activated by the index finger in the PIN 2
const int thumbSwitch = 3; //define the normally opened switch activated by the thumb finger in the PIN 3

void setup() {

 pinMode(indexSwitch, INPUT); //define the PIN as an INPUT
  pinMode(thumbSwitch, INPUT); //define the PIN as an INPUT

  // These lines are specifically to support the Adafruit Trinket 5V 16 MHz.
  // Any other board, you can remove this part (but no harm leaving it):
#if defined(__AVR_ATtiny85__) && (F_CPU == 16000000)
  clock_prescale_set(clock_div_1);
#endif
  // END of Trinket-specific code.

  strip.begin();           // INITIALIZE NeoPixel strip object (REQUIRED);            // Turn OFF all pixels ASAP
  strip.setBrightness(20); // Set BRIGHTNESS to about 1/5 (max = 255)

  Serial.begin(9600);

  // attempt to start the IMU:
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU");
    // stop here if you can't access the IMU:
    while (true);
  }
  // start the filter to run at the sample rate:
  filter.begin(sensorRate);
}

void loop() {
  // values for acceleration & rotation:
  float xAcc, yAcc, zAcc;
  float xGyro, yGyro, zGyro;

  // check if the IMU is ready to read:
  if (IMU.accelerationAvailable() &&
      IMU.gyroscopeAvailable()) {
    // read accelerometer & gyrometer:
    IMU.readAcceleration(xAcc, yAcc, zAcc);
    IMU.readGyroscope(xGyro, yGyro, zGyro);

    // update the filter, which computes orientation:
    filter.updateIMU(xGyro, yGyro, zGyro, xAcc, yAcc, zAcc);

    // print the heading, pitch and roll
    roll = filter.getRoll();
    pitch = filter.getPitch();
    heading = filter.getYaw();

  }

  // if you get a byte in the serial port, send the switch state
  // and the latest heading, pitch, and roll:
  if (Serial.available()) {
    char input =; // consume the incoming byte
    if (digitalRead(indexSwitch) == HIGH && digitalRead(thumbSwitch) == HIGH) { //if both switches are pressed
      Serial.print(4); //write the number 4 in the serial monitor
    } else if (digitalRead(indexSwitch) == HIGH) { //if the index finger button is pressed
      Serial.print(2); //write the number two in the serial monitor
    } else if (digitalRead(thumbSwitch) == HIGH) {
      Serial.print(3); //write the number 3 in the serial monitor
    } else {
      Serial.print(0); //write zero in the serial monitor
    }
    // send the latest orientation values after the switch state:
    Serial.print(",");
    Serial.print(heading);
    Serial.print(",");
    Serial.print(pitch);
    Serial.print(",");
    Serial.println(roll);
  }
}

// LED effect functions: each one tints the whole strip with its color

void blink() {
  uint32_t white = strip.Color(255, 255, 255);
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    strip.setPixelColor(i, white);
  }
}

void tree() {
  uint32_t green = strip.Color(0, 200, 0);
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    strip.setPixelColor(i, green);
  }
}

void bird() {
  uint32_t blue = strip.Color(0, 0, 200);
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    strip.setPixelColor(i, blue);
  }
}

void fire() {
  uint32_t red = strip.Color(200, 0, 0);
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    strip.setPixelColor(i, red);
  }
}

void setPixelColorByRange(int start, int _end, int r, int g, int b) {
  // light the pixels from start to _end one at a time, 3 ms apart:
  while (start < _end) {
    if (waitTime(&flashTimeMark, 3)) {
      strip.setPixelColor(start, strip.Color(r, g, b));
      start++;
    }
  }
}

void fadePixels(int time, int r, int g, int b) {
  // oscillate the brightness with a cosine wave of period "periode":
  alpha = 128 + 127 * cos(2 * PI / periode * time);
  tintPixels(r, g, b, alpha);
}

void tintPixels(int r, int g, int b, int a) {
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    // scale the color by the alpha value (0-255):
    uint32_t c = strip.Color(r * a / 255, g * a / 255, b * a / 255);
    strip.setPixelColor(i, c);
  }
}

// easing functions (t: current time, b: beginning value, c: change in value, d: duration)

float linearTween (float t, float b, float c, float d) {
  return c * t / d + b;
}

float easeInOutSine (float t, float b, float c, float d) {
  return -c / 2 * (cos(M_PI * t / d) - 1) + b;
}

float easeInBounce (float t, float b, float c, float d) {
  return c - easeOutBounce (d - t, 0, c, d) + b;
}

float easeOutBounce (float t, float b, float c, float d) {
  if ((t /= d) < (1 / 2.75)) {
    return c * (7.5625 * t * t) + b;
  } else if (t < (2 / 2.75)) {
    return c * (7.5625 * (t -= (1.5 / 2.75)) * t + .75) + b;
  } else if (t < (2.5 / 2.75)) {
    return c * (7.5625 * (t -= (2.25 / 2.75)) * t + .9375) + b;
  } else {
    return c * (7.5625 * (t -= (2.625 / 2.75)) * t + .984375) + b;
  }
}

float easeInOutBounce (float t, float b, float c, float d) {
  if (t < d / 2) {
    return easeInBounce (t * 2, 0, c, d) * .5 + b;
  }
  return easeOutBounce (t * 2 - d, 0, c, d) * .5 + c * .5 + b;
}

int waitTime(unsigned long *timeMark, unsigned long timeInterval) {
  unsigned long timeCurrent;
  unsigned long timeElapsed;
  int result = false;
  timeCurrent = millis();
  if (timeCurrent < *timeMark) {
    // millis() rolled over; account for the wrap-around:
    timeElapsed = (TIMECTL_MAXTICKS - *timeMark) + timeCurrent;
  } else {
    timeElapsed = timeCurrent - *timeMark;
  }
  if (timeElapsed >= timeInterval) {
    *timeMark = timeCurrent;
    result = true;
  }
  return (result);
}

For more details about the code and the serial communication, click here.
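On the p5.js side, whatever the Arduino prints has to be parsed back into values. A minimal sketch of that step, assuming a simple comma-separated format (a gesture code followed by heading, pitch, and roll — this format is illustrative, not necessarily the project’s exact protocol):

```javascript
// Parse one serial line like "4,120.5,30.2,10.1" into named values.
// The line format here is an assumption for illustration.
function parseGloveData(line) {
  const parts = line.trim().split(",");
  if (parts.length !== 4) return null; // incomplete line, ignore it
  return {
    gesture: parseInt(parts[0], 10), // 0, 2, 3 or 4, from the switches
    heading: parseFloat(parts[1]),   // 0-360 degrees
    pitch: parseFloat(parts[2]),
    roll: parseFloat(parts[3]),
  };
}
```

In the sketch, this would be called from the serial library’s data callback, and the returned values would drive the sound and the 3D object.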

User Interaction

User interaction indicating the possibility of creating a game with the device

After showing the project to different people who could try the glove, I understood that it has interesting potential to become a cross-media narrative involving performance and game.

The connections of the LEDs all broke during the Winter Show, which indicates the need to rethink how to include interactive light in the gloves.

Animation – Augmented Reality

The Flying XRiver

For this assignment, we were tasked with creating an augmented reality experiment, focused on the concept rather than a final result. I worked together with Maxwell da Silva, and we created the Flying XRiver – a 3D sculpture reflecting on concepts such as the Anthropocene and the Flying Rivers of the Amazon Rainforest.

Base Concepts


The Anthropocene: the current geological age, viewed as the period during which human activity has been the dominant influence on climate and the environment.

Geological ages of the Earth
The Flying Rivers
Amazon’s Rainforest Flying Rivers

The flying rivers are movements of large quantities of water vapor transported in the atmosphere from the Amazon Basin to other parts of South America. The forest’s trees release water vapor into the atmosphere through transpiration, and this moisture is deposited in other localities in the form of precipitation, forming a virtual river. This movement is essential for climate regulation and humidity balance in several parts of the world. Unfortunately, deforestation is increasing the number of fires in the Amazon Rainforest, which is making the flying rivers transport toxic smoke, especially to the southeast of Brazil.

Process of creation

First, we found and downloaded a 3D hand model. Then we imported it into Photoshop as a 3D object and used the brush tool to draw the river on the palm of the hand, exporting it as a texture/material, as seen below.

Texture/Material for the hands. The blue lines were created on Photoshop.

The next step was putting the hands together in Unity in the shape of a river and then adding the river and the smoke layers.

Connecting the hands in Unity


AR Sculpture

As a result, we have an AR sculpture that can be installed on flat surfaces such as floors, tables, or ceilings, letting you immerse yourself in the Flying XRiver. The hands create a metaphor for our responsibility when touching natural resources. At the same time that we have the river in our hands, our hands are floating on the river, so the way we interfere in its dynamics affects the way we move, behave, and relate to other organisms.

The hands can also be seen as a prototype for future interactions with the piece, in which we can explore social AR experiments where people will be able to manipulate the position of the hands, generating new collective structures and creating new landscapes and objects in the theme of forests.

ICM Final

The first sound of the Fire Bird’s Eye

For my final assignment for Introduction to Computational Media, I’ve added a sound layer to the gloves designed for my Introduction to Physical Computing classes. The gloves are designed for a performing arts environment, where I’m going to embody a character called the Fire Bird to tell a story. They have 2 switches that are activated by touching together the index fingers or the thumbs of both hands, producing different LED effects that change according to different combinations of the switches.

Design of the Gloves made with reflective fabric

For the ICM part of it, I’m connecting my Arduino using p5.serialcontrol to create the serial communication with the p5.js web editor. The built-in IMU (Inertial Measurement Unit, composed of an accelerometer and a gyroscope) of the Arduino Nano 33 IoT collects three different movement values from the gloves: roll, pitch, and heading (see the image below to visualize the difference between them), which are the spinning movements that a 3D body can make in space, and sends them to my p5 sketch in a range of 0 to 360.

Data collected by the accelerometer/gyroscope of the Arduino. Yaw can also be called Heading.

This range is mapped to change the frequency and the amplitude of the carrier wave in a frequency modulation system in its simplest form, generating different sounds in real time. It also moves a 3D object that I designed in p5.js. Initially, it was supposed to look like an eye, but now I feel that it became a spaceship.

For the sound, we basically have two oscillators. One is the carrier and the other is the modulator. The carrier has a base frequency that is modified by the modulator, which operates in a range of amplitude and frequency chosen by me.
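The two pieces of math behind this can be sketched as follows — a range-mapping helper like p5’s `map()`, and a single FM output sample in the classic sine-in-sine form. In p5.sound this is wired through oscillator objects rather than computed per sample, and the parameter names here are mine:

```javascript
// Map an IMU angle (0-360) onto a parameter range, like p5's map():
function map(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// One sample of simple frequency modulation at time t (seconds):
// the modulator oscillator pushes the carrier's phase around, and
// modIndex controls how strongly it does so.
function fmSample(t, carrierFreq, modFreq, modIndex) {
  const modulator = modIndex * Math.sin(2 * Math.PI * modFreq * t);
  return Math.sin(2 * Math.PI * carrierFreq * t + modulator);
}
```

For example, the glove’s heading (0-360) could be mapped onto a modulator frequency range with `map(heading, 0, 360, 20, 500)`.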

Find my code here. To create it, I basically adapted and combined two codes. The first one, used to create the serial communication with the Arduino IMU, was created by Tom Igoe and can be found here. The second is a sound frequency modulation example found in the p5.js examples library. I really feel like studying frequency modulation more deeply and also understanding better how matrices work in coding.

Still of my sketch/animation

The Fire Bird

Body experimentation wearing the interactive gloves. Video by Tundi Szász.

The Fire Bird is a first experiment in building characters exploring interactive wearable technologies. This is a work in progress. I’m imagining a performing arts environment where the performer will be able to control different elements of the space by making different gestures.

To compose the Fire Bird so far, I’ve created gloves that generate different light, sound, and animation effects according to the hand gestures and movements of the performer who embodies the Fire Bird to tell their story. Basically, while wearing the gloves, the performer can change the direction of a 3D object on the screen and the frequency and amplitude of its sound. When they make gestures related to the narrative of the Fire Bird (fire, bird, or tree), different buttons made with conductive fabric sewn to the gloves are activated, changing the color of the LEDs on the gloves and the color and size of the seed (the 3D object) on the screen. The seed is inside the belly of the Fire Bird, so when someone wears the gloves, they are embodying the character and flying or dreaming while carrying the seed.

The LED effects of the glove change according to gestures

Process of Creation

The work was made during Fall 2019 at ITP, mainly in three classes. For more details, click on each part of the following description. In Intro to Fabrication, I fabricated the gloves and the acrylic mask of the Fire Bird. For my Introduction to Computational Media final, I created the p5.js code with the 3D object and the frequency modulation sound system coordinated with the movements of the hands. This was made possible through serial communication with the Inertial Measurement Unit built into the gloves’ Arduino, whose code and circuit were developed during the Introduction to Physical Computing classes.

If you would like to know more about the project, please contact me:

The story of the Fire Bird

The story of the Fire Bird is based on three cells of movement. The idea is to develop it further choreographically, through text, and possibly in the format of a game.

Movement 1

The Fire Bird flies away from home because their house was put on fire by someone who couldn’t understand the way the Fire Bird was living.

Movement 2

The Fire Bird flies to the city, alone, and finds a place to sleep.

Movement 3

The Fire Bird dreams for so long, that the invisible seed that was inside of their belly grows and turns into a tree.

Questions to keep moving

What’s the seed that is inside of the belly of the Fire Bird made of?

What does the tree that grows from the belly of the Fire Bird look like?