Nature of Code – Random Walks

random river-drawing program

As the first assignment for the Nature of Code class, I explored Random Walks. Ideally, I wanted to create a 3D random walk visualization similar to this one. However, I couldn’t find the problem with my code, even after working through it with my fellow students from the Coding Lab. I really want to focus on creating 3D graphics in this class, but I feel I still need to keep learning the basics of coding, so I’m figuring out the best strategy to combine both intentions.

Having failed at the 3D attempt, I went back to my first experiments with 2D random walks and tried to develop them further into a drawing program with interesting aesthetic results. I first realized that random walks that start at the top of the canvas and finish at the bottom create a visual effect that reminds me of water touching the sand at the beach. I also rewrote the code to draw lines instead of points, creating a more continuous effect. The results are pretty abstract – which interests me. My first version of this sketch can be found here.
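As a rough illustration of the line-based walk (a hypothetical sketch, not my original code – the function name, step ranges, and point format are my own assumptions), the walker always steps downward while drifting sideways, until it crosses the bottom of the canvas:

```javascript
// Generate one top-to-bottom random walk as a list of points.
// The walker always moves down by up to `stepY` and drifts
// left/right by up to `stepX`, so every walk starts at y = 0
// and finishes past y = height -- the "water reaching the
// sand" motion. (Illustrative names and ranges.)
function randomWalkDown(width, height, stepX, stepY) {
  const points = [{ x: width / 2, y: 0 }];
  while (points[points.length - 1].y < height) {
    const prev = points[points.length - 1];
    points.push({
      x: prev.x + (Math.random() * 2 - 1) * stepX, // drift on X
      y: prev.y + Math.random() * stepY,           // always move down
    });
  }
  return points;
}
```

In p5.js, drawing a line() between each pair of consecutive points – instead of plotting single pixels – is what produces the continuous stroke.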

Later, I started playing with colors using the same variables as I had used to draw the line of the random walk. This made me realize that I wanted to have more control over the color and the shape of the line while it was being drawn.

To do so, I decided to work with 4 different sliders, so I can change the hue and the brightness of the lines while they are being drawn, as well as how much I want the line to vary along the X and Y axes. This final code can be found here, and some results can be seen below. While playing with the drawing program, I meditated on contemporary ecological issues as a conceptual background. Two main events came to mind: the recent oil spill on the northeastern coast of Brazil and the fires in the Amazon Rainforest.
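The slider control boils down to remapping each slider’s value into the range of the parameter it drives – essentially what p5’s map() does. A minimal plain-JavaScript sketch (the ranges below are illustrative, not the ones in my code):

```javascript
// Linearly remap a value from one range to another (like p5's map()).
function remap(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// e.g. a 0-255 slider driving hue (0-360) and the walk's X step (0-10):
const hue = remap(128, 0, 255, 0, 360);
const stepX = remap(128, 0, 255, 0, 10);
```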

Final Pcomp Project

For my final Physical Computing assignment, I created gloves that generate different light, sound and animation effects according to the gestures of the performer who embodies the Fire Bird to tell their story. For more details about the storytelling part of the project, click here.

Video by Tundi Szasz
Different LED effects coordinated with hand gestures

Find below my Arduino Code:

//The Fire Bird

//This code was created during Fall 2019 at ITP NYU by Fernando Gregório

//p5.js code:
//Main sources used to create the code:
//For the Serial Communication with the Arduino IMU:
//For the Sound Frequency Modulation 
//For the LEDs: Maxwell da Silva (link github)
//Coding assistance: Maxwell da Silva

#include <Arduino_LSM6DS3.h>
#include <MadgwickAHRS.h>

// initialize a Madgwick filter:
Madgwick filter;
// sensor's sample rate is fixed at 104 Hz:
const float sensorRate = 104.00;

// values for orientation:
float roll = 0.0;
float pitch = 0.0;
float heading = 0.0;

// A basic everyday NeoPixel strip test program.

// NEOPIXEL BEST PRACTICES for most reliable operation:
// - Add 1000 uF CAPACITOR between NeoPixel strip's + and - connections.
// - MINIMIZE WIRING LENGTH between microcontroller board and first pixel.
// - NeoPixel strip's DATA-IN should pass through a 300-500 OHM RESISTOR.
// - AVOID connecting NeoPixels on a LIVE CIRCUIT. If you must, ALWAYS
//   connect GROUND (-) first, then +, then data.
// - When using a 3.3V microcontroller with a 5V-powered NeoPixel strip,
//   a LOGIC-LEVEL CONVERTER on the data line is STRONGLY RECOMMENDED.
// (Skipping these may work OK on your workbench but can fail in the field)
#define TIMECTL_MAXTICKS  4294967295L
#define TIMECTL_INIT      0
long time;
unsigned long flashTimeMark = 0;
unsigned long flashTimeMark2 = 0;
long interval = 2000;
int periode = 2000;
long previousMillis = 0;
int alpha;
#include <Adafruit_NeoPixel.h>
#ifdef __AVR__
#include <avr/power.h> // Required for 16 MHz Adafruit Trinket
#endif // __AVR__

// Which pin on the Arduino is connected to the NeoPixels?
// On a Trinket or Gemma we suggest changing this to 1:
#define LED_PIN    6

// How many NeoPixels are attached to the Arduino?
#define LED_COUNT 4

// Declare our NeoPixel strip object:
Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);
// Argument 1 = Number of pixels in NeoPixel strip
// Argument 2 = Arduino pin number (most are valid)
// Argument 3 = Pixel type flags, add together as needed:
//   NEO_KHZ800  800 KHz bitstream (most NeoPixel products w/WS2812 LEDs)
//   NEO_KHZ400  400 KHz (classic 'v1' (not v2) FLORA pixels, WS2811 drivers)
//   NEO_GRB     Pixels are wired for GRB bitstream (most NeoPixel products)
//   NEO_RGB     Pixels are wired for RGB bitstream (v1 FLORA pixels, not v2)
//   NEO_RGBW    Pixels are wired for RGBW bitstream (NeoPixel RGBW products)

// setup() function -- runs once at startup --------------------------------
const int indexSwitch = 2; // the normally open switch activated by the index finger, on pin 2
const int thumbSwitch = 3; // the normally open switch activated by the thumb, on pin 3

void setup() {
  Serial.begin(9600); // start serial communication with the computer

  pinMode(indexSwitch, INPUT); // define the pin as an INPUT
  pinMode(thumbSwitch, INPUT); // define the pin as an INPUT

  // These lines are specifically to support the Adafruit Trinket 5V 16 MHz.
  // Any other board, you can remove this part (but no harm leaving it):
#if defined(__AVR_ATtiny85__) && (F_CPU == 16000000)
  clock_prescale_set(clock_div_1);
#endif
  // END of Trinket-specific code.

  strip.begin();           // INITIALIZE NeoPixel strip object (REQUIRED);            // Turn OFF all pixels ASAP
  strip.setBrightness(20); // Set BRIGHTNESS to about 1/5 (max = 255)

  // attempt to start the IMU:
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU");
    // stop here if you can't access the IMU:
    while (true);
  }

  // start the filter to run at the sample rate:
  filter.begin(sensorRate);
}

void loop() {
  // values for acceleration & rotation:
  float xAcc, yAcc, zAcc;
  float xGyro, yGyro, zGyro;

  // check if the IMU is ready to read:
  if (IMU.accelerationAvailable() &&
      IMU.gyroscopeAvailable()) {
    // read accelerometer & gyroscope:
    IMU.readAcceleration(xAcc, yAcc, zAcc);
    IMU.readGyroscope(xGyro, yGyro, zGyro);

    // update the filter, which computes orientation:
    filter.updateIMU(xGyro, yGyro, zGyro, xAcc, yAcc, zAcc);

    // store the heading, pitch and roll:
    roll = filter.getRoll();
    pitch = filter.getPitch();
    heading = filter.getYaw();
  }

  // if you get a byte in the serial port, send the current
  // gesture code and the latest heading, pitch, and roll:
  if (Serial.available()) {
    char input =; // read (and discard) the incoming byte

    if (digitalRead(indexSwitch) == HIGH && digitalRead(thumbSwitch) == HIGH) { // if both switches are pressed
      Serial.print(4); // write the number 4 in the serial monitor
    } else if (digitalRead(indexSwitch) == HIGH) { // if the index finger button is pressed
      Serial.print(2); // write the number 2 in the serial monitor
    } else if (digitalRead(thumbSwitch) == HIGH) {
      Serial.print(3); // write the number 3 in the serial monitor
    } else {
      Serial.print(0); // write zero in the serial monitor
    }

    // send heading, pitch, and roll as the rest of a comma-separated line:
    Serial.print(",");
    Serial.print(heading);
    Serial.print(",");
    Serial.print(pitch);
    Serial.print(",");
    Serial.println(roll);
  }
}

void blink() {
  uint32_t white = strip.Color(255, 255, 255);
  strip.fill(white); // set every pixel to white
  ;
}

void tree() {
  uint32_t green = strip.Color(0, 200, 0);
  strip.fill(green); // set every pixel to green
  ;
}

void bird() {
  uint32_t blue = strip.Color(0, 0, 200);
  strip.fill(blue); // set every pixel to blue
  ;
}

void fire() {
  uint32_t red = strip.Color(200, 0, 0);
  strip.fill(red); // set every pixel to red
  ;
}

void setPixelColorByRange(int start, int _end, int r, int g, int b) {
  while (start < _end) {
    if (waitTime(&flashTimeMark, 3)) {
      strip.setPixelColor(start, strip.Color(r, g, b));
      ;
      start++; // advance one pixel per elapsed interval
    }
  }
}

void fadePixels(int time, int r, int g, int b) {
  // oscillate alpha between 1 and 255 over one periode:
  alpha = 128 + 127 * cos(2 * PI / periode * time);
  tintPixels(r, g, b, alpha);
}

void tintPixels(int r, int g, int b, int a) {
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    // scale the color by the alpha value before applying it:
    uint32_t c = strip.Color(r * a / 255, g * a / 255, b * a / 255);
    strip.setPixelColor(i, c);
  }
  ;
}

float linearTween (float t, float b, float c, float d) {
  return c * t / d + b;
}

float easeInOutSine (float t, float b, float c, float d) {
  return -c / 2 * (cos(M_PI * t / d) - 1) + b;
}

float easeInBounce (float t, float b, float c, float d) {
  return c - easeOutBounce (d - t, 0, c, d) + b;
}

float easeOutBounce (float t, float b, float c, float d) {
  if ((t /= d) < (1 / 2.75)) {
    return c * (7.5625 * t * t) + b;
  } else if (t < (2 / 2.75)) {
    return c * (7.5625 * (t -= (1.5 / 2.75)) * t + .75) + b;
  } else if (t < (2.5 / 2.75)) {
    return c * (7.5625 * (t -= (2.25 / 2.75)) * t + .9375) + b;
  } else {
    return c * (7.5625 * (t -= (2.625 / 2.75)) * t + .984375) + b;
  }
}

float easeInOutBounce (float t, float b, float c, float d) {
  if (t < d / 2) {
    return easeInBounce (t * 2, 0, c, d) * .5 + b;
  }
  return easeOutBounce (t * 2 - d, 0, c, d) * .5 + c * .5 + b;
}

int waitTime(unsigned long *timeMark, unsigned long timeInterval) {
  unsigned long timeCurrent;
  unsigned long timeElapsed;
  int result = false;
  timeCurrent = millis();
  if (timeCurrent < *timeMark) {
    // millis() rolled over; account for the wrap-around:
    timeElapsed = (TIMECTL_MAXTICKS - *timeMark) + timeCurrent;
  } else {
    timeElapsed = timeCurrent - *timeMark;
  }
  if (timeElapsed >= timeInterval) {
    *timeMark = timeCurrent;
    result = true;
  }
  return (result);
}

For more details about the code and the serial communication, click here.

User Interaction

User interaction indicating the possibility of creating a game with the device

After showing the project to different people who tried the glove, I realized that it has interesting potential to become a cross-media narrative involving performance and games.

The connections of the LEDs all broke during the Winter Show, which indicates the need to rethink how to integrate interactive lighting into the gloves.

animation – Augmented Reality

The Flying XRiver

For this assignment, we were tasked to create an augmented reality experiment, focused on the concept rather than on a final result. I worked together with Maxwell da Silva, and we created the Flying XRiver – a 3D sculpture reflecting on concepts such as the Anthropocene and the Flying Rivers of the Amazon Rainforest.

Base Concepts


The Anthropocene

The current geological age, viewed as the period during which human activity has been the dominant influence on climate and the environment.

Geological ages of the Earth
The Flying Rivers
Amazon’s Rainforest Flying Rivers

The flying rivers are a movement of large quantities of water vapor transported in the atmosphere from the Amazon Basin to other parts of South America. The forest trees release water vapor into the atmosphere through transpiration, and this moisture is deposited in other localities in the form of precipitation, forming a virtual river. This movement is essential for climate regulation and humidity balance in several parts of the world. Unfortunately, deforestation is increasing the number of fires in the Amazon Rainforest, which means the flying rivers now sometimes transport toxic smoke, especially to the southeast of Brazil.

Process of creation

First, we found and downloaded a 3D hand model. We then imported it into Photoshop as a 3D object, used the brush tool to draw the river on the palm of the hand, and exported it as a texture/material, as seen below.

Texture/Material for the hands. The blue lines were created on Photoshop.

The next step was putting together the hands in Unity in the shape of a river and then adding the river and the smoke layers.

Connecting the hands in Unity


AR Sculpture

As a result, we have an AR sculpture that can be installed on flat surfaces such as floors, tables, or even ceilings, letting you immerse yourself in the Flying XRiver. The hands create a metaphor for our responsibility when touching natural resources. At the same time that we have the river in our hands, our hands are floating on the river, so the way we interfere in its dynamics affects the way we move, behave and relate to other organisms.

The hands can also be seen as a prototype for future interactions with the piece, in which we can explore social AR experiments, where people will be able to manipulate the position of the hands generating new collective structures, creating new landscapes and objects in the theme of forests.

ICM Final

The first sound of the Fire Bird’s Eye

For my final assignment for Introduction to Computational Media, I’ve added a sound layer to the gloves designed for my Introduction to Physical Computing class. The gloves are intended for a performing arts environment, where I’m going to embody a character called the Fire Bird to tell a story. They have 2 switches that are activated by touching the index fingers or the thumbs of both hands together, producing different LED effects that change according to different combinations of the switches.

Design of the Gloves made with reflective fabric

For the ICM part of it, I’m connecting my Arduino through p5.serialcontrol to create serial communication with the p5.js web editor. The built-in IMU (Inertial Measurement Unit, composed of an accelerometer and a gyroscope) of the Arduino Nano 33 IoT collects three movement values from the gloves: roll, pitch and heading (see the image below to visualize the difference between them), which are the rotations a 3D body can make in space, and sends them to my p5 sketch in a range of 0 to 360.
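On the p5.js side, each incoming serial line then has to be split back into its values. A minimal sketch of that parsing step, assuming a comma-separated line such as gesture,heading,pitch,roll (this exact format and the function name are my assumptions, not necessarily what the glove sends):

```javascript
// Parse one serial line of the (assumed) form "gesture,heading,pitch,roll",
// e.g. "2,180.5,45.0,10.2". Returns null for malformed lines.
function parseGloveLine(line) {
  const parts = line.trim().split(",").map(Number);
  if (parts.length !== 4 || parts.some(Number.isNaN)) return null;
  const [gesture, heading, pitch, roll] = parts;
  return { gesture, heading, pitch, roll };
}
```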

Data collected by the accelerometer/gyroscope of the Arduino. Yaw can also be called Heading.

This range is mapped to change the frequency and the amplitude of the carrier wave in a Frequency Modulation system in its simplest form, generating different sounds in real time. It also moves a 3D object that I designed in p5.js. Initially, it was supposed to look like an eye, but now I feel that it became a spaceship.

For the sound, we basically have two oscillators. One is the carrier, and the other is the modulator. The carrier has a Base Frequency that is modified by the modulator, which operates in a range of amplitude and frequency chosen by me.
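That description can be written in a few lines of code: the modulator’s output is added to the carrier’s base frequency, and the carrier is rendered sample by sample. A minimal plain-JavaScript sketch (my own illustration, not the p5.sound implementation; the parameter names are assumptions):

```javascript
// Simplest two-oscillator FM: a modulator sine wave sweeps the
// carrier's base frequency by +/- modDepth Hz, and the carrier
// is rendered one sample at a time via a phase accumulator.
function fmSynth(carrierFreq, modFreq, modDepth, sampleRate, numSamples) {
  const out = new Float32Array(numSamples);
  let phase = 0; // carrier phase, in radians
  for (let i = 0; i < numSamples; i++) {
    const t = i / sampleRate;
    // instantaneous frequency = base frequency + modulator output:
    const instFreq = carrierFreq + modDepth * Math.sin(2 * Math.PI * modFreq * t);
    phase += (2 * Math.PI * instFreq) / sampleRate;
    out[i] = Math.sin(phase);
  }
  return out;
}
```

With modDepth set to 0 the modulator has no effect and the carrier is a pure sine; raising modDepth or modFreq is what my sliders-and-IMU mapping controls in the p5 sketch.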

Find my code here. To create it, I basically adapted and combined two pieces of code. The first, used to create serial communication with the Arduino IMU, was created by Tom Igoe and can be found here. The second is a sound frequency modulation example found in the p5.js examples library. I’d really like to study Frequency Modulation more deeply and to better understand how matrices work in code.

Still of my sketch/animation

The Fire Bird

Body experimentation wearing the interactive gloves. Video by Tundi Szász.

The Fire Bird is a first experiment in building characters that explore interactive wearable technologies. This is a work in progress. I’m imagining a performing arts environment where the performer will be able to control different elements of the space by making different gestures.

To compose the Fire Bird so far, I’ve created gloves that generate different light, sound and animation effects according to the hand gestures and movements of the performer who embodies the Fire Bird to tell their story. Basically, while wearing the gloves, the performer can change the direction of a 3D object on the screen as well as the frequency and amplitude of its sound. When they make gestures related to the narrative of the Fire Bird (Fire, Bird or Tree), different buttons made with conductive fabric sewn to the gloves are activated, changing the color of the gloves’ LEDs and the color and size of the seed (the 3D object) on the screen. The seed is inside the belly of the Fire Bird, so whoever wears the gloves is embodying the character, flying or dreaming while carrying the seed.

The LED effects of the glove change according to gestures

Process of Creation

The work was made during Fall 2019 at ITP, mainly in three classes. For more details, click on each part of the following description. In Intro to Fabrication, I fabricated the gloves and the acrylic mask of the Fire Bird. For my Introduction to Computational Media final, I created the p5.js code with the 3D object and the frequency modulation sound system coordinated with the movements of the hands. This is made possible by serial communication with the Inertial Measurement Unit built into the Arduino Nano 33 IoT of the gloves, whose code and circuit were developed during the Introduction to Physical Computing classes.

If you would like to know more about the project, please contact me:

The story of the Fire Bird

The story of the Fire Bird is based on three cells of movement. The idea is to develop it further choreographically, in text, and possibly in the format of a game.

Movement 1

The Fire Bird flies away from home because their house was put on fire by someone who couldn’t understand the way the Fire Bird was living.

Movement 2

The Fire Bird flies to the city, alone, and finds a place to sleep.

Movement 3

The Fire Bird dreams for so long, that the invisible seed that was inside of their belly grows and turns into a tree.

Questions to keep moving

What is the seed inside the belly of the Fire Bird made of?

What does the tree that grows from the belly of the Fire Bird look like?

fabrication – week 6

The fire flower eye motor

For the motor-mounting assignment, I created a simple sculpture inspired by the character I’ve been developing this semester: the Fire Bird. The initial idea was to work with hand shapes that spin, creating an illusion of fire. But it turns out that the DC motor I used stalls very easily under extra weight, so I simplified the idea and used an acrylic eye that I had instead. It still creates a fire-flower effect when it spins, which turned out to be an interesting aesthetic experiment for me.

Initial Idea

The hardest part was positioning the hole in the enclosure so that it aligns perfectly with the motor, doesn’t stall from friction, and spins without shaking the box.

Movement test


  • Zip ties;
  • 3mm Shaft / Axle;
  • Black ABS Project box;
  • Euro terminal Strip pin;
  • 9V Battery;
  • Hobby DC motor;
  • Switch;
  • Wires.

Final Result

Final Result

fabrication – week 5

two materials assignment

This week we were tasked to work with two different materials. I’ve been venturing into the wearables world, so I took the opportunity to learn how to use the sewing machine and to understand how to connect wires with conductive thread or fabric. The goal was to create gloves with LEDs that change their color effects depending on the gesture you make while wearing them.


  • Conductive Fabric;
  • Conductive Thread;
  • Perf board;
  • Silicon wires;
  • Conductive tape;
  • Thread;
  • Reflective Fabric;
  • Stretch Grey Fabric;
  • Neopixels;

Pcomp Final Project

The fire bird light and sound glove

Glove’s palm diagram
Back of the hand diagram and details for the finger’s buttons

storytelling bases for coreography

Movement 1: The fire bird flies to the city.

Movement 2: The fire bird finds a place to die and dies.

Movement 3: The invisible seed that was inside of the belly of the fire bird germinates.

Movement 4: Zoom in the seed and enter it. What is the seed made of?

Movement 5: The seed turns into a tree. What does this tree look like?



  • Arduino Nano 33 IOT;
  • Portable Battery Pack;
  • Battery for Arduino (Lithium?!);
  • Micro USB – USB Cable;
  • Portable speakers with Bluetooth connection;
  • Gloves;
    • Conductive Fabric;
    • Conductive Thread;
    • Reflective Fabric;
  • Silicon wires;
  • Computer with p5.js;

Next Steps

  • Finish Code;
    • Create each LED effect;
      • Initial;
      • Fire;
      • Bird;
      • Tree;
    • Create sound effects in P5js to produce the sounds connected to accelerometer and gyroscope and in different versions for each LED effect;
  • Build the gloves;
    • Draw the gloves in the fabric;
    • Sew Neopixels to the glove;
    • Build an enclosure to wear the arduino nano on my hand;
  • Work on portable version;
    • Create bluetooth connection;
    • Find the best battery to power it;

animation – after effects

The final result was created by me together with Wen Chen. We started by imagining a future when we’ll be able to build ourselves and design body parts in the real world with advanced augmented reality technologies before going on a casual date. We also talked about how dating apps change our behavior in society and modify our subjectivity, especially as gay people, who are a big target audience for this industry.



fabrication – week 4

Wearable acrylic enclosure

This week we were tasked to create an enclosure. I made a simple experiment for a wearable using acrylic cut in the laser cutter and standoffs. The idea is to wear the Arduino Nano 33 IOT, which has a built-in accelerometer and gyroscope, to collect movement data from my hands, which are going to be used to create sound and visual effects with a computer.