The web Firebird

the spirit of the fractal tree

The web Firebird is a live performance experiment in coding, choreography, and augmented reality, designed to be shared through cyberspace. Find its code here.

This project was created during Spring 2020 at ITP NYU as part of the classes Nature of Code (Daniel Shiffman), Choreographic Interventions (Mimi Yin), and Experiments in Augmented Reality (Irene Alvarado).

screenshot of the performance streamed from my rooftop

Past experiments

The Firebird is a mythology that I started creating during my first semester at ITP. It began as a physical computing project, where I imagined an experimental performance art setup in which the performer could control audiovisual graphic elements of the space using only gestures and wearable technology.

During my second semester, as my Nature of Code midterm, I created the first version of the Firebird for the browser. The piece is a net-art work, an interactive visual poem, which can be found here as part of the online residency Jururu Club, one of the Brazilian pavilions of the fourth edition of The Wrong Biennale.

The story of the Firebird

The Firebird used to live in a wet forest with their community. One day, without warning, their house is set on fire by someone who couldn’t understand the way they were living. And so the Firebird flies away from home. The Firebird migrates to the city, in the north, alone, and finds a place to sleep where no human could find them. The thing is that they were carrying an invisible fire seed inside their belly. So in the place where they sleep, they dream for so long that the invisible seed inside their belly finds time to germinate and turns into a tree.

example of an audience view of the final part of the performance

The story of the Firebird is based mainly on the environmental disasters caused by fires in the Amazon Rainforest in 2019, with the aim of keeping their memory alive and constantly questioning the actions that led to the growth of the fires that year, such as the considerable rise in deforestation encouraged by macropolitical representatives. I’m interested here in how interspecies life is affected by human culture and its catastrophic ecosystem design during the Anthropocene.

The Firebird (1910) is also a ballet and orchestral concert work by the Russian composer Igor Stravinsky.

How does it work?

For this project, I worked with ml5.js to load PoseNet along with p5.js. PoseNet is a machine learning model that allows for real-time human pose estimation with images from a regular digital camera, such as a webcam.

I’m focusing this post on the coding part of the work. However, there are other important layers to this project, such as developing the choreography and the character design so that I gradually feel more comfortable with, and clearer about, the way to embody it. This part takes time, a lot of rehearsal, and experimentation with the code, which is essentially the component that animates this creature through my body, giving room for them to exist.

early experiments with PoseNet for this project

I’m choreographing the code using the millis() function in the draw loop, which allows me to add or remove functions/effects over time using conditional statements.

function draw() {
  let now = millis();
  // Scale everything based on ratio between
  // window width and video width
  scale(width / video.width, width / video.width);
  // Shift everything up 50 pixels
  translate(0, -50);

  // Draw the video WITHOUT resizing it
  image(video, 0, 0);
  if (pose) {
    if (now >= 355000) {
      fireParticle(0, 11);
    } else if (now >= 320000) { // night
      fireParticle(0, 17);
    } else if (now >= 290000) { // fractal tree, whole body on fire
      fireParticle(0, 17);
    } else if (now >= 260000) { // fractal tree on fire
      fireParticle(0, 11);
    } else if (now >= 230000) { // line tree
      fireParticle(0, 5);
    } else if (now >= 140000) { // seed appears
      fireParticle(0, 5);
    } else if (now >= 100000) { // fly to the city
      fireParticle(0, 5); // fire face
      fireParticle(9, 11);
    } else if (now >= 75000) { // nest in fire, get out of nest
      fireParticle(0, 5); // fire face
      fireParticle(9, 11);
    } else if (now >= 15000) { // drawing nest
    } else { // start
    }
  }
}

So far, I’ve been presenting this work in Zoom meetings, sharing my screen with the audience, who see my browser in fullscreen, in present mode, on the p5.js online editor. That’s why I’m scaling everything in my draw loop based on the ratio between the window width and the video width. I don’t feel it is the best solution, because it makes it harder to reason about the proportions of the other elements, but a better one will probably come once I discover the best setup to stream the performance at the highest possible quality.
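Since every element in the draw loop is scaled by the ratio between the window width and the video width (and shifted up 50 pixels), the coordinate math can be isolated in a pair of small helpers. This is just a sketch of the idea; the function names are my own, not part of the original sketch:

```javascript
// Convert between video coordinates (what PoseNet reports) and window
// coordinates, mirroring the scale()/translate() calls in draw().
// A video point (x, y) ends up at ((x) * s, (y - 50) * s) on screen.

function videoToWindow(x, y, windowWidth, videoWidth, yShift = -50) {
  const s = windowWidth / videoWidth;
  return { x: x * s, y: (y + yShift) * s };
}

function windowToVideo(x, y, windowWidth, videoWidth, yShift = -50) {
  const s = windowWidth / videoWidth;
  return { x: x / s, y: y / s - yShift };
}

// Example: a 640px-wide video stretched to a 1280px-wide window
const p = videoToWindow(320, 250, 1280, 640);
console.log(p); // { x: 640, y: 400 }
```

Having the inverse mapping on hand could also make it easier to reason about where other elements will land in fullscreen mode.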

function birdMask() {
  let dF = dist(pose.rightEye.x, pose.rightEye.y, pose.leftEye.x, pose.leftEye.y);
  translate(pose.nose.x, pose.nose.y - dF);
  scale(dF / 50);
  image(img, 0, 0, 180, 120);
}

The birdMask function draws a .png image with a transparent background that I designed in Illustrator and loaded into the code. It’s positioned according to the position of my nose and scaled in relation to the distance between my two eye pose points.

function drawNest() {
  image(pg, 0, 0, width, height);

  leftWristX = pose.leftWrist.x;
  leftWristY = pose.leftWrist.y;
  rightWristX = pose.rightWrist.x;
  rightWristY = pose.rightWrist.y;

  pg.line(leftWristX, leftWristY, pleftWristX, pleftWristY);
  pg.line(rightWristX, rightWristY, prightWristX, prightWristY);

  pleftWristX = leftWristX;
  pleftWristY = leftWristY;
  prightWristX = rightWristX;
  prightWristY = rightWristY;
}

In this part of the code, I’m drawing lines between the current and previous positions of my wrists. This is probably the ugliest part of my code, because the first time I draw a line, I haven’t yet defined values for pleftWristX, pleftWristY, prightWristX, and prightWristY, which is flagged as an error in my code. So far I haven’t found a better solution, but the aesthetic result is exactly what I want, and I have received specific positive feedback about it. Maybe I should store the last two poses in an array and then call them?

I discovered this effect earlier in the semester, when creating this Electronic Rituals assignment.
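One possible fix for the undefined previous positions, sketched outside p5: keep the previous point per wrist and only emit a line segment once a previous point actually exists. The class name here is my own invention, just to illustrate the pattern:

```javascript
// Track the previous point of a moving limb and produce line segments
// only from the second point onward, so the first frame draws nothing
// instead of using undefined coordinates.

class TrailPen {
  constructor() {
    this.prev = null; // no previous point yet
  }
  // Returns a [x1, y1, x2, y2] segment, or null on the first point
  next(x, y) {
    const seg = this.prev ? [this.prev.x, this.prev.y, x, y] : null;
    this.prev = { x, y };
    return seg;
  }
}

const pen = new TrailPen();
console.log(pen.next(10, 10)); // null: nothing to draw yet
console.log(pen.next(12, 15)); // [ 10, 10, 12, 15 ]
```

Inside drawNest, each wrist would get its own TrailPen, and pg.line() would only be called when next() returns a segment.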

function fractalTree() {
  let nY = pose.nose.y;
  theta = map(nY, height / 4, height - 400, 0, PI / 2);

  // Start the tree from the middle point between my ankles
  let rootX = (pose.rightAnkle.x + pose.leftAnkle.x) / 2;
  let rootY = (pose.rightAnkle.y + pose.leftAnkle.y) / 2;

  // Store the latest y position
  yRoot.push(rootY);
  // Only keep the most recent 60 data points
  if (yRoot.length > 60) yRoot.shift();

  // Calculate the average y position
  let avgY = 0;
  // Add up all the y-values
  for (let y of yRoot) {
    avgY += y;
  }
  // Divide by the number of y-values
  avgY /= yRoot.length;

  // Store the latest x position
  xRoot.push(rootX);
  // Only keep the most recent 60 data points
  if (xRoot.length > 60) xRoot.shift();

  // Calculate the average x position
  let avgX = 0;
  // Add up all the x-values
  for (let x of xRoot) {
    avgX += x;
  }
  // Divide by the number of x-values
  avgX /= xRoot.length;

  // Position the tree between the ankles
  translate(avgX, avgY);
  if (pose.nose.y > height / 4) {
    branch(120);
  } else {
    line(0, 0, 0, -120 * 2.9);
  }
}

function branch(len) {
  // Each branch will be 2/3rds the size of the previous one
  line(0, 0, 0, -len);
  // Move to the end of that line
  translate(0, -len);

  len *= 0.66;
  // All recursive functions must have an exit condition!!!!
  // Here, ours is when the length of the branch is 2 pixels or less
  if (len > 2) {
    push(); // Save the current state of transformation (i.e. where are we now)
    rotate(theta); // Rotate by theta
    branch(len); // Ok, now call myself to draw two new branches!!
    pop(); // Whenever we get back here, we "pop" in order to restore the previous matrix state

    // Repeat the same thing, only branch off to the "left" this time!
    push();
    rotate(-theta);
    branch(len);
    pop();
  }
}

The fractal tree was created by Daniel Shiffman in the video tutorial below.

Fractal trees tutorial by Dan Shiffman

In my code, I’m positioning the bottom of the tree between my ankles and using a technique that I learned from Mimi Yin to smooth its movement. The thing here is that PoseNet generates a noisy, shaky effect because it produces a slightly different position estimate every frame. So to create a more stable visual effect when positioning the tree relative to my ankles, I’m calculating the average of the last 60 x and y PoseNet data points and using the result to position the tree – translate(avgX, avgY).
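The smoothing can also be pulled out into a standalone helper, which is the same rolling-average idea fractalTree() does inline with xRoot/yRoot. The helper name is mine, just for illustration:

```javascript
// Rolling average over the last n samples: each call pushes a new
// value, drops anything older than n samples, and returns the mean.
// With n = 60 this matches the 60-point window used for the tree root.

function makeSmoother(n) {
  const samples = [];
  return function (value) {
    samples.push(value);
    if (samples.length > n) samples.shift(); // keep only the last n
    let sum = 0;
    for (const v of samples) sum += v;
    return sum / samples.length;
  };
}

const smoothY = makeSmoother(60);
// A noisy PoseNet-like stream settles toward its true value:
console.log(smoothY(300)); // 300
console.log(smoothY(310)); // 305
console.log(smoothY(290)); // 300
```

One smoother per coordinate (avgX, avgY) keeps the tree root from jittering while still following slower movements.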

The angle of the branches is mapped to the position of my nose on the canvas. I also created a conditional statement allowing the tree to be drawn only if pose.nose.y is greater than a certain value; otherwise, the code draws a line with the same strokeWeight as the tree’s line. That allows me to dance with a simple line and generate a more surprising effect when opening the tree.

The challenge here is that I always have to adapt where to begin drawing the tree and how much I want it to fractalize according to the position of my nose. That varies with spatial factors of the performance, such as my distance from the camera and its angle. This part is a bit confusing, since I’m doing some calculations to scale the whole sketch in the draw loop to fit fullscreen mode perfectly when presenting, and that makes it tricky to imagine exactly the proportions of the canvas in relation to my body in space.

function fireParticle(w, z) {
  for (let i = 0; i < 2; i++) {
    for (let j = w; j < z; j++) {
      let x = pose.keypoints[j].position.x;
      let y = pose.keypoints[j].position.y;
      let p = new Particle(x, y);
      particles.push(p);
    }
  }
  for (let i = particles.length - 1; i >= 0; i--) {
    particles[i].update();
    particles[i].show();
    if (particles[i].finished()) {
      // remove this particle
      particles.splice(i, 1);
    }
  }
}

class Particle {
  constructor(x, y) {
    this.x = x;
    this.y = y;
    this.vx = random(-1, 1);
    this.vy = random(-5, -1);
    this.alpha = 255;
    this.distEye = dist(pose.rightEye.x, pose.rightEye.y, pose.leftEye.x, pose.leftEye.y);
    this.triangleSide = this.distEye / 3; //20
  }

  finished() {
    return this.alpha < 0;
  }

  update() {
    this.x += this.vx;
    this.y += this.vy;
    this.alpha -= 5;
  }

  show() {
    push(); // isolate this particle's transform
    stroke(255, this.alpha);
    strokeWeight(this.triangleSide / 50);
    fill(255, 30, 0, this.alpha);
    let theta = map(this.y, 0, height, 0, TWO_PI * 50);
    translate(this.x, this.y);
    rotate(theta); // spin according to the y position
    triangle(-this.triangleSide / 2, this.triangleSide * sqrt(3) / 4, 0, -this.triangleSide * sqrt(3) / 4, this.triangleSide / 2, this.triangleSide * sqrt(3) / 4);
    pop();
  }
}


The fire effect is a particle system that I created by customizing the code from this tutorial by Daniel Shiffman. The particle, in my case, is an orange triangle that rotates according to its y position. The fireParticle function lets me attach the particle system to any data point provided by PoseNet, by indicating the first and last indices of the 17 pose keypoints that I want as a basis for the fire effect. That allows me to easily play with different fire effects depending on the moment of the narrative that I’m in.
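The lifecycle of these particles can be checked without any drawing at all. This is a plain-JS reduction of the class above (my own stripped-down naming, with the random() velocities replaced by fixed values so the motion is reproducible):

```javascript
// The particle lifecycle without the p5 drawing calls: alpha starts at
// 255, fades by 5 per update, and the particle is recycled once
// finished() turns true (alpha below 0).

class FireParticleCore {
  constructor(x, y, vx = 0.5, vy = -3) {
    this.x = x;
    this.y = y;
    this.vx = vx;
    this.vy = vy; // fire drifts upward (negative y)
    this.alpha = 255;
  }
  finished() {
    return this.alpha < 0;
  }
  update() {
    this.x += this.vx;
    this.y += this.vy;
    this.alpha -= 5;
  }
}

const flame = new FireParticleCore(100, 200);
let steps = 0;
while (!flame.finished()) {
  flame.update();
  steps++;
}
console.log(steps);  // 52 updates before the particle is removed
console.log(flame.y); // 200 + 52 * -3 = 44
```

So every particle lives exactly 52 frames, which bounds how many particles fireParticle() keeps alive at once.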

function ellipsoidSeed() {
  let d = dist(pose.leftWrist.x, pose.leftWrist.y, pose.rightWrist.x, pose.rightWrist.y);

  graphics.background(0, 0, 0, 0);

  graphics.fill(255, 255, 255, 0);
  graphics.stroke(255, 255, 255);
  graphics.translate(0, 0, 2 * d);
  graphics.rotateY(angle); // spin the seed
  graphics.ellipsoid(50, 40, 50);

  angle += 0.01;

  fill(0, 255, 0);

  image(graphics, (pose.leftWrist.x + pose.rightWrist.x) / 2 - width / 2, pose.leftWrist.y - height / 2);
}


The x position of the rotating ellipsoid seed is the middle point between the leftWrist and rightWrist points, and the y position follows the height of the leftWrist. I’m working with the createGraphics() function, which allows me to overlay a WebGL canvas on my 2D sketch. I’m then calculating the distance between my left and right wrists and applying it to the z position of this canvas, which creates the illusion that I’m changing the size of the ellipsoid while moving it back and forth. That is what allows me to invite the audience to enter the ellipsoid, by positioning it right at the limit of the z-axis. The challenge here is that I need to adjust these relations every time the canvas changes its size, or according to the size of the space where I perform and how far I am from the webcam.

the audience is invited to see how the seed is structured from the inside

final considerations and next steps

This process has been really rewarding for me, because I feel that a lot of my past experiences, and the new things I learned at ITP, are being combined in a way that fills me with joy and good challenges. I’ve been feeling very encouraged to keep going after the amount of positive feedback I’ve received when sharing it in three different classes and talking to friends who are part of the dance community in Brazil. There is a lot of room for future collaborations within the scope of this project. The Covid-19 context forced me to start thinking about the online space as a place to present live performance, and I want to keep going with this idea.

On one side, I envision a physical space setup, with more of a computer vision approach, where the audience would wear augmented reality headsets and stand all around me while I dance and guide them through the world of the Firebird. And I still want to try it. On the other side, I really appreciate that I’m creating this project with relatively accessible equipment and only open-source libraries and software. That adds a lot to my concept and will allow me to teach these techniques in more contexts in the future, especially in the Global South, where the Firebird comes from.

I’ve been using a song called Cravo e Canela, from the album Araçá Azul, released in 1972 by Caetano Veloso. I feel that this is a great starting point and reference, but the project needs an original soundtrack, perhaps with gesture- and sound-reactive graphics and spatial sound effects.

The body/choreography needs to develop further; the design of the Firebird’s AR costume can be more elaborate, and so can the interactivity of each visual graphic, although I like finding simplicity in complexity in this project. I’m creating a live web-based performance and designing a character, but I’m also embodying a human-computer interface experiment and exploring future possibilities for controlling computer graphics with new machine learning APIs. That is somehow part of the conceptual development of the choreography, and points me toward a more accessible aesthetic rather than a virtuosic one.

what does the spirit of the fractal tree have to tell me?
