Pcomp – midterm project documentation

Finding Words

My physical computing midterm is an ongoing project. I’m creating a performative poem about a cyborg bird who died in the middle of the city. Where the bird died, a tree grew, because the bird had an invisible seed inside their belly. What is this seed made of, and what does this tree look like?

The first version of the wearable LED strip

To tell this story I will use Gestures (Dance), Light (LED), Words and Sound (Voice, and maybe other effects). To do so, I want to create a wearable for my hands with LEDs that change color according to the gestures I make. I’m going to start with five words/gestures: Fire, Bird, Tree, Seed and Death.

The video below shows how far I got by my midterm deadline. Using Arduino, I created a physical interface with two switches and an LED strip containing the same number of LEDs I’ll have on my hands. Using serial communication, the code is connected to an interactive p5.js interface that I designed.

  • The initial state of the strip is blinking white, and on the screen you see random letters floating and vibrating around the canvas.
  • If you press one button, the strip turns blue and you can read bird on the screen;
  • if you press the other button, the strip turns green and you can read tree on the screen;
  • if you press both switches together, the strip turns red and you can read fire on the screen.
Video showing the switches simultaneously changing the color of the LED strip and the words on the screen
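The mapping the two switches implement can be sketched as a small, hardware-free JavaScript function (the function name is mine; the numeric codes 0/2/3/4 mirror what the Arduino code below prints over serial):

```javascript
// Hypothetical helper mirroring the switch logic of the Arduino sketch:
// maps the two finger switches to the code sent over serial.
// 4 = fire (red), 2 = tree (green), 3 = bird (blue), 0 = idle (blinking white).
function serialCode(indexPressed, thumbPressed) {
  if (indexPressed && thumbPressed) return 4; // both switches: fire
  if (indexPressed) return 2;                 // index finger: tree
  if (thumbPressed) return 3;                 // thumb: bird
  return 0;                                   // no input: idle
}
```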

Arduino code

To communicate with the LEDs I started studying the NeoPixel library. Some of my classmates advised me to use the FastLED library instead, so I may study that option too. Thanks to Max da Silva, who helped me a lot with this code.

// Arduino code for the project Finding Words

#include <Adafruit_NeoPixel.h>

// Which pin on the Arduino is connected to the NeoPixels?
#define LED_PIN     6

// How many NeoPixels are attached to the Arduino?
#define LED_COUNT  8

Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

const int indexSwitch = 2; //define the normally opened switch activated by the index finger in the PIN 2
const int thumbSwitch = 3; //define the normally opened switch activated by the thumb finger in the PIN 3

void setup() {

  Serial.begin(9600); //start serial communication with the p5.js interface

  pinMode(indexSwitch, INPUT); //define the PIN as an INPUT
  pinMode(thumbSwitch, INPUT); //define the PIN as an INPUT

  strip.begin(); // Initialize NeoPixel strip object
  strip.setBrightness(10); // Set BRIGHTNESS low (10 out of a max of 255)
  strip.show(); // Initialize all pixels to 'off'
}


void loop() {

  if (digitalRead(indexSwitch) == HIGH && digitalRead(thumbSwitch) == HIGH) { //if both switches are pressed

    //fire led effect - color red
    uint32_t red = strip.Color(255, 0, 0);
    colorWipe(red, 0);

    Serial.println(4); //write the number 4 in the serial monitor

  } else if (digitalRead(indexSwitch) == HIGH) { //if the index finger button is pressed
    //tree led effect - color green
    uint32_t green = strip.Color(0, 255, 0);
    colorWipe(green, 0);
    Serial.println(2); //write the number two in the serial monitor

  } else if (digitalRead(thumbSwitch) == HIGH) {
    //bird led effect - color blue
    uint32_t blue = strip.Color(0, 0, 255);
    colorWipe(blue, 0);
    Serial.println(3); //write the number 3 in the serial monitor

  } else {
    blink(); //blink white if none of the buttons are pressed
    Serial.println(0); //write zero in the serial monitor
  }
}

//define how the blink function works
void blink() {
  uint32_t white = strip.Color(255, 255, 255);
  colorWipe(white, 0);
  delay(200); //hold the white for a moment so the blink is visible
  uint32_t off = strip.Color(0, 0, 0);
  colorWipe(off, 0);
  delay(200);
}

//fill the strip one pixel at a time with the given color
void colorWipe(uint32_t color, int wait) {
  for (int i = 0; i < strip.numPixels(); i++) { // For each pixel in strip...
    strip.setPixelColor(i, color);         //  Set pixel's color (in RAM)
    strip.show();                          //  Update strip to match
    delay(wait);                           //  Pause for a moment
  }
}

Wearable and other materials research

Materials used during the process

In general, I had a lot of trouble building my wearable. First of all, I chose to work with reflective fabric, which is very complicated to sew because it’s too thick, although it’s beautiful and connects perfectly with my concept, which will explore themes around mineral extractivism. Thus, I will continue working with it, but I need to find a more sewable version of it.

After not being able to sew the reflective fabric, I used a generic fabric from our soft lab at ITP to build the first prototype. By the way, Idith, who is responsible for this space, helped me a lot with advice and by lending me materials such as conductive fabric and thread. The thread I had bought was of really bad quality – it is important to check a thread’s conductivity before using it, and to sew every line twice to improve the quality of the current you get.
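The advice about sewing each line twice can be reasoned about with the parallel-resistor formula: two identical thread traces side by side act as resistors in parallel and halve the trace’s resistance. A small sketch (the function name and the 10-ohm example value are mine, not measured values):

```javascript
// Two conductive-thread traces sewn along the same line behave like
// resistors in parallel: R = (r1 * r2) / (r1 + r2).
function parallelResistance(r1, r2) {
  return (r1 * r2) / (r1 + r2);
}

// e.g. doubling a hypothetical 10-ohm seam gives a 5-ohm trace
```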

I chose to work with individual sewable NeoPixels by Adafruit to have more freedom to place the LEDs where I want, and to have a more flexible and convenient base for them compared to regular LED strips. This way, I’ll have a more organic and adaptable format for my hands and gestures.

Myself presenting the project in class. Photo by Tianjun Wang

computational media – week 6 – midterm project

Finding Words

video showing the visual poetry moving

For my ICM midterm project, I created my first experiment with coding and words. This is a kind of interactive poetry that I’m working on to combine with physical computing, in a way that each word is going to be activated by gestures that press wearable switches. In the future, this is going to be presented as a live storytelling performance.

This is a work in progress. Find the commented code from before I included the serial communication below; try it or edit it in the p5.js web editor here; and find the code from after I included the serial communication, which activates each word with a different button, here.
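For the serial version, the sketch needs to turn the incoming number back into a word. A minimal sketch of that decoding, assuming the codes the Arduino side prints (0, 2, 3, 4) – the function name is mine:

```javascript
// Hypothetical decoder for the serial version: turns the number the
// Arduino prints into the word to display (assumed codes: 2, 3, 4).
function wordForCode(code) {
  const words = { 2: "tree", 3: "bird", 4: "fire" };
  return words[code] ?? null; // 0 or unknown: no word, show random letters
}
```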

//press the mouse or your keyboard or both together to see or combine the words bird and fire. 

//define each letter of the word as a different piece of data in the array identified by an index number

 let bird = ["b", "i", "r", "d"];
 let fire = ["f", "i", "r", "e"];

 //define the distance between the letters when a word is written
 let betweenletters = 20;

 function setup() {
   createCanvas(600, 400);
   colorMode(HSB, 360, 255, 255, 255); //hue-based color, so fill(random(0, 360), ...) picks a random hue
 }

 //write the word putting its letters from the array together
 function renderWord(word) {
   for (let i = 0; i < word.length; i += 1) {
     fill(random(0, 360), 180, 100, 100);
     text(word[i], (width / 2 - (betweenletters * 2)) + i * betweenletters, height / 2);
   }
 }

 //position each letter of the word randomly on the canvas and change its color randomly over time
 function randomLetters(word) {
   for (let i = 0; i < word.length; i += 1) {
     fill(random(0, 360), 100, 100);
     text(word[i], random(0, width), random(0, height));
   }
 }

 function draw() {
   background(0); //clear the canvas every frame (assumed dark background)

   translate(random(-1, 1), random(-1, 1)); //vibrant visual effect

   if (mouseIsPressed) {
     if (keyIsPressed == true) {
       renderWord(fire); //mouse and key together: the two words combine
     }
     renderWord(bird); //mouse pressed shows bird
   } else if (keyIsPressed) {
     renderWord(fire); //key pressed shows fire
   } else {
     randomLetters(bird); //no input: random vibrating letters
   }
 }
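The x-position expression in renderWord can be checked on its own: with the 600-pixel canvas and a spacing of 20, it lays a four-letter word out around the horizontal center (the helper function name is mine):

```javascript
// x-position of letter i, as computed in renderWord:
// start two spacings left of the horizontal center, then step right.
function letterX(i, width, spacing) {
  return (width / 2 - spacing * 2) + i * spacing;
}

// letterX(0..3, 600, 20) → 260, 280, 300, 320
```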

physical computing – midterm project

Finding Words

I think that some memories need to be preserved as well as some future presences, that don’t exist yet, need to materialize themselves. And to do so, I feel like telling a story. It’s a story about a cyborg bird who died in the middle of the city and, where they died, a tree grew because the bird was carrying an invisible seed inside their belly.

To tell this story I want to create a live performance containing: Gestures (Dance), Light (LED), Words and Sound (Voice, and maybe other effects). Therefore, my Pcomp midterm is going to be the first experience towards that story that I don’t know yet.

Drawing the circuit for the fire bird

My initial plan is to create a wearable for my hands that contains LEDs that change their color effects according to the gestures I make. I’m going to start with two words/gestures: Fire and Bird. The color effects are going to be activated by two different buttons made with conductive fabric, located on different parts of my hands.

As a first try connecting the wearable with P5.js animations, I would like to create a program that writes a different word on the screen according to the gesture I make with my hands.

Materials I need

  • Conductive fabric to create the buttons;
  • Flora RGB Smart NeoPixel version 2 – Sheet of 20;
  • Stainless Thin Conductive Thread;
  • Gloves (think about the fabric/design – use transparency);
  • Microcontroller – Arduino;
  • Power Supply.


Questions

  • Is it better to connect the LEDs as if they were one strip or as two (one for each hand, connected to different digital pins)?
  • How do I use a battery? And what’s the best battery to use in this project?

computational media – week 5

function RainbowTriangle (transX, transY)

This week I learned how to create functions and objects and how they can be applied to create repetitions and modulations. The code I chose to do it with was kind of tricky, because some of the variables I used needed to be defined in the setup function, some in draw, and some outside both, so I’m not sure if I organized it in the best way. Find the code here. Next, I’ll try to apply it to another code I created before – the Rainbow Creature.

Ps: I finally got to create a rainbow gradient animation. To do so, I used colorMode(HSB) and a simple variable that counts up to 360 and then turns into 0 again:

a = a + 1; //increase the hue every frame
if (a >= 360) {
  a = 0; //wrap back to the start of the hue circle
}
fill(a, 120, 180);
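The same wraparound can also be written more compactly with the modulo operator; a small sketch of just the counter logic (the function name is mine):

```javascript
// Advance the hue by one each frame, wrapping from 359 back to 0 –
// equivalent to: a = a + 1; if (a >= 360) { a = 0; }
function nextHue(a) {
  return (a + 1) % 360;
}
```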

physical computing – Learning how to learn and Labs 2 and 3

Arduino, Force Sensor Resistors and LED’s

For the past few weeks, I’ve been struggling to learn Pcomp. I feel that the ability to mix coding with electronics is something that takes a lot of time to assimilate, considering that both are quite new universes for me.

Hence, I started to create new learning strategies to find ways to study on my own and organize information. First, I created a new workflow using the app Notion, which allows me to put together and link, in a very visual and accessible way, all the resources that I need to quickly access when studying for Pcomp. Secondly, I mapped places where I can find the information I need when doing exercises, for example, the books “Learn Electronics with Arduino: An Illustrated Beginner’s Guide to Physical Computing” and “Physical Computing: Sensing and Controlling the Physical World with Computers“, and also the page of the class itself and the Arduino website. Thirdly, I decided to focus on doing all the labs from the syllabus instead of trying to be creative or making up applications for the circuits that I’m learning to build. I also printed a lot of things that I need to memorize, such as Arduino’s functions and basic rules for electronics, and stuck them to the wall of my bedroom, so I can always look at them.

This week I focused on Labs 2 and 3. Lab 2 was about Digital Input, so it proposes using a button to change how LEDs behave. The lab went well; I just don’t understand exactly why I need a pull-down resistor to create a button.

Lab 2 – Button and Led

Lab 3 was focused on Analog Input: first using a potentiometer to change the brightness of an LED, and then swapping the LED for a speaker, so that the analog signal from the potentiometer changes the frequency of the sound.
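The re-mapping at the heart of this lab – taking the potentiometer’s 0–1023 analog reading and scaling it to a brightness or frequency range – works like Arduino’s map() function. A plain JavaScript sketch of that scaling (the 100–1000 output range is an example of mine, not the lab’s exact values):

```javascript
// Linearly re-map value from [inMin, inMax] to [outMin, outMax],
// like Arduino's map() used with analogRead values (0–1023).
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + (value - inMin) * (outMax - outMin) / (inMax - inMin);
}

// mapRange(0, 0, 1023, 100, 1000)    → 100  (knob fully down, lowest pitch)
// mapRange(1023, 0, 1023, 100, 1000) → 1000 (knob fully up, highest pitch)
```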

Lab 3 – Using force sensing resistors to change the brightness of LEDs

The last part of the lab was varying the brightness of two different LEDs in sequence according to the pressure applied to two different Force Sensing Resistors. In general, I understand well how to prototype, but I feel I need to improve my skills in creating and reading code, and get to know better how electricity behaves in relation to data, and vice versa.

computational media – week 3

Collective Artificial Fire Flower

This week, Name, Ray and I created an interactive firework set in p5.js with three different effects, tied to the interactive functions mousePressed, mouseDragged and keyPressed. Although we have different levels of coding knowledge, it was very motivating to work together and come up with solutions for the problems we faced. We basically divided the work so that each of us was responsible for one of the three firework effects, and then we worked together to put the codes together, which turned out to be more challenging than we imagined. Find the final version here and my part of the code (done with a lot of group work) here.

computational media – week 2

Animated Rainbow Creature

Check my first animated rainbow creature here.

  • Element controlled by the mouse: Rotation/Color of the triangle; I tried to position the triangle exactly in the center of the square, but I couldn’t. Here is my try.
  • Element changing over time: the rainbow lines disappear. This is actually a mistake; I was trying to create a rainbow gradient for each line, making it change color forever, but I couldn’t find the correct conditionals.
  • Element changing every time I run the code: the color of the eye.

physical computing – week 2

Switch Lab

This week I created an improvised wearable switch using aluminum foil modeled to fit my fingers. When I touch both fingers together, current flows, activating the circuit and turning on the white LED.

This is a very simple experiment starting to explore the possibility of creating live storytelling performances using special effects activated by the hands and facial expression.

final result

Some brief thoughts and notes about pcomp paradoxes

Last week during the class, Daniel Rozin asked: What’s the price for us to have each day more innovative experiences with Physical Computing? This question keeps resonating in my mind. Furthermore, I thought about two other questions to deepen the reflection: Who has been paying the price? And who gets paid for that?

When we look at “The Treachery of Images”, by René Magritte, on a big screen at NYU, we are seeing an image of a pipe that depends on a long colonial history of violent extractivism and genocide to exist. Actually, going back some levels, closer to the roots of the problem I’m trying to describe, the pipe itself, as well as the tobacco used in it, were technologies created by native South Americans, who were using them to access images and virtual worlds long before we did, and without causing any harm to the environment.

So working with technology, for me, means inhabiting this kind of paradox. We are working to design a better world, more comfortable ways of interacting with machines, or even politically engaged projects, using machines whose fabrication causes much of the harm done to the planet and to society.

Another technology paradox is pointed out by Donald A. Norman in the first chapter of the book “The Design of Everyday Things”, called “The Psychopathology of Everyday Things”. He says that technology offers the potential to make life easier and more enjoyable while, at the same time, adding complexities that increase our difficulty and frustration. Each day we are more dependent on specialized technicians to deal with everyday issues.

I see these paradoxes as challenges and opportunities to think and design collectively different worlds – both in the everyday real world and in fictional worlds – or even better, in the intersection of both. 

computational media – week 1

Final result of the “first creature” drawing

This is my first assignment drawing using p5.js web editor. I started the design of a creature that in the future can become an animation, a performance, or both. She has no name yet and probably she doesn’t speak our language.

about the process….

First of all, it took much more time than I thought to create it using code and I found some challenges in my way. Most of them I could solve using the p5.js online library instructions. For others, I feel the need to go back some steps and learn math again, especially to remember how to deal with graphics, angles, line slopes, triangles, curves, and find specific points in the canvas.

Screenshot of the process

I couldn’t really understand how to position arcs, for instance, so I had to change my initial idea for the head of the creature a bit and leave it without hair (or eyelashes). I also tried to save the image as a .jpg file, but I could only save a completely black or white image, so I took a screenshot to be able to upload it to this blog.

The web editor is great; I can access it from anywhere. I just had one issue with it: after turning on automatic saving, I stopped saving the project manually because I thought it wasn’t necessary anymore. However, for some reason I lost my Wi-Fi signal for a while, and once I came back to the editor I had to refresh the page, which made me lose part of the work. Therefore, I went back to saving manually all the time.

Sketches during the process of creation of the creature

Overall, I discovered that I enjoy doing this, but it’s hard for me, so I need a lot of time to dedicate myself to it and do it calmly, understanding every part of the process of creation. I find it very pleasant to transform math into an image, although it’s difficult and uncomfortable. I also found out that I’m more interested in 3D forms and in creating forms that would be impossible to design in the real world, but I know it will take some time for me to get there.