heartBeats — Final

 

Background

I decided to use the anatomical heart over the traditional heart symbol as the core visual of my project because the human heart carries personal vulnerability and significance for me.

Although I am uncomfortable with the human heart, it's still a fascinating organ; we lead with the heart, give from the kindness of the heart, and develop love and hate from it. In other words, emotions are tied directly to the heart.

On the other hand, a weak heart is debilitating. I know this firsthand because my family has a history of heart issues, and I've always associated the heart with fear, inconsistency, and the unknown. My exposure to several heart-related deaths has provoked negative thoughts about this beautiful, vital organ we call the heart.

I want to shift my negative feelings about the heart and instead embrace it with love, forming a pure, open, and fearless connection. I drew on my love of music, as both a creator and a listener, to create heartBeats. It has four valves with push buttons attached and an FSR sensor in the body of the heart. I adjusted my final design based on user feedback, particularly the comment about adding a clear marker to give the end user a "clue," or directions, for playing the instrument. I also scrapped my stretch-sensor valve idea at the final hour and installed tactile buttons with colorful button caps instead. Personally, I don't believe the valves are as visually pleasing now, but the buttons worked out so much better than the stretch sensors.

I used Ableton Live to map the push buttons to a drum rack with five sounds: a guitar loop, sound FX, a clap, and a bass loop on the four push buttons, plus a drum (heartbeat) on the FSR sensor.


 

Initially, I planned on adding a potentiometer for effects/tempo but I changed my mind.

I created variables for the notes in Arduino and assigned them based on a MIDI Note Number Reference Table.
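The note-number math behind that table can be sketched quickly. This is a Python model for clarity (the actual project does this in an Arduino sketch), and the specific notes assigned to each control are assumptions, based on Ableton drum racks commonly starting their pads at MIDI note 36:

```python
# Sketch: assigning MIDI note numbers to the heartBeats controls.
# The real project does this in an Arduino sketch; this model just shows
# the note-number arithmetic. The control-to-note mapping is hypothetical.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_note(name, octave):
    """Note name + octave -> MIDI note number (middle C, C4, = 60)."""
    return 12 * (octave + 1) + NOTE_NAMES.index(name)

# Hypothetical mapping for the four push buttons and the FSR sensor,
# filling consecutive drum-rack pads starting at note 36:
controls = {
    "guitar loop": midi_note("C", 2),   # 36
    "sound FX":    midi_note("C#", 2),  # 37
    "clap":        midi_note("D", 2),   # 38
    "bass loop":   midi_note("D#", 2),  # 39
    "heartbeat":   midi_note("E", 2),   # 40 (FSR sensor)
}
print(controls)
```

Each button press would then send a note-on for its number, which the drum rack translates into the mapped sound.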

Arduino code:

Who is this for and how is it used?

This project was created to express a connection to the heart through what moves me. It may have grown from my own needs, but I feel others might be able to heal with it as well. Holding and creating heartBeats allowed me to cope with my fears and vulnerabilities. I hope it can be used as a music-enabled stress device (a stress ball of sorts), or even in live performance in a hospital or on stage. Additionally, heartBeats could be used as an educational tool to describe heart functions and/or ailments.

Future iterations:

I’d like to build a 3D, human-sized heart with vibrant blue and red colored veins (silicone already purchased!)

 

  • More sensors – buttons and knobs to control functions like effects and tempo.
  • I want to make use of the Pulse Sensor monitor I purchased but never used for this project.
  • Utilize sample packs from Ableton to create a true MIDI controller experience.
  • Make heartBeats wireless.
  • Add LED lights and map to tempo.

I faced many fabrication challenges with heartBeats, but everything aligned once I scrapped the stretch sensors for push buttons. I had envisioned a musical stretching instrument since the project’s inception, and I spent a lot of time developing it without realizing the idea was hindering my creative process. Sometimes, less is better.

 

Lastly, I used a wooden case enclosure to conceal all of the wires and the Arduino. I wanted to laser-cut a traditional heart symbol on the box to add the symbolic expression of the heart as it sheltered and protected heartBeats, my version of the human heart…

 

Overall, I had a lot of fun working on this project and the goal is to work on new iterations in the near future.




But what do you feel….

User testing results:

User testing was beneficial and extremely insightful; I was able to see how people reacted to my instrument and how they expected it to work, and I received critical feedback I had never considered during the development and design stages of heartBeats. I wrote instructions and asked people to read them and interact with the interface. Because the instrument wasn’t completely functional, I walked each user through how it worked. I had mapped Ableton to play a few drums from the stretch sensor, and the FSR sensor worked perfectly, but the heart wasn’t connected for testing.

Several people asked, “How does this make you feel?” and “Why did you choose a heart?” I had never communicated this in detail before because I assumed the symbolism of the heart was enough. I was wrong, and the feedback provided much clarity about what I need to do next. For the presentation and final blog post, I’ll explain the importance of the heart and music to me, and why I chose an anatomical heart over a conventional heart symbol.

A list of the comments from user feedback:

“It’s scary, and too small”

“Try using a shield or structure surrounding the heart to show protection and vulnerability if that’s what you want to express. But you need to communicate what this means to you – this heart, I really like it but explain why you like it”

“Interesting choice”

“This is a dog’s toy!”

“I like the idea – nice to touch and its a great musical interface but its a bad model and design. The valves are too close together for stretch sensors, try conductive thread”

“What is the connection?  this seems jarring and that could be a good thing. Try harsh or disturbing sounds for pulling the valves”

“I want to keep squeezing this thing, find a way for something to change when you squeeze this”

“Integrate real heart beat sounds, why the heart? What is the motivation and what does it represent?”

“I love it but add clues in the valves. How will I know the valves are interactive? Show me what I can do by looking at it. I need triggers”

“How are you using the valves that don’t work? What will they do?”

I also asked each person how they felt when they played with heartBeats. Most people enjoyed the idea, but it was clear they wanted to understand my personal connection with the interface and why I decided to play with a realistic heart. I get it; I’ll explain soon.

Heartbreaks and 808’s

This week didn’t go as planned at all. I purchased a few new props to install the Teensy (the older prototype was destroyed) but continued to run into issues securing the conductive rubber cord stretch sensors. The stretch sensor became the focal point of this project. During my brainstorming phase, I remembered reading The Importance of Parameter Mapping in Electronic Instrument Design by Andy Hunt in week 5 or 6 of class. It was important to me because when Hunt conducted multiple user tests on musical instruments, he found that users were more connected and engaged when energy was required to operate an instrument; those users reported a better overall response and interactive experience. This prompted me to consider how exerting energy motivates me when I use a device or instrument, and it became the core influence on how I chose to build my project. It made sense to design a creative tool that encourages a continuous input of energy. Music means a lot to me, and I decided to use a heart because it is a universal sign of love. The heart also represents a significant, life-changing event for my family, and I wanted to express the importance of life and its symbolism using an anatomical heart.

There are three valves, each playing a single instrument mapped in Ableton Live. A conductive rubber cord connects to each valve, and pulling it starts an instrument in a continuous loop. Valve one controls the drums, valve two the bass, and valve three the sound effects. I attached an FSR sensor in the middle of the heart to start the melody when physical pressure is detected. This completes the beat, hence the name of the project, “heartBeats”. I got risky and added a pulse sensor I had purchased during the early stages of the project; it will use the end user’s beats per minute (BPM) to control the tempo of the music. Ultimately, I decided to test push buttons as an alternative to the valves, but I’m completely dissatisfied with this option and plan to revert to the conductive rubber sensors.
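One way to get a clean pull or squeeze out of a noisy analog sensor is hysteresis: fire once when the reading crosses a high threshold, and re-arm only after it falls back below a lower one. This is a minimal Python model of that idea, not the project's actual Arduino code, and the threshold values are assumptions:

```python
# Sketch: turning a noisy analog reading (stretch sensor or FSR) into a
# single clean trigger using hysteresis. On the Teensy this logic would
# sit in loop() around analogRead(); thresholds here are hypothetical.

HIGH_T = 600  # fire when the reading rises past this
LOW_T = 400   # re-arm once the reading falls back below this

def make_trigger():
    armed = True
    def step(reading):
        nonlocal armed
        if armed and reading >= HIGH_T:
            armed = False
            return True          # send one MIDI note-on
        if not armed and reading <= LOW_T:
            armed = True         # ready for the next pull or press
        return False
    return step

trigger = make_trigger()
readings = [100, 650, 660, 300, 700]
fires = [trigger(r) for r in readings]
print(fires)  # [False, True, False, False, True]
```

Holding the valve stretched (the 660 reading) does not re-fire; the sensor has to relax before the next pull counts, which keeps one gesture from spraying repeated notes.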

 

 

Setbacks:

 

  • The conductive cords aren’t secure at all. They keep popping out of the jumper wires, which makes it extremely difficult to get a reading from even the slightest pull.

  • I struggle with fabrication, but understanding this now is actually beneficial because I can use it as a motivator to make at least one beginner’s fabrication class a priority!
  • Mapping MIDI instruments was challenging. I had to manually map each drum-rack pad to a note in Arduino via Ableton.

What heartBeats sounds like now:

 

What I plan to figure out before the final presentation:

  • How to make the conductive cords secure!
  • heartBeats playing drums, bass, sound effects, and melodies harmoniously.
  • Adding multiple clips (instruments/sounds) for each valve.
  • Pairing pulse monitor (human BPM) with tempo.
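For the last item, one simple way to pair a pulse reading with tempo is to clamp the measured BPM into a usable musical range before handing it to Ableton. This is a sketch of that idea only; the range bounds are assumptions, not values from the project:

```python
# Sketch: mapping a pulse-sensor BPM to a song tempo. The clamping
# range is an assumption -- the point is that resting heart rates
# (roughly 60-100 BPM) land directly in a comfortable tempo range,
# while outliers don't drag the music to unusable speeds.

def bpm_to_tempo(bpm, lo=60, hi=160):
    """Clamp a measured heart rate into a usable tempo range."""
    return max(lo, min(hi, bpm))

for bpm in (45, 72, 190):
    print(bpm, "->", bpm_to_tempo(bpm))
```

A calm listener's heartbeat would set a laid-back tempo, and an excited one would push the music faster, up to the cap.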

To be truthful, I produce work… WEEKLY. Honestly, my error has been not thoroughly documenting my failures, which are small victories. Each week, although frustrating, brings me closer to my end goal and ultimately to what I envisioned from the beginning: a functional, engaging instrument that is symbolic to me and encourages the use of ENERGY!

Heart Project

This week I ran into a lot of challenges with my conductive rubber cord stretch sensors. I used a heart plush prototype to test the resistance of an FSR sensor, and hot glue/cyanoacrylate glue to bond the rubber cord to the fabric “valves”. Initially this worked well, but the jumper cables began to snap under medium force.


 

 

 

Mapping:
The FSR sensor will start the melody mapped from Ableton Live. I created a scene with several tracks, and each track has four different clips.

FSR Sensor:  Melody
Valve 1: Drums
Valve 2: Bass
Valve 3: Sound Effects
Valve 4: Vocals

Process:
The end user squeezes the heart to start the melody (loop).
The user pulls valve one for drums. The drums loop until the user pulls valve one again to select a different drum pattern. This repeats for each valve until a cohesive beat is created.
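The pull-to-cycle behavior above can be modeled as a tiny state machine: each valve keeps an index into its four clips and advances (with wraparound) on every pull. The clip names below are placeholders, not the project's actual Ableton clips:

```python
# Sketch: the valve interaction as a state machine. Each pull advances
# a valve to its next clip, wrapping after the 4th -- matching "pull
# valve one again to select a different drum pattern". Clip names are
# hypothetical placeholders.

CLIPS = {
    "valve 1": ["drums A", "drums B", "drums C", "drums D"],
    "valve 2": ["bass A", "bass B", "bass C", "bass D"],
}

state = {name: -1 for name in CLIPS}  # -1 = nothing playing yet

def pull(valve):
    """Advance the valve to its next clip and return what now plays."""
    state[valve] = (state[valve] + 1) % len(CLIPS[valve])
    return CLIPS[valve][state[valve]]

print(pull("valve 1"))  # drums A
print(pull("valve 1"))  # drums B
```

In Ableton terms, each pull would fire the next clip slot in that valve's track, and Session View handles stopping the previous clip on the same track.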

Ultimately, I’d like to use silicone to sculpt an anatomical heart, but it’s been difficult finding a realistic (large) heart mold online.

 

UPDATE:
LALA LAB Idea Poster for heartBeats. Valve vessel music structure

Interaction I

I’ve been stuck for some time trying to figure out exactly what I want out of this project. What should it look like? How does it feel? How long will it take to build? I read Interaction Design Sketchbook by Bill Verplank for clarity, and I realized I needed to shift my thinking. I’m not a designer, so when I think in that capacity I overthink and waste a significant amount of time. I’m a thinker, a creative, which means my safe space is often outside of the box, and at times there isn’t even a box!

“Questions of Interaction Design” Bill Verplank

 

I’m approaching this project from a different perspective, with a focus on “How do you DO? How do you FEEL? How do you KNOW?”, placing emphasis on how it works and less on aesthetics at this stage.

How do you DO? I want buttons for full control and independence. The user should effortlessly press and play on command, without much thought or complexity.

How do you FEEL? How does this device communicate with the end user? I’d like to evoke emotions in real time. If the end user is sad, the system should reflect sadness: an emotion push button/sensor (the “How do you DO”) triggers minor or major scales. For example, if a user is happy, the sensor detects happiness and renders one of the 12 major scales:

Happy = output C Major (C, D, E, F, G, A, B, C)
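The scale rendering itself is just interval arithmetic. A minimal sketch of that step (the emotion detection is out of scope here; the mood-to-scale pairing is my assumption, while the interval patterns are standard music theory):

```python
# Sketch: rendering a major or natural minor scale from a detected mood.
# Interval patterns are standard; the happy->major / sad->minor pairing
# is an assumption from the example above.

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR = [0, 2, 4, 5, 7, 9, 11, 12]          # whole/half-step pattern
NATURAL_MINOR = [0, 2, 3, 5, 7, 8, 10, 12]

def scale(root, intervals):
    """List the note names of a scale built on the given root."""
    i = NOTES.index(root)
    return [NOTES[(i + step) % 12] for step in intervals]

def mood_scale(mood, root="C"):
    return scale(root, MAJOR if mood == "happy" else NATURAL_MINOR)

print(mood_scale("happy"))  # C major: C D E F G A B C
```

Transposing is free: `mood_scale("happy", "G")` yields G major, so one sensor reading could pick both the mood (scale quality) and a root note.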

How do you KNOW? I’d like a path-based system (without steps). I want this experience to be an ongoing process: actionable, experimental, expressive, and free. Move at your own pace and ability.

7-axis dimension space diagram

Priority features: ease of use, mobility, and independence.

Target audience: open.

 

 

 

Project Prompt Revised ver. 2

Harmony: Make music

I made music with chord progressions, arpeggios, and chords using Ableton and its learning tools, Hookpad, and Chrome Music Lab. Hookpad had a steep learning curve, but it was fun and engaging. I really enjoy the Chrome Music Lab tool, but unfortunately it doesn’t have a save option.

I have an idea of what I want in my mind, but it’s been difficult translating it into an object. I thought about several instruments: a tool for songwriters, a music-generating wearable, and plenty of synthesizers. Ultimately, I’d like to build a musical instrument for an artist to use on stage as a performance piece.

Interactions: push buttons to play various instruments (sounds); compact, user-friendly, lightweight.

 

Project prompt revised

I changed my original idea for something more tactile. I was inspired by the video below:

https://www.youtube.com/watch?v=giG-goxwfGk

 

Musical user Path:

I’d like to design a wearable MIDI-based instrument that works with standard controls and allows the performer to create an interactive live experience. This instrument will be designed for and used by a close friend who is an artist and nudist. She’s deeply into yoga and body movement; ideally, the wearable will be a form-fitting body suit that lets her use her entire body as a musical instrument while moving freely and expressively to create sound.

Aural mood board:

I used the track “cool like dat” because it has many of the musical elements I’m interested in using for the body suit.

https://www.youtube.com/watch?v=SPS5GPyshyY

 

Cluster Analysis

Song – “Before I Let Go” by Frankie Beverly & Maze

 

Oblique Strategy “Cluster Analysis”

Group the rhythm, chordal and melodic elements

Body movement

 

Scene: Prospect Park – Barbecue area

As users move about the space, they hear different clusters of the song “Before I Let Go”. Each musical element is tied to its corresponding barbecue counterpart, and different elements begin to play as users walk throughout the designated area.

For example, users are instructed to walk toward the main component, the BBQ grill, as they enter the space. The lead vocals start at 0:00, followed by the next attribute, until all instruments are playing harmoniously.

 

Instruments:

Barbecue grill – lead vocals
Sprinkler – hi-hats
Kids playing / women laughing – tambourine
Birds chirping – multiple synthesizers
Dogs barking – background vocals
Utensils, plates, napkins – drums (kick, snare, toms, hi-hat, cymbals)
Picnic table – Rhodes keyboard
Old men playing cards – bass
Trees, grass – rhythm guitar / electric guitar
Red cups – hand claps
Tupperware – bongos

Project Prompt

Honestly, I haven’t really thought much about a project. I have so many ideas, but I’d like the core of my project to produce some sort of data output. It took me a long time to call myself an artist, particularly because I’ve had writer’s block for some time now — a few years (more than I’d like to admit). But I AM an artist, and if I were active, I’d personally find a device that could break writer’s block extremely useful. A system or tool enabling an artist to input lyrics and receive rhyming output in return would be pretty amazing.

I completed an excellent Python programming class (Reading and Writing Electronic Text) with Allison Parrish a few semesters ago. We had a session dedicated to sound symbolism and the pronouncing library, and I was really interested in its rhymes function. I’d like to integrate that function into my project, if possible.
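The real lookup would come from the pronouncing library's pronunciation data (e.g. `pronouncing.rhymes("heart")`). As a toy stand-in, the lyric-in, rhymes-out loop can be sketched by matching spelled word endings against a word list; this is purely illustrative, since spelling is a poor proxy for actual rhyme:

```python
# Toy sketch of the rhyming-lookup idea. In class we used Allison
# Parrish's pronouncing library; this stand-in matches spelled endings
# against a tiny hypothetical word list, so it is illustrative only --
# real rhymes come from pronunciation data, not spelling.

WORDS = ["heart", "art", "start", "chart", "beat", "heat", "love"]

def rough_rhymes(word, tail=3):
    """Return list words sharing the last `tail` letters with `word`."""
    ending = word[-tail:]
    return [w for w in WORDS if w != word and w.endswith(ending)]

print(rough_rhymes("heart"))  # ['art', 'start', 'chart']
```

A writer's-block tool would take the last word of an entered lyric line, run a lookup like this (ideally the pronunciation-based one), and offer the candidates back as prompts.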