Heartbreaks and 808’s

This week didn’t go as planned at all. I purchased a few new props to install the Teensy (the older prototype was destroyed) but continued to run into issues securing the conductive rubber cord stretch sensors. The stretch sensor became the focal point of this project. During my brainstorming phase, I remembered reading The Importance of Parameter Mapping in Electronic Instrument Design by Andy Hunt in week 5 or 6 of class. It was important to me because when Hunt conducted multiple user tests on musical instruments, he found that users were more connected and engaged when energy was required to operate an instrument; those users reported a better, more interactive experience overall. This prompted me to consider how expending energy motivates me when I use a device or instrument, and it became the core influence in how I chose to build my project. It made sense to design a creative tool that encourages a continuous input of energy. Music means a lot to me, and I decided to use a heart because it is a universal sign of love. The heart also represents a significant, life-changing event for my family, and I wanted to express the importance of life and its symbolism using an anatomical heart.

There are three valves, each playing a single instrument mapped in Ableton Live. A conductive rubber cord connects to each valve, and pulling it starts an instrument in a continuous loop. Valve one controls the drums, valve two the bass, and valve three the sound effects. I attached an FSR sensor in the middle of the heart to start the melody when physical pressure is detected. This completes the beat, hence the name of the project, “heartBeats”. I got risky and added a pulse sensor I purchased during the early stages of the project: it will use the end user's beats per minute (BPM) to control the tempo of the music. I also tested push buttons as an alternative to the valves, but I’m completely dissatisfied with that option and plan to revert to the conductive rubber sensors.
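The valve logic can be sketched as a simple threshold with hysteresis: a pull registers when the stretch-sensor reading crosses a high threshold, and the sensor must relax below a lower threshold before the next pull counts. This is only a sketch of the approach, with hardware reads omitted; the threshold values and names are placeholders, not my actual calibration.

```cpp
// One valve's state: whether it is currently pulled, and which clip
// in its Ableton track is active.
struct Valve {
    bool stretched = false;
    int  clip = 0;
};

const int PULL_ON  = 700;  // analog reading above this = pulled (assumed value)
const int PULL_OFF = 500;  // must drop below this to re-arm (assumed value)
const int CLIPS_PER_TRACK = 4;

// Called with each new analog reading. Returns true when a fresh pull
// event fires; each pull cycles to the next clip for that valve's track.
bool onReading(Valve &v, int reading) {
    if (!v.stretched && reading > PULL_ON) {
        v.stretched = true;
        v.clip = (v.clip + 1) % CLIPS_PER_TRACK;  // advance to the next clip
        return true;
    }
    if (v.stretched && reading < PULL_OFF) {
        v.stretched = false;  // released: re-armed for the next pull
    }
    return false;
}
```

The gap between the two thresholds is what keeps a jittery reading near the trigger point from firing the same pull over and over.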

 

 

Setbacks:

 

  • The conductive cords aren’t secure at all. They keep popping out of the jumper wires, which makes it extremely difficult to detect a reading from even the slightest pull.

  • I struggle with fabrication, but understanding this now is actually beneficial: I can use it as motivation to make at least one beginner's fabrication class a priority!
  • Mapping MIDI instruments was challenging. I had to manually map each drum pad as a note in Arduino using the drum rack in Ableton.
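The manual mapping above is easier to keep track of once the pad-to-note rule is written down: by default, Ableton's Drum Rack lays its pads out chromatically starting at C1, which is MIDI note 36 in Live's convention. A tiny helper like this (my own sketch, not the code from my project) keeps that in one place:

```cpp
// Ableton Drum Rack default layout: pads run chromatically from C1 upward.
// In Live's octave convention, C1 is MIDI note 36.
const int DRUM_RACK_BASE = 36;

// Pad index (0 = the first/bottom-left pad) to the MIDI note number
// the Teensy should send for that drum sound.
int padToNote(int pad) {
    return DRUM_RACK_BASE + pad;
}
```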

What heartBeats sounds like now:

 

What I plan to figure out before the final presentation:

  • How to make the conductive cords secure!
  • heartBeats playing drums, bass, sound effects, and melodies harmoniously.
  • Adding multiple clips (instruments/sounds) for each valve.
  • Pairing pulse monitor (human BPM) with tempo.
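For the last item, one standard way to pair a BPM reading with Live's tempo is to send MIDI clock, which runs at 24 pulses per quarter note; the interval between pulses in microseconds is 60,000,000 / (BPM × 24). This helper computes that interval as a sketch; actually transmitting the clock bytes is left to the Teensy's MIDI library, and the clamp range is an assumption to keep a noisy pulse reading musical.

```cpp
// MIDI clock runs at 24 pulses per quarter note, so at a given BPM the
// time between clock pulses is 60,000,000 / (BPM * 24) microseconds.
// The clamp bounds (40-180 BPM) are assumed, not measured.
long clockIntervalMicros(int bpm) {
    if (bpm < 40)  bpm = 40;   // floor for implausibly low readings
    if (bpm > 180) bpm = 180;  // ceiling for implausibly high readings
    return 60000000L / (long(bpm) * 24L);
}
```

At a resting pulse of 60 BPM that works out to one clock pulse every ~41.7 ms, and at 120 BPM every ~20.8 ms.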

To be truthful, I produce work…  WEEKLY. My error, honestly, has been not thoroughly documenting my failures, which are small victories in themselves. Each week, although frustrating, brings me closer to my end goal and ultimately what I envisioned from the beginning: a functional and engaging instrument that is symbolic to me and encourages the use of ENERGY!

Heart Project

This week I ran into a lot of challenges with my conductive rubber cord stretch sensors. I used a heart plush prototype to test the resistance of an FSR sensor, and hot glue/cyanoacrylate glue to bond the rubber to the fabric “valves”. Initially this worked well, but the jumper cables began to snap under medium force.


 

 

 

Mapping:
The FSR sensor will start the melody mapped in Ableton Live. I created a scene with several tracks; each track has 4 different clips.

FSR Sensor:  Melody
Valve 1: Drums
Valve 2: Bass
Valve 3: Sound Effects
Valve 4: Vocals

Process:
The end user squeezes the heart to start the melody (loop).
The user pulls valve one for the drums. The drum pattern loops until the user pulls valve one again to select a different pattern. This repeats for each valve until a cohesive beat is created.

Ultimately, I’d like to use silicone to sculpt an anatomical heart, but it’s been difficult to find a realistic (large) heart mold online.

 

UPDATE:
LALA LAB Idea Poster for heartBeats. Valve vessel music structure

Interaction I

I’ve been stuck for some time trying to figure out exactly what I want out of this project. What should it look like? How does it feel? How long will it take to build? I read Interaction Design Sketchbook by Bill Verplank for clarity, and I realized I needed to shift my thinking. I’m not a designer, so when I think in that capacity I overthink and waste a significant amount of time. I’m a thinker, a creative – which means my safe space is often outside the box, and at times there isn’t even a box!

“Questions of Interaction Design” Bill Verplank

 

I’m approaching this project from a different perspective, focusing on “How do you DO? How do you FEEL? How do you KNOW?” and placing emphasis on how it works rather than on aesthetics at this stage.

How do you DO? – I want buttons for full control and independence. The user should effortlessly press and play on command, without much thought or complexity.

How do you FEEL? – How does this device communicate with the end user? I’d like to evoke emotions in real time. If the end user is sad, the system should reflect sadness. An emotion pushbutton/sensor (the How do you DO) triggers minor or major scales. For example, if a user is happy, the sensor detects happiness and renders one of the 12 major scales:

Happy = output C major (C, D, E, F, G, A, B, C)
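All 12 major scales follow the same interval pattern of whole and half steps (2-2-1-2-2-2-1 in semitones), so the sensor only needs to pick a root note and the scale can be generated rather than hard-coded. This is a sketch of that idea under my own naming; a minor mood could swap in a different interval pattern the same way.

```cpp
#include <vector>

// Build the eight notes of a major scale from any root MIDI note,
// using the major-scale step pattern: whole, whole, half, whole,
// whole, whole, half (2-2-1-2-2-2-1 semitones).
std::vector<int> majorScale(int rootMidi) {
    const int steps[7] = {2, 2, 1, 2, 2, 2, 1};
    std::vector<int> notes = {rootMidi};
    for (int s : steps) notes.push_back(notes.back() + s);
    return notes;
}
```

Starting from middle C (MIDI 60) this yields C, D, E, F, G, A, B, C — the "happy" output above.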

How do you KNOW? – I’d like a path-based system (without steps). I want this experience to be an ongoing process: actionable, experimental, expressive and free. Move at your own pace and ability.

7-axis dimension space diagram

Priority features: Ease of use, mobility and independence.

Target audience: open.

 

 

 

Project Prompt Revised ver. 2

Harmony: Make music

I made music with chord progressions, arpeggios and chords using Ableton (and its learning tool), Hookpad and Chrome Music Lab. Hookpad had a steep learning curve, but it was fun and engaging. I really enjoy the Chrome Music Lab tool, but unfortunately it doesn’t have a save option.

I have an idea of what I want in my mind, but it’s been difficult to translate it into an object. I thought about several instruments: a tool for songwriters, a music-generating wearable, and plenty of synthesizers. Ultimately, I’d like to build a musical instrument for an artist to use on stage as a performance piece.

Interactions: pushbuttons to play various instruments (sounds); compact, user-friendly, lightweight.

 

Project prompt revised

I changed my original idea for something more tactile. I was inspired by the video below:

 

Musical user Path:

I’d like to design a wearable MIDI-based instrument that works with standard controls and allows the performer to create an interactive live experience. This instrument will be designed for and used by a close friend who is an artist and nudist. She’s deeply into yoga and body movement; ideally, the wearable will be a form-fitting bodysuit allowing her to use her entire body as a musical instrument while moving freely and expressively to create sound.

Aural mood board:

I used the track “Cool Like Dat” because it has a lot of the musical elements I’m interested in using for the bodysuit.

 

Cluster Analysis

Song – “Before I Let Go” by Frankie Beverly & Maze

 

Oblique Strategy “Cluster Analysis”

Group the rhythm, chordal and melodic elements

Body movement

 

Scene: Prospect Park – Barbecue area

As users move about the space, they hear different clusters of the song “Before I Let Go”. Each musical element is tied to its corresponding barbecue counterpart, and different elements begin to play as users walk through the designated area.

For example, users are instructed to walk toward the main component, the BBQ grill, as they enter the space. The lead vocals start at 0:00, followed by the next attribute, until all instruments play harmoniously.

 

Instruments:

The barbecue (grill) – lead vocals
Sprinkler – hi-hats
Kids playing / women laughing – tambourine
Birds chirping – multiple synthesizers
Dogs barking – background vocals
Utensils, plates, napkins – drums (kick, snare, toms, hi-hat, cymbals)
Picnic table – Rhodes keyboard
Old men playing cards – bass
Trees, grass – rhythm guitar / electric guitar
Red cups – hand claps
Tupperware – bongos

Project Prompt

Honestly, I haven’t really thought much about a project. I have so many ideas, but I’d like the core of my project to produce some sort of data output. It took me a long time to call myself an artist, particularly because I’ve had writer’s block for some time now — a few years (more than I’d like to admit). But I AM an artist, and if I were active, I’d personally find a device that could break writer’s block extremely useful. A system/tool enabling an artist to input lyrics and receive rhyming output in return would be pretty amazing.

I completed an excellent Python programming class (Reading and Writing Electronic Text) with Allison Parrish a few semesters ago. We had a session dedicated to sound symbolism and pronunciation, and I was really interested in the rhymes function we used there. I’d like to integrate it into my project, if possible.