Project update

To come up with this week’s MVP, I wanted to test the technology behind my project idea.

The list of things that should work together was nothing too complicated at first sight:

  • Arduino
  • Analog sensors (attachment to hand to be defined)
  • MIDI output from the Arduino
  • Software synthesizer or other host

Sensing “hand movement” is too broad a definition. Initial tests were made with the idea of measuring flexion/extension at the wrist. I made a test video and processed it a bit to understand that movement, but it didn’t provide much help.

Another test video is in the making, with color markers on different tracking points along the forearm, wrist and hand. Hopefully that will give more insight into where data is best captured.

video-test-01

Watching that video gave me the idea to harvest motion from another part: thumb/index displacement.

Observing those points led to the idea that almost an inch of movement could be captured there without interfering with the guitar playing, and possibly even augmenting it.

thumb-index


At the same time, pressure was captured at the thumb/index grip on the guitar pick. This was done using an FSR sending analog values into the Arduino. The wiring was the standard one for an analog sensor, and for testing the FSR was sandwiched between the guitar picks.

guitar-pick-fsr


arduino-fsr
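
The exact test sketch isn’t included here, but a minimal read loop for this kind of wiring would look roughly like the following. The analog pin and the serial-monitor output are assumptions; the post only says the wiring was the standard one for an analog sensor.

```cpp
// Minimal sketch for checking the raw FSR values.
// Assumed wiring: FSR in a voltage divider feeding analog pin A0
// (the pin choice is an assumption, not from the original post).
const int FSR_PIN = A0;

void setup() {
  Serial.begin(9600);                    // plain serial output, just for inspection
}

void loop() {
  int pressure = analogRead(FSR_PIN);    // 0-1023 on the 10-bit ADC
  Serial.println(pressure);              // watch the values in the Serial Monitor
  delay(10);
}
```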

The most challenging step so far was getting MIDI to work. MIDI was chosen because it’s a standard for musical interfaces; however, there were some steps I didn’t realize in advance were necessary to make it work. My code so far reads the sensors and writes bytes to the serial port; the stream is then captured by another piece of software called Hairless, which takes that data and outputs it as a standard MIDI signal within the computer.
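
As a rough sketch of that step: read the FSR, scale the 10-bit reading down to MIDI’s 0-127 range, and write a three-byte Control Change message to the serial port for Hairless to pick up. The pin, the CC number and the baud rate are assumptions (115200 is the Hairless default), not necessarily what my exact code uses.

```cpp
// Sketch of the MIDI-over-serial step (assumed pin, CC number and baud rate).
// Hairless MIDI<->Serial Bridge reads these raw bytes from the serial port
// and forwards them as regular MIDI messages on the computer.
const int FSR_PIN = A0;
const byte CC_STATUS = 0xB0;   // Control Change on MIDI channel 1
const byte CC_NUMBER = 1;      // any controller number the MIDI host can learn

void setup() {
  Serial.begin(115200);        // must match the baud rate set in Hairless
}

void loop() {
  static byte lastValue = 255;               // out-of-range value forces the first send
  int raw = analogRead(FSR_PIN);             // 0-1023
  byte value = map(raw, 0, 1023, 0, 127);    // scale down to MIDI's 7-bit range
  if (value != lastValue) {                  // only send when the value changes
    Serial.write(CC_STATUS);
    Serial.write(CC_NUMBER);
    Serial.write(value);
    lastValue = value;
  }
  delay(10);
}
```

Sending only when the value changes keeps the stream from flooding the MIDI host with duplicate messages.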

I could successfully map that signal into Guitar Rig, an effects processor for guitar, and control several parameters with the FSR while playing.

Next steps:

  • Continue testing on motion harvesting
  • Analyze the captured data
  • Develop a stable prototype (no more breadboard for the FSR)


2 Replies to “Project update”

  1. Really interesting so far. I’m not sure I understand the motion harvesting idea, and how it would work without harming the guitar playing. The FSR pick sandwich does seem like a great means of data acquisition though – how noisy is the signal coming out? Can you discern different types of playing (one string vs. strumming across them)? Also let’s clarify the objective a bit. It looks like you’re moving towards a data acquisition technique that could potentially be used to augment playing – or maybe as a teaching tool? Or input to the environment in some way?

    1. Thanks for your comments and questions!

      I think the way I used the term “harvesting” is misleading… some things get lost in translation. What I meant was figuring out ways of capturing movement, or rather, changes in movement within the hand, while playing. I’m putting great effort into avoiding any interference with the guitar playing itself; I’m trying to be as transparent to the player as possible.

      As for the noise, the sensor is analog but MIDI is digital, so all the values were divided by 4 before sending them to the MIDI host. Although 127 possible values might seem a low count for audio applications, my tests showed they worked fine. I think this depends on the sound filter or audio generator controlled by the MIDI signal, rather than on the noise or the resolution of the signal itself (at least for this application).

      I’m not able to detect different types of playing, but I’m not aiming for that, at least not for now. I do want to try some comparative tests that I might need to explain further in class.
