I wanted to give a little more background for my final concept, as I realize my last post was heavy on the technical, but didn’t really reflect why I wanted to do this project.
One thing that has always fascinated me about technology is user interfaces. Throughout my life, I have played with lots of different buttons, sliders, mouses (mice?), keyboards, gamepads, etc. as the available technology got better and better. Each one had its own feel, its own way of engaging with the technology that was unique and gave the individual experiences meaning. For example, I remember the first time I played the computer game Descent using a joystick. Until then I had only played text-based and mouse-based games. I remember having to manually map all the buttons to the joystick for the game, and being extra excited when I set up a special command to fire my missiles from a side button!
During Bootcamp, I was first introduced to the idea of “Creative Coding” when I saw my code instructor Jas make a bunch of balls bounce around on the screen, change color and size, react to the mouse, etc. I was immediately struck by the types of user interfaces we were using, and by how this creativity could unfold exponentially if we could interact with the code on screen in different ways.
A few weeks later, I found myself reading Rainbow’s End by Vernor Vinge. This sci-fi novel takes place in a not-so-distant future where technology and connectivity have become embedded in every aspect of life. The world is no longer just made up of reality, but of several “views” of reality that people can turn on and off in their contact lenses, altering what they see when looking at everyday items. As such, one of the most important things for people to do in this world is create, play with, and contribute to these views. In fact, there are several competing ideologies that operate on different views, almost like separate games/worlds.
Each individual contributes creative code to make these views more engaging, fun, and meaningful. They are judged on their creativity and paid for their contributions. There is even a high school subject called “Creative Composition,” in which students must conduct an interactive performance piece utilizing networked content and their own custom user interface (usually embedded within their clothing) to present an audio/visual/sensory experience through their classmates’ views.
This idea of “creative composition,” combined with what I had learned so far in bootcamp, was EXTREMELY inspiring to me. My whole life I had relied on other people’s interfaces for interacting with my technology – whether games, computers, electronics, etc. Most were great for the task, and some not so much (e.g. trying to play Final Fantasy VII for PC with mouse & keyboard vs. playing on PlayStation). I was an avid Dance Dance Revolution (DDR) geek in high school, played entirely too much Guitar Hero in college, and mastered the Wii controls for both Zelda and Link’s Crossbow Training (with the Zapper) in my first years working. Each interface challenged and delighted me with its possibilities – using your feet, using a weird guitar controller with buttons and levers, using a Wii Remote and Nunchuk to simulate a sword and shield.
Every time I listened to music thereafter, I started imagining how I could compose along to it, what I would create and act out to the music. I first tried to implement something like this for my Major Studio 1 7 in 7 project. At the time, I was particularly inspired by the song “Let it Go” from Frozen. I had been listening to it on repeat ever since my Dad passed away last January, whenever I needed to just let myself vent/get out the sadness & frustration. For my project, I tried to use Processing to make an application where you could use your mouse to draw snowflakes along to the song, kind of like you were composing with it. It came out eh… The mouse just wasn’t a compelling interface for playing along to music, especially for really feeling like you could let out your emotions with it.
When we first started playing with the OSC controls in class, I knew this was what I wanted to do: start building a platform that would let someone create along to music, drawing their own imagination on the screen as they went. At first I thought I would use my Android phone for this – after all, that is what we learned in class – but I still didn’t feel it was compelling enough. I wanted controls that would allow you to feel like you were drawing/composing/creating along with the visuals.
I next looked into picking up a Kinect and seeing what I could do with that. As you know from my post last week, this didn’t end up working either. I was back to the drawing board. Luckily, I ran across some tutorials for connecting the Wii Remote to OpenFrameworks. When I found out I could do this, I was ecstatic – this was even better than using Kinect. While the Kinect would have implemented part of what I wanted (seeing yourself within the piece, which could also be accomplished by projection mapping), it didn’t really offer the sort of exact controls I was after. You can have things follow your hands or react to your body orientation, but you don’t have any control over creating new variables dynamically – at least not without using the mouse or having much more OpenFrameworks skill than I do.
Implementation & Testing:
So for my final implementation I decided to use the Wii Remote and Wii Nunchuk as user-interface devices to allow someone to creatively compose along to music. To test it, I hooked it up to my old Sinocidal homework project to see if the variables would pass to the items on screen and react with the GUI elements I had created. All good! However, the actions weren’t very compelling, and I knew I wanted to start over and try to do a few things:
- Dynamically create objects by pressing a button and control them while on screen. (My thought is that later you could use a switch case to cycle through an array of these and choose which was best for your composition.)
- Utilize the Wii Remote & Wii Nunchuk’s built-in accelerometer and other orientation sensing. These really represent extensions of one’s body and the position of the arms. Hooking these into variables makes the experience much more interactive and movement-oriented.
- Control parameters of the elements using other Wii buttons. The thought here is that the main trigger buttons, joystick, and the orientation of each remote would control the main interactive UI elements, while the harder-to-hit buttons (directional pad, plus/minus) would tweak the parameters of those controls. They also conveniently come in pairs, so it feels very natural to use them to push parameters higher or lower.
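To give a sense of how these three ideas fit together, here is a minimal standalone C++ sketch of the control scheme. To be clear, the struct names, button enum, and the assumed -1..1 accelerometer range are my own illustrative choices, not the actual project code or the wiimote library’s API:

```cpp
#include <cmath>
#include <vector>

// Hypothetical stand-ins for Wii Remote inputs; in the real project these
// values arrive from the Wii Remote/OSC bridge.
enum class Button { A, DPadUp, DPadDown };

struct Shape {
    float x = 0, y = 0;   // driven by the remote's orientation
    float size = 20.0f;   // tweaked with a paired set of buttons
};

struct Composer {
    std::vector<Shape> shapes;  // objects created dynamically at runtime

    // Trigger button spawns a new object; the d-pad pair nudges the newest
    // object's size up or down (the "paired buttons" idea above).
    void onButton(Button b) {
        if (b == Button::A) { shapes.push_back(Shape{}); return; }
        if (shapes.empty()) return;
        if (b == Button::DPadUp)   shapes.back().size += 5.0f;
        if (b == Button::DPadDown) shapes.back().size -= 5.0f;
    }

    // Map raw accelerometer readings (assumed to be normalized to -1..1)
    // onto screen coordinates, so arm movement steers the newest object.
    void onAccel(float ax, float ay, float w, float h) {
        if (shapes.empty()) return;
        shapes.back().x = (ax + 1.0f) * 0.5f * w;
        shapes.back().y = (ay + 1.0f) * 0.5f * h;
    }
};
```

The nice part of this split is that spawning, steering, and parameter-tweaking are independent handlers, so each physical control maps to exactly one responsibility.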
So I couldn’t not show this off when I was home for Thanksgiving. After all, my entire family and extended friend network have really wanted to know just what it is I have been doing that made me drop off the face of the earth. I also thought this would be a great opportunity for some user testing. My Mom and friends were super nice and patient with me, and gave some great feedback that I incorporated into the final version:
- Make it possible to create more items.
- Put basic instructions on the screen to get started with.
I added a bit more over the weekend to address these suggestions, and the final product is much better as a result.
Overall I think this is a great start towards where I want to eventually take this project – to an installation type format where the composer is actually included within the interface more directly – combined Wii & Kinect perhaps? Wii “Gloves” that borrow its awesome sensors but are more wearable? The possibilities are endless, and I can’t wait to explore more of what OpenFrameworks can do next semester.
Here is a great example of where I want to take this in the future. I think the Wii Remote/Nunchuk could also work great in this video (if gloves weren’t available).
For future projects I want to do a few things:
- Create more classes that draw unique items (like the Box2d boxes) and call them with some buttons/control them with others. I had some issues trying to use sinusoidal motion in the musicControl class when implementing the box class – the boxes just wouldn’t show up if sine and cosine were anywhere in the .cpp file. I think it would be cool to draw the beat a little bit more, like I was able to in Sinocidal.
- Apply effects to the video using Wii Buttons. I think it would be cool to apply fun filters to the video in real-time.
- Get the IR hooked up for the Wii Remote. I would have had to hack my current sensor bar so it didn’t only connect to the Wii, so I left this out. I think there are a ton of applications for creative composition and for gaming that one could do with the IR working (maybe Duck Hunt with Donald Trump for ducks??? Just saying…)
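For reference, the kind of sinusoidal beat motion I mean boils down to a tiny function like the one below. This is a standalone C++ sketch; the bpm and amplitude values are made up for illustration and aren’t the project’s actual numbers:

```cpp
#include <cmath>

// Compute an offset that pulses in time with the music's beat:
// one full sine cycle per beat. bpm and amplitude are illustrative.
float beatOffset(float timeSec, float bpm, float amplitude) {
    float beatsPerSec = bpm / 60.0f;
    return amplitude * std::sin(2.0f * 3.14159265f * beatsPerSec * timeSec);
}
```

Driving an element’s size or position with a function like this is what made the shapes in Sinocidal feel like they were breathing with the song.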
Final Code & Video
Here is my code on GitHub. (Note: uploaded to my personal GitHub. I had persistent errors trying to sync with the class folder – it kept reporting an error with the last person to submit their readme.)