
Audio Cubes May 25, 2007

Posted by Melissa Quintanilha in innovative interfaces, music, physical interaction design.

This is a Tangible User Interface (TUI) consisting of cubes that contain Digital Signal Processors (DSPs) with optical sensors and emitters (infrared sensors and LEDs). These sensors and emitters receive and send audio signals, which are generated or processed by the signal processor in each cube.

By positioning the cubes relative to each other and moving them around, a signal processing network can be created. Audio Cubes proposes a new way of interacting with sound and music.
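As a rough illustration of the idea (not Percussa's actual firmware or the Max/MSP patch), each cube can be thought of as a DSP node, and placing two cubes next to each other patches one cube's output into the other's input. Here is a minimal Python sketch of that model, with invented names and behaviour:

```python
import math

# Hypothetical sketch: each cube is a DSP node; "moving cubes together"
# becomes editing the adjacency lists of the resulting network.
class Cube:
    def __init__(self, name, process):
        self.name = name
        self.process = process      # DSP function applied to incoming samples
        self.inputs = []            # cubes whose emitters face our sensors

    def output(self, t):
        # Sum whatever the neighbouring cubes emit, then apply our own DSP.
        incoming = sum(c.output(t) for c in self.inputs)
        return self.process(incoming, t)

osc     = Cube("osc",  lambda x, t: math.sin(2 * math.pi * 440 * t))             # generator cube
tremolo = Cube("trem", lambda x, t: x * (0.5 + 0.5 * math.sin(2 * math.pi * 4 * t)))  # effect cube
gain    = Cube("gain", lambda x, t: 0.8 * x)                                      # output cube

# Positioning the cubes relative to each other builds the signal path.
tremolo.inputs.append(osc)
gain.inputs.append(tremolo)

# Render a few samples of the resulting signal-processing network.
sample_rate = 8000
print([round(gain.output(n / sample_rate), 4) for n in range(4)])
```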

This project was done with Max/MSP: http://www.cycling74.com/story/2007/4/30/132117/165

More information at http://www.percussa.com/


The Unseen Video, a weather controlled, dynamic music video March 30, 2007

Posted by Melissa Quintanilha in ambient displays, music, visualization.

The Unseen Video is much more than a normal, static music video. It is a video that is affected by the weather and local time from the position of the viewer.

It creates new synergies between the music, the video and the surroundings of the viewer. Every little change in your environment ensures that you will never see the same video twice. The look of the video might slightly change within an hour, but will have a whole new character in a few months.
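Here is a small Python sketch of how such a mapping might work, assuming (hypothetically) that the player looks up the viewer's local weather and time and derives a few rendering parameters from them; the actual mapping used by The Unseen Video isn't documented here, so the numbers are invented:

```python
from datetime import datetime

def visual_style(temperature_c, cloud_cover, now=None):
    """Derive simple, illustrative rendering parameters from weather and local time."""
    now = now or datetime.now()
    daylight = 1.0 if 7 <= now.hour < 19 else 0.3          # darker look at night
    warmth = max(0.0, min(1.0, (temperature_c + 10) / 40))  # map -10..30 C to 0..1
    return {
        "brightness": daylight * (1.0 - 0.5 * cloud_cover),
        "hue_shift": warmth,          # warmer weather -> warmer palette
        "grain": cloud_cover,         # overcast days look grittier
    }

# Two viewers in different weather never see quite the same video.
print(visual_style(temperature_c=25, cloud_cover=0.1))
print(visual_style(temperature_c=-2, cloud_cover=0.9))
```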

More information: http://theunseenvideo.com/

Volume February 19, 2007

Posted by Melissa Quintanilha in innovative interfaces, music.


Volume is a luminous interactive installation that was presented this winter at the Victoria and Albert Museum in London.

Volume is a sculpture of light and sound and consists of an array of light columns positioned dramatically in the centre of the Madejski Garden at the V&A. The installation responds to human movement, creating a series of audio-visual experiences. Visitors at the V&A were invited to step inside and see their actions at play with the energy fields throughout the space, triggering a brilliant display of light and sound.
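As a very rough sketch of the interaction pattern (the real installation is far more sophisticated), imagine each light column carrying a presence sensor and brightening as visitors approach; everything below is an invented simplification:

```python
class LightColumn:
    def __init__(self, position):
        self.position = position    # position along one axis of the garden, in metres
        self.intensity = 0.0

    def react(self, visitor_positions):
        # Light up in proportion to how close the nearest visitor is.
        if not visitor_positions:
            self.intensity = 0.0
            return
        nearest = min(abs(self.position - p) for p in visitor_positions)
        self.intensity = max(0.0, 1.0 - nearest / 5.0)   # fades out over 5 metres

columns = [LightColumn(position=i * 2.0) for i in range(8)]   # a row of columns
for col in columns:
    col.react(visitor_positions=[3.0, 9.5])                   # two visitors in the space
print([round(c.intensity, 2) for c in columns])
```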

Volume was created by United Visual Artists in collaboration with Robert Del Naja of Massive Attack and his long-term co-writer Neil Davidge.

Volume was exhibited at the V&A from 24 November 2006 to 28 January 2007, and was presented in collaboration with the PlayStation Season.

reactable February 18, 2007

Posted by Melissa Quintanilha in innovative interfaces, music, physical interaction design.

The reactable is a multi-user electro-acoustic music instrument with a tabletop tangible user interface. Several simultaneous performers share complete control over the instrument by moving physical artefacts on the table surface and constructing different audio topologies in a kind of tangible modular synthesizer or graspable flow-controlled programming language.

The reactable hardware is based on a translucent round table. A video camera situated beneath the table continuously analyzes its surface, tracking the nature, position and orientation of the objects distributed on it, which represent the components of a classic modular synthesizer. These objects are passive, without any sensors or actuators; users interact by moving them, changing their position, their orientation or their faces (in the case of volumetric objects). These actions directly control the topological structure and parameters of the sound synthesizer. A projector, also underneath the table, draws dynamic animations on the surface, providing visual feedback on the state, the activity and the main characteristics of the sounds produced by the audio synthesizer.
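To make the idea of constructing audio topologies by moving objects more concrete, here is a small Python sketch that assumes an upstream vision system reports each object's type, position and angle; the proximity-based patching rule and object names are illustrative, not the reactable's actual tracking or synthesis code:

```python
import math

def feeds(src, dst):
    # Generators feed effects, effects feed the output (illustrative ordering).
    order = {"oscillator": 0, "filter": 1, "output": 2}
    return order.get(src, -1) < order.get(dst, -1)

def build_topology(tracked_objects, patch_radius=0.25):
    """Connect objects that lie within patch_radius of each other on the table."""
    connections = []
    for a in tracked_objects:
        for b in tracked_objects:
            if a is b:
                continue
            if math.dist(a["pos"], b["pos"]) < patch_radius and feeds(a["type"], b["type"]):
                connections.append((a["id"], b["id"]))
    return connections

objects = [
    {"id": "osc1",   "type": "oscillator", "pos": (0.40, 0.50), "angle": 90},
    {"id": "lpf1",   "type": "filter",     "pos": (0.55, 0.55), "angle": 30},
    {"id": "master", "type": "output",     "pos": (0.70, 0.50), "angle": 0},
]

# Rotating an object would change a parameter (e.g. a cutoff from its angle);
# moving it in or out of range re-patches the graph.
print(build_topology(objects))
```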

More info: http://www.iua.upf.es/mtg/reacTable/

Basic demo 1:

Basic demo 2:

musicBottles February 18, 2007

Posted by Melissa Quintanilha in innovative interfaces, music, physical interaction design.

musicBottles is a project from MIT’s Tangible Media Group. It introduces a tangible interface that deploys bottles as containers and controls for digital information. The system consists of a specially designed table and three corked bottles that “contain” different sounds.

Custom-designed electromagnetic tags embedded in the bottles enable each one to be wirelessly identified. The opening and closing of a bottle is also detected. When a bottle is placed onto the stage area of the table and the cork is removed, the corresponding instrument becomes audible. A pattern of colored light is rear-projected onto the table’s translucent surface to reflect changes in pitch and volume. The interface allows users to structure the experience of the musical composition by physically manipulating the different sound tracks.
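Here is a minimal Python sketch of the control logic, assuming the table reports which tagged bottles are on the stage and whether their corks are removed; the track names and events are invented for illustration and are not the Tangible Media Group's code:

```python
class BottleTrack:
    def __init__(self, instrument):
        self.instrument = instrument
        self.on_stage = False
        self.uncorked = False

    @property
    def audible(self):
        # A track plays only while its bottle is on the table and open.
        return self.on_stage and self.uncorked

bottles = {
    "tag-01": BottleTrack("piano"),
    "tag-02": BottleTrack("bass"),
    "tag-03": BottleTrack("drums"),
}

def on_tag_event(tag_id, on_stage, uncorked):
    b = bottles[tag_id]
    b.on_stage, b.uncorked = on_stage, uncorked
    playing = [t.instrument for t in bottles.values() if t.audible]
    print("now audible:", playing or "nothing")

on_tag_event("tag-01", on_stage=True, uncorked=True)    # piano becomes audible
on_tag_event("tag-03", on_stage=True, uncorked=False)   # drums still corked
```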

More info and videos at: http://tangible.media.mit.edu/projects/musicbottles/

Amebeats February 9, 2007

Posted by Melissa Quintanilha in innovative interfaces, music, physical interaction design.


The Amebeats Project (developed by Ohio State student Melissa Quintanilha) proposes a new approach to human-computer interaction by allowing people to mix sounds by manipulating physical objects instead of twisting knobs or clicking around in music production software. The amoeba-shaped board has small boxes in its center that, when moved onto the arms, activate different sounds.
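Here is a small, hypothetical Python sketch of that mapping, assuming each box is tagged and each arm reports which boxes sit on it; the sound names and trigger logic are illustrative only:

```python
ARM_SOUNDS = {
    "arm_1": "kick_loop",
    "arm_2": "hi_hat_loop",
    "arm_3": "bass_line",
    "arm_4": "vocal_sample",
}

def active_sounds(box_positions):
    """box_positions maps a box id to 'center' or to an arm name."""
    return sorted(
        ARM_SOUNDS[arm]
        for arm in box_positions.values()
        if arm in ARM_SOUNDS           # boxes left in the center stay silent
    )

# Moving box_b from the center onto arm_2 brings in the hi-hat loop.
print(active_sounds({"box_a": "arm_1", "box_b": "center"}))
print(active_sounds({"box_a": "arm_1", "box_b": "arm_2"}))
```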


My interest in music and design merged to create a haptic interface (based on touch) that allows people to use gesture to mix sounds with their hands. My inspiration for this robotic installation came from going to parties and seeing DJs create the music at their tables, with no one in the audience knowing what they were doing to make the sounds. Generating music through gesture allows for a much more expressive way of creating.

This project was created in fall 2006 and exhibited at The Ohio State University in December 2006.


Videos:

Visibility of creative performance February 7, 2007

Posted by Melissa Quintanilha in innovative interfaces, music, physical interaction design.

The value we place in visibility of creative performance is exemplified by live musical performance.

While the music itself is more intricate and polished in studio recordings, audiences still pack concert venues because live performances permit listeners to witness the act of performance as well as co-produce the event (musician and audience respond to each other by mutual feedback).

With the spread of software synthesis and sequencing, laptop performers of electronic music became common: a lone musician sitting behind an LCD screen. Because performers sat motionless behind their computers (except for some mouse-clicking), the act of performance, although still taking place, was rendered invisible, and as a result audiences became both disengaged and suspicious: “How do I know the performer is not just checking his e-mail?”


As an antidote, Audiopad reestablishes the visibility of performance with a synthesis interface consisting of a projected tabletop display and several control pucks.

Audiopad is a composition and performance instrument for electronic music which tracks the positions of objects on a tabletop surface and converts their motion into music. One can pull sounds from a giant set of samples, juxtapose archived recordings against warm synthetic melodies, cut between drum loops to create new beats, and apply digital processing all at the same time on the same table. Audiopad not only allows for spontaneous reinterpretation of musical compositions, but also creates a visual and tactile dialogue between itself, the performer, and the audience.

Audiopad has a matrix of antenna elements which track the positions of electronically tagged objects on a tabletop surface. Software translates the position information into music and graphical feedback on the tabletop. Each object represents either a musical track or a microphone.
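As one concrete (and simplified) way to picture that mapping, here is a Python sketch that assumes the antenna matrix reports a 2-D position for each tagged puck and makes a track louder the closer it sits to the microphone puck; the real Audiopad mapping is richer than this:

```python
import math

def mix_levels(pucks):
    """pucks: {name: {'kind': 'track'|'microphone', 'pos': (x, y)}} in table coordinates."""
    mic = next(p["pos"] for p in pucks.values() if p["kind"] == "microphone")
    levels = {}
    for name, p in pucks.items():
        if p["kind"] != "track":
            continue
        distance = math.dist(p["pos"], mic)
        levels[name] = round(max(0.0, 1.0 - distance), 2)   # nearer to the mic = louder
    return levels

pucks = {
    "mic":   {"kind": "microphone", "pos": (0.5, 0.5)},
    "drums": {"kind": "track",      "pos": (0.6, 0.5)},
    "synth": {"kind": "track",      "pos": (0.1, 0.9)},
}
print(mix_levels(pucks))   # sliding a puck toward the mic raises its level
```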

Tune Me January 30, 2007

Posted by Melissa Quintanilha in innovative interfaces, music, physical interaction design.


Tune Me is an immersive conceptual radio based on tactile interaction. The sound (as well as the visuals) is triggered by a number of touch-based interfaces. Visitors enter the ellipse-shaped space, immersing themselves in a new world in which to listen to the radio. In this sense, Tune Me is a representation of the ambient radio of the near future. Alongside the sound, each channel provides light as well as a vibrating and pulsing experience. When a different FM station is chosen, the overall space changes, defining a different mood according to the nature of the content: news, sport, classical music or international pop. Each triggers a different visual experience; the space vibrates, pulses and interacts with the visitors.
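As a conceptual sketch only, the station-to-mood mapping can be pictured as a handful of presets; the Python below uses invented values purely to illustrate the idea:

```python
# Hypothetical per-station presets for light, vibration and pace.
STATION_MOODS = {
    "news":      {"light": "cool white",       "vibration": 0.2, "pace": "steady"},
    "sport":     {"light": "bright red",       "vibration": 0.8, "pace": "pulsing"},
    "classical": {"light": "warm amber",       "vibration": 0.1, "pace": "slow"},
    "pop":       {"light": "shifting colours", "vibration": 0.6, "pace": "fast"},
}

def tune(station):
    mood = STATION_MOODS[station]
    print(f"tuned to {station}: light={mood['light']}, "
          f"vibration={mood['vibration']}, pace={mood['pace']}")

tune("classical")   # the whole space settles into a slow, warm mood
tune("sport")
```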

Here are some more pictures of an exhibition in London (2005).

Monome January 30, 2007

Posted by Melissa Quintanilha in gadgets, innovative interfaces, music, physical interaction design.

This example goes along the lines of the kind of music creation I'm especially interested in.

The monome is a reconfigurable grid of sixty-four backlit buttons.
Buttons can be configured as toggles, radio groupings, sliders, or organized into more sophisticated systems to monitor and trigger sample playback positions, stream 1-bit video, interact with dynamic physical models, and play games. Button press and visual indication are decoupled by design: the correlation is established by each application.
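Here is a short Python sketch of that decoupling, assuming the application receives press events and decides separately what each LED should show; the toggle behaviour is just one of many possible configurations, not monome's own software:

```python
class MonomeApp:
    SIZE = 8   # 8 x 8 = 64 buttons

    def __init__(self):
        self.toggles = [[False] * self.SIZE for _ in range(self.SIZE)]
        self.leds = [[0] * self.SIZE for _ in range(self.SIZE)]

    def on_press(self, x, y):
        # A button press only updates application state...
        self.toggles[y][x] = not self.toggles[y][x]
        self.refresh_leds()

    def refresh_leds(self):
        # ...and the application, not the hardware, decides what to light.
        for y in range(self.SIZE):
            for x in range(self.SIZE):
                self.leds[y][x] = 1 if self.toggles[y][x] else 0

app = MonomeApp()
app.on_press(3, 0)
app.on_press(5, 0)
print(app.leds[0])   # row 0 shows the toggled steps of a simple sequencer
```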
Demonstration video

http://monome.org/