Creating music interfaces

Original author: Pablo Stanley
  • Translation
An alarm clock going off, the characteristic chirp a car makes when its doors unlock - we constantly interact with technology through sound. Yet many still believe that an interface should convey information primarily through the screen, while the enormous potential of the language of sound goes ignored.



I have been working in design for 16 years, and in my free time I write music. Having both skills convinced me that interfaces should combine, at the very least, sound and image. The team at Udemy, where I currently work, is developing a new approach to learning. During a brainstorm, the idea came up of adding sound cues to the intermediate screens. I got curious and began experimenting with a synthesizer and MIDI samples to give the user audio feedback while taking and completing a course. We tried different instruments, chords and tempos. The challenge was to make the audio meaningfully convey progress while also expressing our company's values. What can sounds say about us? In the end we settled on short, unobtrusive motifs in A major, played on marimba and harp.

After that experience, I wondered... what if, instead of using sounds in interfaces merely as audible feedback, we used harmonies, notes and chords as symbols? What if we chose an instrument, or a set of instruments, that suits our brand and is in tune with the “voice” of our product? What if music were used in such a way that the user could read the message embedded in it?

Although hearing is one of our main channels for perceiving information, most interfaces focus on the visual. Sound feedback can improve the user experience, yet developers, with rare exceptions, rely only on what can be shown on screen. Audio feedback helps the user by letting them look away from the device and do several things at once. It is also convenient because it signals that an action has been registered, is being processed or has completed, without requiring the screen. But designing with sound is not so simple. Many aspects have to be taken into account to make the experience enjoyable, meaningful and practical.

I enjoyed this experience so much that I decided to compile a collection of recordings that others could use in their designs. The result is more than 200 audio samples - harmonies, sequences, sound effects, spoken phrases and chord combinations performed on 8 different instruments.

The entire archive can be downloaded here. And if you are interested in my background, my tips on creating musical interfaces, or the story behind these samples - read on.



If a tree falls in a forest and no one is around, does a notification arrive?

Before talking about music, let's first look at how we decipher and, ultimately, create the meanings hidden in sounds. Audio, even when it is not speech, is full of information that helps us make sense of our surroundings - a process that has long been part of everyday life. Just by listening, we can tell that the batter hit the ball, that someone pulled apart a Velcro strap, or that the kettle is boiling. We rely on audio feedback in devices such as televisions, microwaves, cars, toys and mobile phones. Sound interfaces can be a pleasant and useful addition to visual ones (or even a substitute, given the growing popularity of wrist-worn devices).

When designing with sound, it is important to decide at the very earliest stages what meaning each sound will convey. A signal that carries important information should be clearly distinct from those that simply accompany visual content. The visual and auditory channels of perception are fundamentally different, so sound can convey information that visual content cannot. Sound uniquely reinforces three basic principles of interaction design: visibility, responsiveness and consistency.

Audio design can express different meanings: a pattern, the passage of time, a call to action or a warning. The possibilities are endless, but that does not mean every interaction should include sound. Audio should support the interaction, not get in its way by interfering or distracting. To avoid annoying the user with monotonous signals, it is best to favor short, simple sounds that are informative in their very form. That way the sound conveys the meaning that was built into it from the start.

Design and music should sing together

Design is my main passion, but music also holds a special place in my heart. My relationship with music did not unfold in a traditional way, yet it is still fairly ordinary. I started out playing in a punk band as a teenager (playing terribly, by the way), then switched to synth-punk with MIDI and virtual studios, then moved on to nu-disco with synthesizers and arpeggiators (James Murphy would not have approved). For a while I “conquered” audiences with the charms of música sabrosa in a Latin American band, and then decided to master the “lost art” of DJing (Mexican weddings are my thing).

Over the years of working as a designer and writing music for pleasure, I came to the following conclusion: the creative process in both cases is almost the same. Whether you write a song, draw a comic or craft a user experience, the goal is always the same - to tell a story. You follow the same universal structure: exposition, rising action, climax, falling action and resolution. The trick is to capture and hold the audience.

The similarity is not limited to structure. The characteristics of sound (pitch, timbre, duration, volume, direction) mirror the elements of design (shape, color, size, texture, direction). The principles of composing music and of design also have much in common (composition, form, rhythm, texture, harmony, similarity and contrast).

Why am I telling you all this? Because I believe that in any interface, sound and visual elements should form a single whole. For example, when creating a warning module, we might use a red color and an icon with an exclamation mark - both symbols are familiar to the user and evoke a sense of danger or risk. In the same way, we can choose a high, loud sound with an unusual timbre as the warning signal. There should be a connection between the visual and audio content of an interface, whether through similarity or complementarity.
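As a minimal sketch of what such a pairing might look like in a web interface (my illustration, not code from the article), the snippet below plays a short, high-pitched earcon with the Web Audio API at the moment a warning module appears; the pitch, timbre and timing are arbitrary choices.

```typescript
// A minimal sketch: pairing a visual warning with a short, high-pitched earcon.
// The specific frequency, waveform and envelope are assumptions for illustration.
const audioCtx = new AudioContext();

function playWarningEarcon(): void {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();

  osc.type = "square";        // a harsher timbre than a sine, which reads as "alert"
  osc.frequency.value = 880;  // A5 - a high pitch that stands out from other UI sounds

  // A quick fade-out keeps the tone short and unobtrusive.
  gain.gain.setValueAtTime(0.3, audioCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, audioCtx.currentTime + 0.25);

  osc.connect(gain).connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + 0.25);
}

// Hypothetical usage: trigger the earcon when the red warning module is rendered.
function showWarning(message: string): void {
  playWarningEarcon();
  console.warn(message); // ...render the modal with the exclamation icon here...
}
```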

BlackBerry draws the same parallel between its visual interface language and its audio counterpart in its Earconography:

“An envelope icon can come in different colors, with or without branding, tilted at a 25-degree angle - as long as it looks like an envelope, users will understand what the icon means. The same goes for sounds.”

Finding the right sound is a messy business

The right sound design depends on the purpose of your product or service, as well as on its style. At the most basic level, an interface can use either speech or sound signals - so-called “sound icons” (earcons). Applications like Facebook, TiVo, the iPhone and Skype use earcons to create a sense of belonging to their wider ecosystem. Earcons help a product better represent its brand in the market or emphasize its personal style. Should the sound feel metallic or wooden? Synthetic or natural? Massive or small? Complex or simple? The answers to these questions help determine the material, the type of instrument (wind, percussion, string) and the overall theme.

The variability of sounds is limitless. You can change any characteristic and get a completely different result with every new combination. Moreover, the characteristics influence one another: volume affects perceived pitch, pitch can change perceived loudness, and timbre and duration also interact. Diving into all the technical details can be difficult, and hiring a sound engineer does not always fit the budget. So I recommend experimenting a little and trusting your instincts when choosing the best sound design for your project. Or just hire a teenager from a punk band.

Ideally, musical interfaces should be partly ideographic and partly metaphorical. In other words, they should contain both standard sound attributes and abstract categories like size, material, speed or weight. I like to call these two approaches to sound design “flat” and “skeuomorphic”. For example, when a dialog box closes in an application, you can reproduce the natural sound of a closing door directly, or you can use a synthesized imitation (a skeuomorph) of that sound with adjusted timbre, speed and intensity.
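To make the dialog-box example concrete, here is a rough sketch (my illustration, not code from the article) that plays the same recorded closing sound two ways with the Web Audio API: once as the raw recording, and once with adjusted speed and level as a crude stand-in for the “corrected” version; the file path is a hypothetical asset.

```typescript
// A sketch of two options for a dialog-close sound: the raw recording,
// or the same recording with adjusted speed and volume. The sample URL is assumed.
const audio = new AudioContext();

async function loadSample(url: string): Promise<AudioBuffer> {
  const response = await fetch(url);
  return audio.decodeAudioData(await response.arrayBuffer());
}

function play(buffer: AudioBuffer, playbackRate = 1, volume = 1): void {
  const source = audio.createBufferSource();
  const gain = audio.createGain();
  source.buffer = buffer;
  source.playbackRate.value = playbackRate; // changes speed (and, with it, pitch)
  gain.gain.value = volume;
  source.connect(gain).connect(audio.destination);
  source.start();
}

// Hypothetical usage:
// const door = await loadSample("/sounds/door-close.wav");
// play(door);            // the natural recording, as-is
// play(door, 1.4, 0.5);  // faster and quieter - reads more like an interface cue
```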



Music interfaces at their best

Most people have a general feel for music, even without hands-on experience or formal training. Experimenting with its various characteristics, such as rhythm, harmony, instrumentation, motif or tempo, can help define the meaning and purpose behind each sound.

Among the applications that masterfully use music in their interactions, I would name Monument Valley and Okey. It is no coincidence that both are games. Game designers have long explored the use of music in interfaces, and, in my opinion, product developers can learn a lot from them. Chords can add depth to an interface in which notes of different pitches would otherwise play without pause. The harmonic movement that arises as melodies unfold can evoke associations with progress, success or error. Other events, such as completion, sending (uploading) or receiving (downloading), can be represented by a modulation from the dominant to the tonic and back.
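As a sketch of how such a mapping might be wired up (my interpretation, not code from the article), the snippet below assigns hypothetical interface events to chords and lets the move from the dominant back to the tonic stand for completion; the key, the voicings and the event names are all assumptions.

```typescript
// Mapping interface events to chords with the Web Audio API.
// The key (A major), the voicings and the event names are illustrative assumptions.
const ctx = new AudioContext();

// Convert a MIDI note number to its frequency in Hz (A4 = 69 = 440 Hz).
const midiToHz = (note: number): number => 440 * Math.pow(2, (note - 69) / 12);

const chords: Record<string, number[]> = {
  pending:  [64, 68, 71],     // E major (the dominant) - "something is still open"
  progress: [69, 73, 76],     // A major triad - stable, positive
  complete: [69, 73, 76, 81], // back to the tonic, with the octave on top - resolution
  error:    [69, 72, 76],     // A minor triad - a darker color for a failure state
};

function playChord(event: keyof typeof chords, duration = 0.4): void {
  const now = ctx.currentTime;
  for (const note of chords[event]) {
    const osc = ctx.createOscillator();
    const gain = ctx.createGain();
    osc.frequency.value = midiToHz(note);
    gain.gain.setValueAtTime(0.15, now); // keep each voice quiet so the chord stays soft
    gain.gain.exponentialRampToValueAtTime(0.001, now + duration);
    osc.connect(gain).connect(ctx.destination);
    osc.start(now);
    osc.stop(now + duration);
  }
}

// e.g. playChord("pending") when an upload starts, playChord("complete") when it finishes.
```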

Musical messages can also appeal to the emotions. In Western musical culture, major keys evoke positive feelings (just recall most pop music), while minor melodies are perceived as sad and melancholic (for example, “Love Will Tear Us Apart” by Joy Division or “New York, I Love You but You're Bringing Me Down” by LCD Soundsystem). Choosing a scale can help give your product the right mood.

The archive I compiled uses D major. I created various sequences of notes, octaves and chords that can be combined harmoniously. I plan to update it in the future as more octaves are added.

Using the archive - a minuet's work

The archive was created by recording analog and digital synthesizers in Ableton Live. Eight instruments were used (bell, guitar, harp, marimba, piano, whistle, flute, xylophone), along with several sound effects (R2-D2, X-Files) and voices (both male and female).

Each instrument has from 20 to 40 sounds. In different combinations, they can represent sequences of actions, success, error, warning, notification and other simple interactions. I also added a few embellished chords in case you want to breathe new life into your product and give it a twist.

The folder structure is simple: “Root folder / Instrument / File”. File names follow the pattern “Instrument-Concept-Note-Number-Resolution”. I recommend using sounds from the same instrument for interactions of different types. But if you want to let loose, you can combine two instruments and see what happens.
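If you want to pick samples programmatically, a tiny helper like the one below (entirely hypothetical, not part of the archive) can assemble a path from that naming scheme; the root folder, the example component values and the “.wav” extension are all my assumptions.

```typescript
// A hypothetical path builder for the naming scheme described above.
// Every example value here (folder name, concepts, extension) is an assumption.
interface SampleName {
  instrument: string; // e.g. "Marimba"
  concept: string;    // e.g. "Success"
  note: string;       // e.g. "D4"
  number: string;     // e.g. "01"
  resolution: string; // the scheme's last component, e.g. a bit depth like "16bit"
}

function samplePath(rootFolder: string, s: SampleName): string {
  // "Root folder / Instrument / Instrument-Concept-Note-Number-Resolution.wav"
  const fileName = [s.instrument, s.concept, s.note, s.number, s.resolution].join("-");
  return `${rootFolder}/${s.instrument}/${fileName}.wav`;
}

// samplePath("archive", { instrument: "Marimba", concept: "Success",
//                         note: "D4", number: "01", resolution: "16bit" });
// -> "archive/Marimba/Marimba-Success-D4-01-16bit.wav"
```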

Finishing on a high note

Music can shape the way we interact with visual interfaces. It helps the user dive deeper into the story and become part of it. Well-designed musical interfaces can improve the experience and make a product feel personal, but used improperly, sounds distract and annoy (remember the Flash sites of the 2000s and those terrible blogs?). Audio is something very personal, and you have to be careful not to cross that line when communicating with users.

I hope the archive I created will help you build rich experiences and inspire you to create amazing products.
