Comprovisador is a system designed by Pedro Louzeiro to enable mediated soloist-ensemble interaction using machine listening, algorithmic compositional procedures and dynamic notation in a networked environment. In real time, as a soloist improvises, Comprovisador’s algorithms produce a score that is immediately sight-read by an ensemble of musicians, creating a coordinated response to the improvisation. This interaction is mediated by a performance director, who manipulates algorithmic parameters. Implementing the system requires a network of computers in order to display notation (separate parts) to each of the musicians in the ensemble. Moreover, wireless connectivity allows the computers – and therefore the musicians – to be far apart from one another, making space itself available as a compositional element.
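As a rough illustration of the networked part-distribution idea (a minimal sketch in Python, not Comprovisador’s actual Max/bach implementation; the addresses, port and message format below are hypothetical), a host could push each musician’s freshly generated part to a dedicated client over the local network:

    import json
    import socket

    # Hypothetical client registry: one (address, port) per musician's laptop.
    CLIENTS = {
        "flute":  ("192.168.1.11", 9000),
        "violin": ("192.168.1.12", 9000),
    }

    def send_part(instrument, notes):
        """Send a freshly generated part (a list of note dicts) to one client over UDP."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        payload = json.dumps({"instrument": instrument, "notes": notes}).encode()
        sock.sendto(payload, CLIENTS[instrument])
        sock.close()

    # Example: push two notes (pitch in midicents, onset and duration in ms) to the flute client.
    send_part("flute", [{"pitch": 6000, "onset": 0, "dur": 500},
                        {"pitch": 6450, "onset": 500, "dur": 250}])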

Comprovisador consists of two applications – host and client. Both are developed in Max 7, using bach for its notation features, its computer-assisted composition tools and, of course, its Max integration.

In addition to being a crucial part of this real-time composition and notation system, the client application (Comprovisador.client) is also a powerful standalone tool for improving sight-reading skills. This application is freely available for download from Comprovisador’s website. Also on the website are links to performance videos and other goodies.

MOZ’Lib is a set of pedagogical tools designed to explore musical writing, creation and computer programming at the same time. It is currently developed by two composers based in Paris, Julien Vincenot and Dionysios Papanicolaou, with the support of the Ariane# project (funded by the Franche-Comté region in eastern France).

The library includes several modules in the form of bpatchers for the Max environment, largely based on the bach library by Andrea Agostini and Daniele Ghisi. Each of these modules represents a compositional idea or technique and lets the user interact with it through various intuitive interfaces. At any time, it is possible to visualize the result of this idea in score notation, to listen to it, and to export it to an engraving program such as Finale or Sibelius.

MOZ’Lib makes it possible to generate, transform or combine musical materials, and to develop them further into an instrumental composition or a sound creation. It constitutes an easy introduction to the vast domain of computer music and computer-aided composition.

The modular aspect of the library also bears a creative and pedagogical spirit. Modules can be combined with one another, often in unexpected ways, to imagine and realize new musical ideas. Manipulating such “conceptual bricks” and connecting them inside a patch makes the user more aware of the notion of abstraction, which is fundamental in computer programming but also in artistic creation.

N.B.: This first version of MOZ’Lib currently focuses on two musical parameters: melodic pitches and rhythm. Other aspects regarding polyphony (harmony, counterpoint, and orchestration) will be developed in future updates.

bach with Antescofo
https://www.bachproject.net/2016/10/07/bach-with-antescofo/

Jonathan Bell uses bach.score, in connection with Antescofo, as a tool for representing a melody that the computer tracks or follows, and for storing and displaying the actions to be performed when a certain event is detected.

Antescofo (Anticipatory Score Follower) offers a user interface based on a piano roll, called AscoGraph, which does the same thing: it graphically represents the score to be followed, the current position in the score (in bars and beats), and the actions to be sent (e.g. to Max) at that particular position (and/or speed/tempo).

bach.score provides most of the features required to behave like AscoGraph, but in a symbolic environment (a score, with staves and clefs) instead of the more common DAW-like representations. In this project, AscoGraph’s piano roll (or MIDI representation) was therefore replaced by the musical staves of bach.score, which Antescofo can follow: the score follower sends bach the current position it detects, and bach can thus fire the actions corresponding to that event.

In the following example, a live performer sings what is written on the upper “target” staff (the input of the score follower). When a specific note is detected in the target voice, it triggers in real time a harmonisation by a virtual choir (the last four staves, controlling Psychoirtrist~).

Antescofo has the role of an accompanist: it follows the performer’s speed on each take. But here it also acts as a “bach.score pilot”: each time Antescofo detects a new event, or a new position in the score, it sends it to bach.score, which highlights the corresponding bar and beat number and fires the corresponding actions.
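As a rough, language-agnostic sketch of this dispatch logic (the actual system runs inside Max, with Antescofo detecting the events and bach.score firing the actions; the event indices and messages below are hypothetical):

    # Hypothetical sketch: map detected score positions to lists of actions.
    # In the real setup, Antescofo detects the event and bach.score fires the actions.
    actions_by_event = {
        1: ["start_harmoniser", "set_chord 60 64 67 71"],   # event index -> Max-style messages
        2: ["set_chord 62 65 69 72"],
        3: ["stop_harmoniser"],
    }

    def on_event_detected(event_index, send):
        """Called whenever the follower reports a new position in the score."""
        for message in actions_by_event.get(event_index, []):
            send(message)          # e.g. forward to Max receivers

    # Example: print the actions that would be fired when event 2 is detected.
    on_event_detected(2, send=print)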

The combination of bach.score’s representation with Antescofo has great potential for writing and representing real-time transformations that follow the live performer very tightly; this type of representation also allows a better visualisation and understanding of what has or hasn’t been detected by the machine.

Sirens Cycle
https://www.bachproject.net/2016/09/29/sirens-cycle/

Since his first quartet Korrespondenz (1992), Peter Eötvös has been interested in the transcription of phonetic elements into music, in order to make instruments speak. For Sirens Cycle, his latest string quartet, premiered by the Calder Quartet on October 1st, 2016 in London, Eötvös asked Serge Lemouton to analyse three texts about the sirens, in three different languages: a short text by Kafka (in German), an excerpt from Homer’s Odyssey and a passage from Joyce’s Finnegans Wake.

bach was used to transcribe musically the pitch of the spoken texts, previously analysed with AudioSculpt; all these transcriptions were then used by Peter Eötvös to write his score (see picture below).
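A minimal sketch of the kind of speech-to-pitch transcription involved, assuming the analysis stage has already produced (time, f0) frames; the real workflow used AudioSculpt and bach, and the frame values and quantization grid below are illustrative only:

    import math

    def hz_to_midicents(f0):
        """Convert a frequency in Hz to midicents (100 cents per semitone, A4 = 6900)."""
        return 6900 + 1200 * math.log2(f0 / 440.0)

    def quantize(midicents, grid=50):
        """Quantize to a grid in cents (50 = quarter tones, as often used with bach)."""
        return grid * round(midicents / grid)

    # Hypothetical f0 frames (seconds, Hz) from an analysis of spoken text; 0.0 = unvoiced.
    frames = [(0.00, 0.0), (0.05, 182.0), (0.10, 196.5), (0.15, 0.0), (0.20, 240.3)]

    notes = [(t, quantize(hz_to_midicents(f0))) for t, f0 in frames if f0 > 0]
    print(notes)   # onset/pitch pairs that could be written into a bach.roll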

Jonathan Bell’s piece Au Commencement was entirely composed (and delivered to the singers) with the help of the bach.roll space-time notation environment. The SmartVox project uses filmed sequences (mp4) of these new forms of audio-visual scores together with a local web server, so that each singer receives his or her part directly in the browser of a smartphone.

1. For the composer, bach.roll is used as an experimental workspace. It turns traditional “pen and paper” notational sketches into audio-visual animated notation. This offers a very different approach to time and allows great control over the musical material.

2. For the performers, this animated space-time notation profoundly simplifies temporal coordination, particularly when the music is based on durations rather than rhythmic patterns. The performer cannot get lost, and no pages need to be turned. The audio-score (sent individually to each player/singer via an earpiece) also frees the performer from the anxiety of having to realise something that would normally be too difficult to pitch. Such forms of augmented notation therefore prove very practical for singers when they have long parts to learn, when they are asked to stand far from each other, or when the musical language requires singing in (microtonal) just intonation.

This kind of setup profoundly changes the usual relationship between composer, score and performer, and much remains to be explored in this field.

1/ bach.roll and bach.score as workspaces for the composer, as symbolic DAWs

In these compositions, each note (or group of notes) was used as a reservoir containing information about how to sculpt vocal lines and combine them into a polyphony.

1/ A sample of spoken text is stored inside the filename slot of each note (or group of notes) of this bach.roll.

2/ cage.ezsampler and psych~ then make it possible to retrieve all the information stored inside these notes and to “force-pitch” the corresponding sample of spoken text to the microtonal pitch displayed in the roll. (Psych stands for Pitch Synchronous Yin-based Choral Harmoniser. The psych~ module is still available in Max/MSP, but Psychoirtrist~ is considered more complete. Both were implemented by Norbert Schnell at IRCAM.)

3/ Different types of information about how the sample should be played are also stored as slot content (in this example: speed, glissando, and ‘respect’ of the spectral envelope), and then retrieved with the help of the bach.keys object.

This simple setup allows an infinite variety of vocal polyphony mock-ups, with almost realistic results. It enables the composer to experiment, for instance, with the tiniest microtonal inflections, harmonic portamento, vibrato, ornaments…
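A minimal sketch, in Python, of how such per-note playback data could be organised and handed to a harmoniser-like object (the field names and sample path are hypothetical; in the actual piece this information lives in bach slots read via bach.keys, and the harmonisation is done by psych~/Psychoirtrist~ inside Max):

    # Hypothetical per-note record mirroring the slots described above.
    note = {
        "pitch_mc": 6050,                         # microtonal target pitch, in midicents
        "file": "spoken/au_commencement_03.wav",  # filename slot: the spoken-text sample
        "speed": 0.9,                             # playback speed
        "gliss_mc": -100,                         # glissando amount, in cents
        "keep_envelope": True,                    # preserve the spectral envelope or not
    }

    class PrintHarmoniser:
        """Stand-in for the real-time harmoniser, which actually runs inside Max."""
        def play(self, **params):
            print("play", params)

    def render_note(note, harmoniser):
        """Force-pitch the note's spoken sample to its microtonal pitch."""
        harmoniser.play(sample=note["file"],
                        target_pitch_mc=note["pitch_mc"],
                        speed=note["speed"],
                        glissando_mc=note["gliss_mc"],
                        preserve_envelope=note["keep_envelope"])

    render_note(note, PrintHarmoniser())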

2/ Notation as a medium which conveys information to the performer

Technology is primarily used here as a means to “augment” traditional notation, with the help of audio-scores and screen-based animated notation (screen-scores). bach was therefore used in my piece as a means to provide singers with audio-guides (through an earpiece) and screen-scores displayed on their smartphones.

The technical challenge was to send, almost instantly, a large amount of data (a video of several MB) to a large number of participants (a choir), on different platforms (Android, iOS, macOS, Windows…). The easiest way to do this today is to rely on web technologies, and this led to the development of the SmartVox web application (based on the Soundworks framework from the CoSiMa project, developed by the ISMM research team at IRCAM). This application is based on a local server and makes it possible to send, individually to each performer’s device (via the browser of a phone or tablet), the videos and audio-guides composed in the bach environment. The server acts as a sort of conductor that sends the score to each participant and synchronises all the scores to a common clock.
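A minimal sketch of the “local server as conductor” idea, assuming each singer’s part has been exported as a video file in a parts folder (this is not the SmartVox/Soundworks code; the folder name and port are hypothetical, and the real application also synchronises all clients to a common clock):

    # Serve per-singer score videos over the local Wi-Fi network.
    # Each singer opens http://<laptop-ip>:8000/soprano1.mp4 in a phone browser.
    import http.server
    import socketserver

    PORT = 8000  # hypothetical port

    class ScoreHandler(http.server.SimpleHTTPRequestHandler):
        # Serve files from the folder containing the exported part videos.
        def __init__(self, *args, **kwargs):
            super().__init__(*args, directory="parts", **kwargs)

    with socketserver.TCPServer(("", PORT), ScoreHandler) as httpd:
        print(f"Serving score parts on port {PORT}")
        httpd.serve_forever()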

In technological terms, two things can be observed here:
Data sending and communication all happen in the browser of the participants’ smartphones, in a web page (a single-page web application), so nothing needs to be installed on the phone: there is no app to download.
This setup forms a distributed system, in which the processing is distributed across many machines (smartphones, tablets…) that interact with each other in order to achieve a common goal.
In addition to the notation generated in bach, several difficulties had to be tackled in terms of user interface, which led to different versions of the app: a flexible one for rehearsals and a more robust, automated version for concerts.
This web application does not rely on the internet when it is running in performance; rather, it is based on a LAN (Local Area Network) built over Wi-Fi.

To get an idea of the aim behind this whole setup, click here to watch a full performance of Au Commencement, the first piece realised with the SmartVox app, sung here by the Mangata ensemble.

my mother used to say
https://www.bachproject.net/2016/08/08/my-mother-used-to-say/

my mother used to say (2016) is a 10-minute piece in three movements for fans and electromagnetic fields, by Amos Cappuccio.

Here are Amos Cappuccio’s words about his work:

The breath, with its cyclic alternation of tension and relaxation, lies at the basis of the dynamism of music, vocal or not.
The inspiratory phase is usually associated with silence, as it coincides with the performer’s physiological need to breathe, which must be taken into consideration by the composer during the composition process. In the same way, when writing music for objects, the composer has to bend the physiological limits and constituents of the instrument to his musical needs, by calculating the objects’ reaction times.
Fans have a single expiration phase, characterised by three stages: “timing” (departure), “stability” (end point) and “release” (return). Since the first and last stages are governed by the object’s inertia, they can be considered the object’s real physiological moments.
Identifying breath as the human background noise, and following from the previous considerations, in this work the fan has been chosen as the “object” because of its inherent “vitality”.
I am specifically interested in the sound component that a fan inevitably brings with it, and in its being an object necessary to the operation of its container; both for the analogy with the human body and for the association with its function as a background-noise generator, which has a fundamental place in our everyday life.
Hence the choice of using electromagnetic fields as the primary source of sound, leaving the surrounding noise unaffected, so that the noise of the air can be pictured as a carpet of sound lying under another phenomenon in the foreground.

The score is written inside a bach.roll containing four single-staff voices (one for each group of fans to be controlled). A note in a given voice represents a single sound event to be produced by the corresponding group of fans, and the breakpoint functions contained in its slots control the overall amplitude of the sound event, the movement of the fans and the parameters of a resonant filter.
In this way, I was able to control each sound event in a truly compositional way. Another handy aspect of working with bach was the ability to seamlessly switch between different versions of the piece simply by saving them as .txt files and re-opening them when needed.
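A minimal sketch of how a breakpoint function stored in a note’s slot (a list of time/value pairs) could be sampled to drive a continuous parameter such as amplitude or fan movement (the envelope values are hypothetical; in the piece these slots are read inside Max):

    def bpf_value(points, t):
        """Linearly interpolate a breakpoint function (list of (time, value) pairs) at time t."""
        if t <= points[0][0]:
            return points[0][1]
        for (t0, v0), (t1, v1) in zip(points, points[1:]):
            if t0 <= t <= t1:
                return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        return points[-1][1]

    # Hypothetical amplitude envelope for one sound event: (seconds, amplitude 0..1).
    amp_env = [(0.0, 0.0), (0.5, 1.0), (2.0, 0.6), (3.0, 0.0)]
    print([round(bpf_value(amp_env, t), 2) for t in (0.0, 0.25, 1.0, 3.5)])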

This high degree of control over the sound material is balanced by the choice of the visual installation.
The relays are the elements that generate the electromagnetic fields by interacting with the fans, and the amplitude of the final sound is controlled by the motion of these elements.
Cables are used to hang the relays in the proximity of the fans. The flux of air generated by the rotation of the fans affects the position of the relays. The resulting motion gives birth to a self-consistent cycle, in which both fans and relays act on the generation of the electromagnetic field and therefore play a fundamental role in the final sound. The relays’ cables, indeed, hang down from above, in such a way that the air from the fans can move them without any outside control. This is the visual consequence of the previous choice, and it can in turn affect the final result.

The main idea is the generation of gradual transitions between different musical figures. Transformations are obtained by controlling the parameters of specific configurations of scales and arpeggios, such as gradual interpolations from wide to narrow ranges (and vice versa). Each of the three pieces is a different expression of a similar principle: the overall idea of a continuous falling down.
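A minimal sketch of one such transformation, assuming a scale-like figure whose pitch range is interpolated from wide to narrow over a number of steps (all values are illustrative, not taken from the pieces):

    def lerp(a, b, x):
        """Linear interpolation between a and b, with x going from 0 to 1."""
        return a + (b - a) * x

    def narrowing_scales(centre=6000, wide=2400, narrow=200, steps=5, notes_per_scale=8):
        """Generate scale-like figures whose range shrinks from `wide` to `narrow` cents."""
        figures = []
        for s in range(steps):
            span = lerp(wide, narrow, s / (steps - 1))
            lo, hi = centre - span / 2, centre + span / 2
            # evenly spread pitches (in midicents) across the current range
            figures.append([round(lerp(lo, hi, i / (notes_per_scale - 1)))
                            for i in range(notes_per_scale)])
        return figures

    for fig in narrowing_scales():
        print(fig)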

Three selected bach.score objects are then played: the extracted data (pitch, velocity, duration) are sent to ioscbank~ synthesis generators and mapped onto a jitter matrix for visualization. As a further option, a MIDI version can be played (in order to render a simulation of the violin score, if needed).

Some video examples (electronics – main window of the performance Max patcher) are here, here and here.

picture2score
https://www.bachproject.net/2016/06/07/picture2score/

Takuya Shimizu has developed tools to make orchestral scores from pictures, using bach in combination with Jitter. Within bach, the color of each notehead in the score can be set individually via slot-linkage techniques, which makes it flexible enough to deal with the graphical dimension of scores.
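A minimal sketch of the picture-to-score idea, in which each image column is read as an onset and each sufficiently dark pixel becomes a note that keeps the pixel’s color for its notehead (a deliberately naive mapping, not Shimizu’s actual tool; it assumes the Pillow library is available):

    from PIL import Image  # assumes the Pillow library is installed

    def picture_to_notes(path, low_mc=3600, high_mc=8400, threshold=128):
        """Map dark pixels of an image to (onset, midicents, rgb) note tuples."""
        img = Image.open(path).convert("RGB")
        w, h = img.size
        notes = []
        for x in range(w):                       # columns -> time
            for y in range(h):                   # rows -> pitch (top = high)
                r, g, b = img.getpixel((x, y))
                if (r + g + b) / 3 < threshold:  # keep only dark pixels
                    pitch = high_mc - (high_mc - low_mc) * y / (h - 1)
                    notes.append((x, round(pitch), (r, g, b)))
        return notes

    # Example: notes = picture_to_notes("picture.png")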

Nikola Kołodziejczyk’s Instant Ensemble is a response to predictable and repeatable music tours. The sheet music for each concert of the seven-piece band is created live, literally in front of the audience, composed in real time by Nikola Kołodziejczyk. The music being composed is shown on a screen, visible both to the audience and, of course, to the musicians who read it. Music heard during the concert will never be repeated: the musicians sight-read new pieces of music appearing before them, knowing they will never see them again. The core of the band is Nikola Kołodziejczyk’s trio, Stryjo, joined by four prominent local instrumentalists.

With regard to style, the music created by Nikola Kołodziejczyk is far from the experimental scene. Instant Ensemble is an attempt to dramatically accelerate the creative process, in which the placing of the first note on paper is usually separated by months from the moment the music can be performed live. In this case, that time is reduced to a few seconds.

Daniele Ghisi’s An Experiment with Time (2015) is both a three-screen multimedia installation and a live piece for ensemble, video and electronics.

This work, inspired by the eponymous book by John W. Dunne, depicts the writing of a diary in which the main character carries out an experiment on his own dreams and proposes different hypotheses on the nature of time. The video loop, lasting 46 minutes, represents the writing of a diary over a whole year (from January to December). The starting point for the musical writing is a straightforward association between months and major chords, so that the whole year loop is handled like a sequence of perfect cadences in the tempered cycle of fifths (January being B major, February being E major, and so on, until December being F# major, and hence looping). Although the internal handling of the musical material becomes more complex (different chord types are explored and different fundamentals are used occasionally to underline specific passages), everything in the piece is conceived with respect to this simple sequence, which thus represents the skeleton of the whole musical loop.
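A minimal sketch of this month-to-chord mapping, descending by perfect fifths from B so that each month resolves onto the next and December (F# major) loops back onto January (B major):

    NOTE_NAMES = ["C", "C#", "D", "Eb", "E", "F", "F#", "G", "Ab", "A", "Bb", "B"]
    MONTHS = ["January", "February", "March", "April", "May", "June",
              "July", "August", "September", "October", "November", "December"]

    def month_chords(start_pc=11):           # 11 = B
        """Return the major-chord root for each month, descending by fifths from B."""
        return {month: NOTE_NAMES[(start_pc - 7 * i) % 12] + " major"
                for i, month in enumerate(MONTHS)}

    print(month_chords())
    # January: B major, February: E major, ..., December: F# major (which resolves to B).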

The starting point for An Experiment with Time is a corpus of segmented audio files: a database composed of about 3000 tracks of classical, contemporary, rock, pop and jazz music, sampled from the whole history of Western music. The harmonic transcription of each song has been computed and a chord-based segmentation has been performed, covering the four standard tonal trichords (major, minor, augmented and diminished) and a few of their basic variants. The corpus has been chosen so that time can be treated as a parameter of the corpus itself. The relation between historical time and musical time is powerful enough to create interesting diffraction patterns. For example, during June, a radio broadcasts a sort of “history of C major” (an officer named Major C. is also a supporting character in the video, hence the word play), composed of C major samples ordered by composition year. Similar processes are used throughout the whole work.

To compose within this database, a meta-score has been set up, relying heavily on the bach tools.
Each note stands for the fundamental of a chord, whose type is specified via the first slot. This representation is handier than having to specify all the voices of the chord, as it allows the two orthogonal and pertinent parameters – fundamental and chord type – to be controlled separately.

For each note, additional slots carry the information for the concatenative rendering: the duration of each unit, the inter-unit distance, as well as the descriptor according to which the units should be sorted (if any) and the sorting direction. Another slot narrows the choice of units so that a certain descriptor value lies in a given range; furthermore, additional slots provide amplitude envelopes (both for each unit and for the whole sequence of units). Finally, a slot is dedicated to filtering the database according to words or parts of words appearing in the file name or path; this is an extremely quick and effective way (via the Unix find command) to narrow the search to a tag or a combination of tags (e.g. ‘Mozart’, or ‘Symphony’, …).
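A minimal sketch of how one such meta-score note could be turned into a query on the segmented database (the field names are hypothetical; the actual piece stores this information in bach slots and filters file names with the Unix find command):

    # Hypothetical meta-score note: a chord fundamental plus rendering instructions.
    note = {
        "fundamental": "C", "chord_type": "major",
        "unit_dur": 0.5, "inter_unit": 0.1,
        "sort_by": "year", "sort_dir": "ascending",
        "descriptor_range": ("year", 1700, 1900),
        "name_filter": "Symphony",
    }

    def select_units(database, note):
        """Pick and order the segmented units that match a meta-score note."""
        desc, lo, hi = note["descriptor_range"]
        units = [u for u in database
                 if u["chord"] == (note["fundamental"], note["chord_type"])
                 and lo <= u[desc] <= hi
                 and note["name_filter"].lower() in u["path"].lower()]
        return sorted(units, key=lambda u: u[note["sort_by"]],
                      reverse=(note["sort_dir"] == "descending"))

    # Tiny demo database with a single unit, just to show the shape of the data.
    demo_db = [{"chord": ("C", "major"), "year": 1788, "path": "Mozart/Symphony_41.aif"}]
    print(select_units(demo_db, note))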

Much more on the dedicated website: www.anexperimentwithtime.com.
More on the installation version of the project here.
More on the live version of the project here.
Reference: D. Ghisi, M. Bergomi, “Concatenative synthesis via chord-based segmentation for ‘An Experiment with Time'”, Proceedings of the TENOR Conference, 2016