This need not be. If software can manage some e-mail client's address collection, it can manage the storage of instrument configuration details. This too is just text, and can be kept equally compact and human-readable.

We could track instruments using any or all classification systems, but here, for reasons that will become apparent, I'm going to focus on Hornbostel-Sachs: probably the most widely cited, and one that results in a classification tree, each branch representing an instrument family, each twig or leaf further refining an instrument's definition.

This system classifies instruments only by their basic tone-producing (i.e. construction) characteristics, or “form”. Whenever we use the word 'form', we are in essence referring to visual properties: by what means sound is produced, what comprises the core body elements, and some indication of their shape.

As such, Hornbostel-Sachs wholly ignores the actual musical configuration (“function”), which describes the instrument's finer-grain 'user interface'. This encompasses physical properties such as scale or channel length, temperament or intonation, number of notes or tones to the octave, number of courses or channels, tunings, and pitch modification by devices such as capo.

In this sense, instrument classifications under the Hornbostel-Sachs system are incomplete. As it happens, however, decoupling form and function has huge advantages, especially in the context of online instrument modeling.

Critically, it allows the Hornbostel-Sachs classification system to be used as the tree-like basis for a web repository (data store) encompassing all known world instrument forms, their many musical configurations (function) being stored as 'foliage', or sub-trees.

These in place, pretty much any world music instrument can be modeled in its entirety in the browser - from its generic family base.

This opens any and all instruments to integration into a source-driven aggregator platform for world music visualization - encompassing any score, instrument or theory tool, and in any combination.

Worldwide, there are several music classification systems, dozens of music systems, literally thousands of instruments spanning many instrument families, and certainly a good few hundred theory tools in a range of 1-, 2- and 3D virtual shapes.

Even in their current (static) form, whether in the context of social music and dance or comparative musicology, these represent a vast cultural blind spot.

Brought online as dynamic and interactive models, they can be expected to fuel a revolution in both the breadth and depth of music teaching.

A central strength in the aggregator platform proposed here lies in precisely such a 'progressive refinement tree': simple, layered visual models, customized at each level according to immediate configuration needs. Moreover, these customizations can be drag-and-drop shared -individually or as a collection- across the user community. This post provides some pointers as to how this can be achieved.

Big, brave, open-source, non-profit, community-provisioned, cross-cultural and houdini crazy. → Like, share, back-link, pin, tweet and mail. Hashtags? For the crowdfunding: #VisualFutureOfMusic. For the future live platform: #WorldMusicInstrumentsAndTheory. Or just register as a potential crowdfunder.

A user can specify the visualizations with which they choose to populate their immediate environment.

This is as true for scores (which are also handled using data visualization techniques) as for instrument models, theory tools, physics simulations, genres, bands or any of a host of other interests.

So how might an instrument definition be addressed? The core data can be stored in JSON, and comprises two parts: a <form> part and a <function> part, i.e.:

<form><function>

The Hornbostel-Sachs ('form') designation for a lute family instrument such as (yawn) guitar would be 321.322.

The build designation might for example comprise elements for scale or channel length (L), temperament or intonation (T), number of notes or tones to the octave (N), number of courses (C) and number of strings to a course (S).

Depending on the data repository (storage) implementation, our target might be a reference to a position in a classification tree (as shown to the left), or indeed to a filename, something along the lines of:

HS321_322_L66_TE_N12_C6_S1
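As a minimal sketch (field names and layout are illustrative, following the L/T/N/C/S elements described above and the 321.322 designation for guitar, not any published spec), such an identifier could be assembled from the two JSON parts:

```python
# Sketch: compose a storage key from an instrument's 'form' (Hornbostel-Sachs
# number) and 'function' (musical configuration) parts. Names are illustrative.

def storage_key(form, function):
    """Build an identifier along the lines of HS321_322_L66_TE_N12_C6_S1."""
    hs = "HS" + form["hs"].replace(".", "_")
    parts = [f"{k}{v}" for k, v in function.items()]
    return "_".join([hs] + parts)

guitar = {
    "form": {"hs": "321.322"},                   # lute family (guitar)
    "function": {"L": 66, "T": "E", "N": 12,     # scale length, temperament,
                 "C": 6, "S": 1},                # notes/octave, courses, strings/course
}

key = storage_key(guitar["form"], guitar["function"])
# → "HS321_322_L66_TE_N12_C6_S1"
```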

From a user's perspective, such a file could be accessed visually (as a node in a tree), or using more or less conventional search.

From a storage perspective -depending on the storage technology, amongst other things- such a file could be accessed either via a conventional drill-down NoSQL-style query, or via an ad-hoc graph database query.
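The drill-down case can be sketched in a few lines. This is illustrative only: the tree layout and instrument names are assumptions, standing in for whatever the repository actually stores.

```python
# Sketch: a drill-down lookup on a nested configuration tree, mirroring a
# NoSQL-style path query. Tree structure and labels are illustrative.

tree = {
    "321": {                       # composite chordophones
        "322": {                   # necked box lutes
            "L66_TE_N12_C6_S1": {"name": "guitar (standard)"},
            "L66_TE_N24_C6_S1": {"name": "guitar (24-tone)"},
        }
    }
}

def drill_down(tree, path):
    """Follow a list of keys down the tree; return None if the path breaks."""
    node = tree
    for key in path:
        node = node.get(key)
        if node is None:
            return None
    return node

hit = drill_down(tree, ["321", "322", "L66_TE_N12_C6_S1"])
# hit["name"] → "guitar (standard)"
```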

So having seen how the data might be stored, what might an actual configuration session look like? With no attempt at polish, here is a clip from the proof-of-concept implementation. Keep in mind that every time a 'Save' action is undertaken, data is saved to a configuration tree, as just described.

In quick succession, we see the definition of an Irish bouzouki, a typical violin (fiddle), a Turkish cura, a South American charango, an Arabic oud and an equal-tempered but microtonal (24 notes or tones per octave) guitar.

Any specific instrument customization can be configured in less than a minute, saved, and (potentially) made available for use by any and all users, worldwide. The only limitations are those imposed by the source music exchange format (think audio, midi, ABC and so on) and resulting music notation.

The tuning menu serves -at this point- mainly to allow crosschecking of behavior with the currently loaded score. This, score-driven fingerings and much more are demonstrated in separate videos.

What has been done for lutes can naturally be done for other stringed instrument types such as harps and zithers, or indeed any of the other high-level instrument families, such as percussion, wind, brass and keyboards.

With time it should be possible to model at least 80% of the world's instruments in this way, providing a solid base for ventures into direct, person-to-person teaching and learning online.

The only danger I see in this approach is security. With SVG 'scriptable', there is some danger that a hacker could gain access to the system. For this reason, the SVG should be generated server-side, and the only user input provided through conventional GUI dialog elements with good safety controls.

Ok, what we have just seen demonstrated feeds directly into the wider instrument discovery process. Once an instrument has been defined, we can allow visual selection from a data tree to replace or augment (a.k.a. 'fine-tune') traditional text search.

Assuming we have saved a few instruments as described, how can they be used to populate the user's menus? The user will want to see only those instruments they are actually learning. To do this, we resort to drag and drop.

Menu Population with User Preferences

Assuming the user has 'discovered' a few instruments tucked away in the visual classification tree (accessed through the upper, horizontal menu bar), it makes sense to use these to directly populate other parts of the graphical user interface (GUI) - and in particular the user's menus.

When populating menus, we would be free to select not just individual end points ('leaves' of the tree), but entire twigs or branches.

A further benefit is that of code reuse. With all data sharing the same basic tree structure, code used to manipulate one tree can be reused across all others.

In our case, then, we are likely to end up with two levels of data population: the first populating a given user's menus, the second a working selection from these menus to populate the current animation panel for actual use.

With a 1:1 classification tree to menu relationship, this allows us to populate all the menus individually. This is illustrated in overview in the diagram below. Here we see a selection from the top-level 'Genres' menu, which accesses the corresponding classification tree.

From the Genres classification tree, a 'Cantes Flamencos' branch is dragged and dropped onto the user's 'Genres' menu. The leaves of this tree now represent his or her genre preferences.
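Mechanically, 'dropping' a branch amounts to copying that subtree into the user's preferences. A minimal sketch, with genre names and structure chosen purely for illustration:

```python
import copy

# Sketch: dropping a branch of a classification tree onto a user's menu
# copies that subtree into the user's preferences. Names are illustrative.

genres = {
    "Flamenco": {
        "Cantes Flamencos": {"Soleá": {}, "Bulerías": {}, "Alegrías": {}},
        "Toques": {},
    }
}

def drop_branch(source, path, user_menu):
    """Copy the subtree at `path` into the user's menu under its branch name."""
    branch = source
    for key in path:
        branch = branch[key]
    user_menu[path[-1]] = copy.deepcopy(branch)

menu = {}
drop_branch(genres, ["Flamenco", "Cantes Flamencos"], menu)
# menu now holds the 'Cantes Flamencos' leaves: Soleá, Bulerías, Alegrías
```

A deep copy keeps the user's menu independent of later edits to the master classification tree.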

Where subsequently selected, this may be used either to pull in relevant genre-oriented contextual information, or to limit other selections to this genre.

This also gives us some idea of the potential for the creation of -for example- individual exercises (notation) touching on every aspect of music theory, but integrated across the entire instrument and theory tool spectrum.

Population with User Preferences

Assuming a live video chat session, sharing one's environment with others will ultimately be as simple as dragging individual menu items (or indeed entire menus) over a new learner's avatar. This simply mirrors the way menus are populated in the first place: only the source is different.

Friday, September 15, 2017

The 12-tone, equal temperament music system is just one of many in use worldwide, but its tonal homogeneity, its compatibility across octave boundaries (the ease with which a wide range of instruments can play together) and the vast and extremely accessible pool of teaching material have helped it to more or less worldwide commercial dominance.

The world's 'other' music systems -treasures in their own right- are barely acknowledged, and poorly understood.

Using data visualization techniques, could awareness of alternative, experimental and world music systems be raised?

In a post-singularity, free-time society, could we fuel a surge of interest in comparative musicology and ethnomusicology?

Some may need to be extended, but all the technology components lie to hand.


World Music Systems

In world music, different conventions apply: other temperaments and intonations, number of notes per octave, scales, modes and tonal (read 'cultural') spaces.

Though compatibility can on several levels be severely compromised, this is a vast and (for ‘western’ ears) often untapped field of exploration.

We lack on the one hand the cultural experience and exposure to truly appreciate the qualities of much of this music, and on the other, the comparative musicology tools to relate them back to what we do understand.

With our plans for a world music visualization aggregator platform, this is about to change.

World Music Diversity

Beyond already diverse world music instrument base configurations, musical diversity really takes off - in the form of tunings and musical scales. There are many thousands of each.

On a lute-like instrument, for example, tunings applicable to a low-order number of courses find reapplication each time a string is added to the configuration. In this sense we can immediately begin to think of hierarchy and reuse.

Complicating this is that what may be (say) a 'B' tone in one culture may go by a different name, pitch and frequency in another. In this sense, we need to abstract down to core, measurable physical qualities.
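The most culture-neutral of those physical qualities is frequency in Hz. A hedged sketch (reference pitch and step counts are illustrative assumptions) of how a named note in an equal-tempered system reduces to a frequency:

```python
# Sketch: abstracting note names down to frequency (Hz) gives a
# culture-neutral comparison point. Reference values are illustrative.

def et_frequency(ref_hz, steps, notes_per_octave=12):
    """Frequency of a note `steps` equal-tempered steps above a reference."""
    return ref_hz * 2 ** (steps / notes_per_octave)

b_western = et_frequency(440.0, 2)        # B above A4 in 12-TET, ≈ 493.88 Hz
b_24tet   = et_frequency(440.0, 4, 24)    # the same interval in 24-TET
b_quarter = et_frequency(440.0, 3, 24)    # a quarter-tone lower, ≈ 479.82 Hz
```

Two systems may give the 'same' note name to b_western and b_quarter; only the measured frequency tells them apart.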

Much the same applies to musical scales. To get an idea of the scope, we might try to gauge how many musical scales are in use worldwide.

The 1490 scales known to 'western' or classical music alone represent only around 1% of the musical terrain. Yet this is the fruit of just one of perhaps hundreds of musical systems in use worldwide.

Brave souls on Quora looked at this ballpark in a little more detail, but could still only give vague guesstimates. The fact is, we simply don't know.

Indigenous Turkish, Arabic, Persian and Indian music, like many other world music systems, are microtonal in nature. While often sharing the concepts of scales and modes, their intervals, note frequencies and the laws determining them are a world unto themselves.

Turkey has more than 100 makams or maqams (microtonal scales) based on a system of just intonation.

The Persian Dastgah system has some overlap with Maqam (namely Rast) but otherwise on the whole the tonalities used differ.

Arabic makams or maqams again share an affinity with Rast, but are anchored in a 24-tone (i.e. microtonal) equal temperament system.

The Indian raga is shared by both North Indian (Hindustani) and South Indian (Carnatic) music traditions, and is based on an octave of 1200 cents divided into 22 srutis, or microintervals.

Musical Innovation

Experimental music stretches such envelopes yet further. Musical interfaces of the future will allow much greater (possibly even shape-changing) configuration flexibility, enhanced interfacing sensitivity, tonal variety and new modes of interaction, opening up a galaxy of new possibilities.

With growing experience, the tonal system on which a piece of music is based may, for dramatic effect, even be allowed to change with time.

Music is undergoing an explosion of experiment and development - entirely paralleling that of science and technology in its depth and reach.

It's helpful in this context to see the various instrument configurations and underlying music systems as respectively nodes and vectors in a musical continuum.

With its roots in mathematics -which lends itself to algorithm- artificial intelligence and machine learning can be expected to bring both new freedoms and new challenges. In modelling its parts, I hope we become more familiar with the whole.

Comparative Musicology And Ethnomusicology

Notation is simply a base for tonal mappings, one notation source sometimes relevant to several tuning systems. That is to say that (for example) instruments tuned to a system other than the western equal temperament can in some cases still be played while referring to a standard western score.
Concepts close to this, such as so-called transnotation, though long debated by ethnomusicologists, have yet to find their way into musicians' toolsets.

Change is, however, imminent. Tonal systems -whether on an instrument model or on the basis of a theory tool (an abstraction)- can increasingly be visually compared. This draws us into the world of comparative, or 'what if?' musicology, but also ethnomusicology.

There are often many ways of visualizing these tonal differences: waveforms, relative note positions on a linear or logarithmic scale, chromatic circle or circle of 5ths, frequency scatterplots, coordinate systems, chromatic helix, and so on. The tools have to date been short of one critical component: a system to allow their automatic association with scores, and for them to be exchanged at will. The concept (and demo) exist: the idea simply needs financing.


Indeed, with many graphical theory representations available, we have the opportunity to compare theory tools, side-by-side on the screen.

Music theory is only one side of the story: we have the possibility of comparison not just in an abstract theoretical sense, but in the sense of practical impact on instrumental use.

At a simple level we may wish to compare the impact of various equal temperament guitar tunings on fingering (and hence, ultimately, tension, timbre and dynamics).

We can also answer tricky musical questions - visually. Could a just-intoned Turkish bağlama or saz be played alongside a 12-tone equal-temperament instrument, such as a clarinet or an African kora? Just how far apart, tonally, are the notes? What happens if we move to a higher octave?
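"How far apart, tonally, are the notes?" can be answered numerically before it is answered visually. As a sketch only: the 5-limit just intonation ratios below are one common choice, standing in for whatever tuning the bağlama actually uses, compared degree by degree against 12-tone equal temperament.

```python
import math

# Sketch: per-degree deviation of a just-intoned scale from 12-TET, in cents.
# The 5-limit major scale ratios are an illustrative choice of tuning.

JUST = {"C": 1/1, "D": 9/8, "E": 5/4, "F": 4/3, "G": 3/2, "A": 5/3, "B": 15/8}
TET_STEPS = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def deviation_cents(note):
    """How many cents the just-intoned note sits from its 12-TET neighbour."""
    just_cents = 1200 * math.log2(JUST[note])
    return just_cents - 100 * TET_STEPS[note]

for note in JUST:
    print(f"{note}: {deviation_cents(note):+.1f} cents")
# E, A and B come out noticeably flat of their 12-TET neighbours; moving up
# an octave changes nothing, since both systems repeat exactly at 2/1.
```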

Going further afield, how about comparing the note namings, frequencies and intervals of the Arabic ney with those of the traditional Chinese dizi (transverse flute) or its modern counterpart, the xindi?

This would also be helpful to an established player of one instrument curious to know which other instruments (and especially timbres) would be available at little or no learning overhead. In this way, a search based on the violin's musical configuration characteristics would reveal that the banjo, bouzouki and mandolin are very similar, all sharing the same layout and pitch classes, if not necessarily the same scale length or octave range.

Answers to questions like these could be approached visually in a multitude of ways.

At the End of the Day

The possibilities opened by music visualization go way beyond instruments and theory tools, covering everything from (for example) graphical overviews of musical compatibility, evolutionary tracking diagrams, cross-cultural influences, to musical development under migration. The only limitations are imagination and time, both of which will be in surfeit in a free-time world.

Moreover, with time, data collection relating to these developments could theoretically be automated, leading to running updates to such animations.

If the fuel on this journey is data, then the vehicle is a world music aggregator platform. If well founded, I suspect this initiative has a vibrant future. Somewhat surprisingly, nothing like it has been attempted before.

It's clear there are mobile device bottlenecks (speed) and security issues (SVG is scriptable), and the end product (and hence its value) is relatively easily hijacked - even if just as captured video. Nevertheless, I am still at a loss as to why there has been no movement. This will now change.

Saturday, September 9, 2017

Musical virtuosity: the ability to intuitively, freely and playfully navigate a genre's entire musical context (scales, their associated modes, characteristic ornaments and emotional dynamics) in a way that remains true to the expectations of a knowledgeable, genre-native listener.

Given the multiple, layered challenges, it's little wonder learning an instrument is acknowledged as one of the few really effective brain training strategies.

Musical Visualisation

When learning to play a musical instrument, we currently engage three learning modes:

ear: directly replicating pitch, dynamics and tension as heard

touch: tactile or gesture navigation, i.e. how we interact with instruments and their ‘user interfaces’

sight: reading and interpreting music notation, but also interacting with other visual media, whether static theory diagrams, video, or incidentals such as fingering charts

It is the last of these that is of particular interest here. Without the integration, synchronization and animation of the full range of world music instruments and theory models, it will never realize its full teaching and learning potential.

instrument models and theory tools directly linked by shared configuration parameters, so that simple 'what-if' changes to one can instantly be reflected in the other - across the entire spectrum of world music systems.


wider music-cultural (modal) landscapes can be explored, where, for example, a specific melody might be visualised in the context of all possible modes of its native tonal system.

comparative world musicology (visual comparison of the musical characteristics -intervals, modes and other musical building blocks- of widely differing musical cultures) will lie within anyone's reach.

in place of the fixed fingerings of conventional scores, the opportunity to associate (map) a variety of instrument fingerings and elements of style to notation and/or instrument models.

every exercise or piece of music acts as driver for a slew of immersive study across multiple tools, contexts and instruments.

opportunities for synergies with other (non-musical) fields of study, such as mathematics, psychophysics, psychology and -in the widest sense- the visual arts.

structural music analysis as an aid to learning prioritization: which parts to practice most, where one can rely on repetition, where octavization is required to keep notes within the tonal boundaries of a given instrument, where 'best fit' tonal or chord alternatives can be found - and so on.

color, tonal/timbral and synchronization consistency across the entire spectrum of notation, instrument (finger- or keyboard) roadmap, and theory tool.

while recognizing that much world music is purely aural (no written notation) and/or oral (sung) by nature, a strong impetus is provided for the extension of exchange formats such as MusicXML to include music systems and cultures not currently represented.

A Roadmap to Virtuosity

There is no such thing as invention in isolation from all else going on around us. As part of the preparation for this project, we gathered a huge range of openly available example information on instrument and theory tool modeling. The vehicle chosen for this was Pinterest.

These are simply intended to provoke thought about what might be missing. If you spend any time at all amongst these, you will agree there is *vast* potential for social value generation.

I see this as fostering, in its own way, both diversity and virtuosity. As shown in the diagram, however, there is another dimension: that of peer-to-peer connectivity. This is dealt with in a separate post. Can you find it? :-)

Thursday, September 7, 2017

Only a tiny proportion of musical instruments have an interactive internet presence. Apart from modeling them consistently and economically in the browser, there is the challenge of their efficient and simple retrieval.

Driving instrument development are the dynamic, tonal and timbral needs of instrumentalists. Those of a folk instrumentalist can be very different from those of a classical player. Whether fast and reactive, high or low in pitch, percussive, loud, brash, mellow and smooth or with wide tonal range, each instrumentalist's demands lead to different construction forms.


Just as the simple shepherd's whistle -progressively equipped with additional holes and levers- developed towards the various forms of sophisticated modern flute, to be of lasting use, our storage mechanisms need to reflect the progressive and continuing refinement at the heart of instrument development. This implies hierarchy, reuse and intuitive extension.

Could the Hornbostel-Sachs musical instrument classification system, taken together with each instrument's purely musical configuration, provide a framework for online instrument model storage?

The approach holds clear promise for accommodating instrument modeling diversity across online teaching and learning environments - but just how might it work?

Musical Instrument Classification

Several classification systems for musical instruments exist, chief amongst them perhaps that of Hornbostel-Sachs.

This widely-used system splits musical instruments into families based on their tone-producing (and hence construction) characteristics, or 'form'.

Though challenged by more recent (so to say 'abstract') instrument interfaces, Hornbostel-Sachs is a good fit for the conventional instruments in widespread use in social or community music, dance and online, person-to-person teaching.

Chordophones (stringed instruments), for example, are split into various sub-families such as zithers, harps, lutes and their hybrids.

The 'leaves' on this instrument tree are specific instruments. Some have a unique index, while -perhaps as a result of parallel development across geographically separate cultures- others share an index.

This hierarchy represents not just a highly structured tree of instrument construction (form) definitions; its nodes are also the perfect place to store details of the associated musical configurations ('function').

Because instrument model configuration is a layered and strictly sequential process, the hierarchy of "function" nodes is a wholly predictable product. Across all variants, these too form (sub-)tree structures based on key-value pair nodes.

We could of course just implement each configuration set as a single, monolithic node, but this would impede visual selection (from the classification tree) of multiple instruments sharing the same characteristics.

Say I play fiddle/violin and want to see how other 4-stringed lute family instruments would behave under the same score. With layered nodes, selecting the node representing 4 strings would immediately identify instruments such as the banjo, ukulele, mandolin, bass and bouzouki, possibly making them available for drag-and-drop population of my environment's menus.

With a single monolithic node per configuration, I would instead be obliged to do a database search.
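The layered case can be sketched directly. The node labels and instrument placements below are illustrative only: a shared characteristic node (here, four courses) makes every instrument beneath it selectable in one step.

```python
# Sketch: layered 'function' nodes make shared characteristics selectable.
# Selecting the node for four courses yields every instrument beneath it,
# with no database search. Labels and structure are illustrative.

lutes = {
    "C4": {                                    # four courses
        "violin": {}, "banjo": {}, "mandolin": {}, "bouzouki": {},
    },
    "C6": {"guitar": {}},                      # six courses
}

def instruments_under(tree, node):
    """All leaves beneath a given characteristic node, sorted by name."""
    return sorted(tree.get(node, {}))

instruments_under(lutes, "C4")
# → ['banjo', 'bouzouki', 'mandolin', 'violin']
```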

Using a text-based file format such as JSON, such definitions are straightforward to create, store, access, reconstruct and manipulate.

Here (in a screenshot taken from the code editor 'Sublime Text') is an example of such a JSON file applied to the classification of stringed instruments (chordophones), but with the details 'collapsed'.

JSON is widely supported by development tools, allowing the same principles to be applied to other classification hierarchies.

User-set variables such as tunings are best catalogued and mapped-to separately as user preferences.

Most front-end frameworks rely on non-semantic routing and hence require special measures to visualise both data and routes.

In our case, and as seen in the screenshots above, the instrument and routing hierarchy are one and the same, meaning a single visualization library provides the means for the data tree to be interrogated, browsed, added to and if necessary rearranged.

In place of some predetermined view returned by routed query, we have instrument-specific JSON configuration data returned by static URL, and used to build the instrument directly from its parent (instrument family's) generic model.
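A minimal sketch of that last step, assuming (illustratively) a URL path mirroring the classification tree and a handful of made-up configuration fields:

```python
import json

# Sketch: instrument-specific configuration fetched from a static URL path,
# then overlaid on the family's generic model. URL and fields are illustrative.

GENERIC_LUTE = {"strings": True, "fretted": True, "courses": None}

def build_instrument(generic, config_json):
    """Overlay instrument-specific JSON config on the generic family model."""
    instrument = dict(generic)
    instrument.update(json.loads(config_json))
    return instrument

# What a GET on e.g. /instruments/321/322/L66_TE_N12_C6_S1.json might return:
payload = '{"courses": 6, "scale_cm": 66, "notes_per_octave": 12}'
guitar = build_instrument(GENERIC_LUTE, payload)
# guitar["courses"] → 6; generic defaults such as "fretted" are inherited
```

Because the URL is static, no routing logic is needed: the path into the classification tree is the query.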

For a multi-instrumentalist, this may in sum (and as hinted at in the illustration to the left) mean: