Description/Abstract

Musicologists must rely upon an extraordinarily heterogeneous body of primary and secondary research sources, even when conducting the most basic exploratory research. Although increasingly available online, data are nevertheless routinely catalogued or stored in numerous discrete databases according to media type (text, image, video, audio) and historical period (contemporary literature/sources, historical literature/sources). Most musicological research, however, cuts across these artificial divisions: researching Monteverdi’s madrigals, for example, could involve performing essentially the same search several times, because several data sources are relevant (RISM, Grove, Naxos, RILM, the BL Integrated Catalogue and the BL Sound Archive). The musicSpace project integrates access to musicological data sources through a single search interface, removing the need for repeated searches and reducing inefficiency. The vast increase in readily available data that comes with database integration both demands and allows the development of far more sophisticated, intelligent and interactive user interfaces. Accordingly, musicSpace facilitates searching and encourages browsing by displaying search results and parameters in multiple panes, allowing instantaneous paradigmatic shifts in search focus, and employing a detailed subject ontology to enable the semi-automatic construction of complex searches. In this paper we present the musicSpace explorer interface and demonstrate its efficacy. We describe the key technologies behind musicSpace and reflect on its performance and scalability. In particular, we outline how we will evaluate the system in real research use, including a longitudinal study assessing the impact of this integrated approach on artefact discovery and research-query support.
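To make the integration idea concrete, the following is a minimal sketch of fanning a single query out to several catalogues and merging the hits. The source names, record fields, and de-duplication key are illustrative stand-ins only, not the musicSpace project's actual APIs or data model:

```python
def search_all(query, sources):
    """Run one query against every catalogue and merge the hits,
    de-duplicating records that share a (title, composer) key.
    'sources' maps a catalogue name to a list of record dicts."""
    seen = set()
    merged = []
    for name, catalogue in sources.items():
        for record in catalogue:
            if query.lower() in record["title"].lower():
                key = (record["title"].lower(), record["composer"].lower())
                if key not in seen:          # skip records already found
                    seen.add(key)            # in an earlier catalogue
                    merged.append({**record, "source": name})
    return merged

# Mock in-memory catalogues standing in for sources such as RISM and Grove.
sources = {
    "RISM": [{"title": "Madrigals, Book 8", "composer": "Monteverdi"}],
    "Grove": [{"title": "Madrigals, Book 8", "composer": "Monteverdi"},
              {"title": "Madrigals, Book 5", "composer": "Monteverdi"}],
}

hits = search_all("madrigals", sources)
# The duplicate Book 8 entry is collapsed, leaving two distinct records.
```

A production system would of course query live databases and need a far more robust record-matching strategy than exact title/composer equality; the point here is only that one user query replaces several manual ones.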