
What is the Music Ontology?
The Music Ontology (http://musicontology.com/) is a formal framework for dealing with music-related information on the Semantic Web, including editorial, cultural and acoustic information. It provides a vocabulary for publishing and linking a wide range of music-related data on the Web. Music Ontology data can be published by anyone as part of a web site or an API and linked with existing data, thereby creating a music-related web of data. The Music Ontology is built on RDF, a technology developed by the W3C. RDF describes data as "triples" of subject, predicate and object, e.g. "this track" "is part of" "this album". RDF can be serialized in a number of ways. The Music Ontology is specified using OWL, which provides a set of constructs for describing domain models in RDF.
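The subject-predicate-object pattern above can be sketched in RDF/Turtle. As a hedged example (the track and album URIs are hypothetical placeholders; `mo:track` is the Music Ontology property relating a record to its tracks, i.e. the inverse direction of "is part of"):

```turtle
@prefix mo: <http://purl.org/ontology/mo/> .
@prefix dc: <http://purl.org/dc/elements/1.1/> .

# Hypothetical URIs for a track and the album it belongs to.
<http://example.org/track/1>
    a mo:Track ;                              # "this track" is a track
    dc:title "Example Track" .

<http://example.org/album/1>
    a mo:Record ;
    mo:track <http://example.org/track/1> .   # the album has this track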

Why is the Music Ontology based on RDF?
It would be very difficult to tackle the many competing requirements of the music domain with a stand-alone format. By using RDF, the Music Ontology gains a powerful extensibility mechanism, allowing Music-Ontology-based data to be mixed with claims made in any other RDF vocabulary. Instead of covering all music-related topics within the Music Ontology itself, its authors describe the basic topics and build on a larger framework, RDF, which lets them take advantage of work elsewhere on more specific description vocabularies. The various available RDF serializations also enable the Music Ontology to tackle a number of use cases, e.g. RDFa to embed music data in web pages, JSON-LD to provide such data as part of an API, or RDF/Turtle to exchange large databases of music information.
Which ontologies does the Music Ontology build on?
The Music Ontology builds on four main ontologies: FOAF, a vocabulary for describing people, groups of people and organisations; the Event Ontology, a vocabulary for describing events, from 'this performance happened on that date' to 'this is the chorus of that song'; the Timeline Ontology, a vocabulary for describing time intervals and instants on multiple (possibly related) timelines, e.g. an audio signal's timeline; and the FRBR ontology, a vocabulary for describing works, expressions, manifestations and items and their relationships, as defined by the Functional Requirements for Bibliographic Records.
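The extensibility mechanism can be illustrated by mixing Music Ontology and FOAF terms in one description. A minimal sketch, with a hypothetical artist URI:

```turtle
@prefix mo:   <http://purl.org/ontology/mo/> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .

# Hypothetical artist URI; Music Ontology and FOAF terms mix freely.
<http://example.org/artist/jane>
    a mo:MusicArtist ;                           # Music Ontology class
    foaf:name "Jane Doe" ;                       # FOAF property
    foaf:homepage <http://example.org/jane> .    # FOAF property
```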

The Timeline ontology
Temporal information is the first thing we want to express when dealing with music-related knowledge [1]. The Timeline ontology is built on top of two concepts defined in OWL-Time [8]: Interval and Instant, representing time intervals and instants respectively. The Timeline ontology defines a further concept, TimeLine, representing a coherent backbone for addressing temporal information. In this ontology, a single time line may have several coordinate systems for addressing time points and intervals on it. The ontology also defines a way to relate two time lines, through time line maps (TimeLineMap). When wrapping this ontology into the Music Ontology, its authors introduced several simplifications: they define one canonical coordinate system per type of time line, and they consider two main types of time lines, physical and relative (the time line of a track, for example). Fig. 1 shows how an instant on an audio signal time line and an interval on the universal time line can be represented using the Timeline ontology.

Figure 1. Describing an instant on an audio signal time line (at 3 seconds) and an interval on the universal time line (7 days starting on 26 October 2001, 12:00 UTC)
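The situation in Figure 1 can be sketched in Turtle. This is an illustrative reconstruction, not taken from the paper: the `:signal-timeline`, `:instant` and `:interval` URIs are hypothetical, and the property names (`tl:at`, `tl:beginsAtDateTime`, `tl:durationXSD`, `tl:universaltimeline`) follow the published Timeline ontology but should be checked against its specification:

```turtle
@prefix tl:  <http://purl.org/NET/c4dm/timeline.owl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix :    <http://example.org/> .   # hypothetical namespace

# An instant at 3 seconds on the (relative) time line of an audio signal.
:signal-timeline a tl:TimeLine .
:instant a tl:Instant ;
    tl:onTimeLine :signal-timeline ;
    tl:at "PT3S"^^xsd:duration .

# A 7-day interval on the universal time line,
# starting on 26 October 2001 at 12:00 UTC.
:interval a tl:Interval ;
    tl:onTimeLine tl:universaltimeline ;
    tl:beginsAtDateTime "2001-10-26T12:00:00Z"^^xsd:dateTime ;
    tl:durationXSD "P7D"^^xsd:duration .
```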

The Event ontology
This ontology is centered around the notion of an event, seen here as the way by which cognitive agents classify arbitrary time/space regions. This is essentially the view expressed by Allen and Ferguson: events are primarily linguistic or cognitive in nature; the world does not really contain events, rather, events are the way by which agents classify certain useful and relevant patterns of change. This ontology deals with the notion of reified events. It defines one main Event concept. An event may have a location, a time, active agents, factors and products, as depicted in fig. 2. In the Music Ontology we define an Event concept having a number of factors (such as a musical instrument), agents (such as a particular performer) and products (such as the physical sound that a performance produces) [1].

Figure 2. Overview of the Event ontology
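The agent/factor/product pattern of Figure 2 can be sketched as follows, using a performance as the event. The URIs are hypothetical; the `event:` properties are those of the published Event ontology:

```turtle
@prefix event: <http://purl.org/NET/c4dm/event.owl#> .
@prefix mo:    <http://purl.org/ontology/mo/> .
@prefix :      <http://example.org/> .   # hypothetical namespace

# A performance modelled as an event with an agent, a factor and a product.
:performance a mo:Performance ;   # mo:Performance is a kind of event
    event:agent   :jane ;         # an active agent: the performer
    event:factor  :guitar ;       # a factor: a musical instrument used
    event:product :sound .        # a product: the physical sound produced

:jane a mo:MusicArtist .
```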

A music production work flow using the Music Ontology
A full description of the Music Ontology can be found on the Music Ontology web site. I will try to explain the use of the Music Ontology using a simple music production work flow. First, let us describe the major components of this ontology. On top of FRBR, we define MusicalWork, an abstract musical creation; MusicalManifestation, which can be a Record or a Track among others; and MusicalItem, which can be a Stream, a particular CD, a particular vinyl, etc. We also define MusicArtist and MusicGroup (note that these particular concepts can be considered as defined classes: any person contributing to a musical event can be inferred to be a MusicArtist). Composition deals with the creation of a MusicalWork. Arrangement deals with an arrangement of a MusicalWork, and can have as a factor a MusicalWork, as an agent an Arranger (which, like any role in this ontology, can also be considered a defined class) and as a product a Score. Performance denotes a particular performance, and can have as factors a MusicalWork and a Score, a number of musical instruments and pieces of equipment, and as agents a number of musicians, sound engineers, conductors, listeners, etc. A Performance can have as a product another event: Sound, a physical sound. This sound may itself be a factor of a Recording, which may produce a Signal. This Signal can then be published as a MusicalManifestation. This leads to a work flow such as the one depicted in fig. 3.

Figure 3. Describing a music production work flow using the Music Ontology
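The event chain of Figure 3 can be sketched end to end in Turtle. The URIs are hypothetical; the class names match those in the text, the `event:` properties come from the Event ontology, and `mo:published_as` is the Music Ontology property relating a signal to its manifestation (worth checking against the specification):

```turtle
@prefix mo:    <http://purl.org/ontology/mo/> .
@prefix event: <http://purl.org/NET/c4dm/event.owl#> .
@prefix :      <http://example.org/> .   # hypothetical namespace

# Composition -> Work -> Performance -> Sound -> Recording -> Signal -> Track
:composition a mo:Composition ;
    event:product :work .            # composing produces the abstract work
:work a mo:MusicalWork .

:performance a mo:Performance ;
    event:factor  :work ;            # the work is performed
    event:product :sound .           # producing a physical sound
:sound a mo:Sound .

:recording a mo:Recording ;
    event:factor  :sound ;           # the sound is recorded
    event:product :signal .          # producing a signal
:signal a mo:Signal ;
    mo:published_as :track .         # the signal published as a manifestation
:track a mo:Track .
```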

Well-known Music Ontology applications
The Zitgist (http://zitgist.com/) project converted the Musicbrainz metadata repository to RDF, using the Music Ontology. Every resource in this repository is dereferenceable, and it is therefore easy to link a particular track to relevant metadata (by simply stating that this track is the same as that track in the Musicbrainz RDF dump). The DBTune (http://raimond.me.uk/dbtune/) project aims at publishing and interlinking several Creative Commons music repositories, in order to provide URIs and allow everyone to share annotations, features, etc. on the available tracks. Foafing-the-Music [2] describes user profiles using FOAF and relevant tracks using the Music Ontology. To do so, it implements a small web service to read the ID3 tags from an MP3 file, then uses that information to query other web services such as Musicbrainz and Amazon to aggregate information about the MP3.

References
[1] Y. Raimond, S. Abdallah, M. Sandler and F. Giasson, "The Music Ontology", in Proceedings of the International Conference on Music Information Retrieval (ISMIR), 2007.
[2] O. Celma, M. Ramirez and P. Herrera, "Foafing the Music: A Music Recommendation System Based on RSS Feeds and User Preferences", in Proceedings of the International Conference on Music Information Retrieval (ISMIR), 2005.