QueryBag: Using Different Sources For Querying Large Music Collections
|Title||QueryBag: Using Different Sources For Querying Large Music Collections|
|Publication Type||Conference Paper|
|Year of Publication||2009|
|Conference Name||Conference of the International Society for Music Information Retrieval (ISMIR), Demo session|
|Authors||Sordo, M., Celma, Ò., & Laurier, C.|
|Conference Start Date||26/10/2009|
|Conference Location||Kobe, Japan|
The rapid growth of digital music on the World Wide Web has led to increasing research interest in the field of Music Information Retrieval (MIR), which aims to organize this vast amount of digital music and make it easily accessible. In the context of music search, different alternatives have been proposed, ranging from content-based search (using acoustic features as a measure of similarity) to tag-based search (social tags, game-based tags, etc.).
In this demo, we present a lightweight yet visually attractive music search engine that takes into account both acoustic-based features and text-based information for querying a large music collection. We introduce the concept of the QueryBag: a (visual) bag into which users can drop audio files and tags, treated as objects in the application, in order to improve or customize their search results. For instance, a user might want to find songs similar to song A and song B that feature instrument C. By dragging these three objects into the QueryBag, the user defines a query combining all three elements, thus merging audio and metadata into a single query. The QueryBag concept allows the user to formulate complex queries in an intuitive fashion, and it is extensible to other music information sources (such as user profiles), not only tags and audio content.
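The combination of audio seeds and tags into one ranked query could be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the catalog structure, the precomputed similarity scores, and the equal weighting of audio and tag evidence are all assumptions made here for clarity.

```python
def query_bag(bag_songs, bag_tags, catalog):
    """Rank catalog tracks against a QueryBag of seed songs and tags.

    catalog maps each track id to a dict with:
      "tags": set of tags describing the track
      "sim":  precomputed audio-similarity scores to seed songs
    Weights and scoring are illustrative assumptions.
    """
    results = []
    for track, info in catalog.items():
        if track in bag_songs:
            continue  # do not return the seed songs themselves
        # Average audio similarity to all seed songs in the bag.
        audio = sum(info["sim"].get(s, 0.0) for s in bag_songs) / max(len(bag_songs), 1)
        # Fraction of requested tags the track matches.
        tag = len(bag_tags & info["tags"]) / max(len(bag_tags), 1)
        # Equal weighting of both evidence sources (an assumption).
        results.append((track, 0.5 * audio + 0.5 * tag))
    return sorted(results, key=lambda t: t[1], reverse=True)


# Toy example: find tracks similar to seeds "A" and "B" with a guitar.
catalog = {
    "t1": {"tags": {"guitar"}, "sim": {"A": 0.9, "B": 0.8}},
    "t2": {"tags": {"piano"},  "sim": {"A": 0.7, "B": 0.9}},
    "t3": {"tags": {"guitar"}, "sim": {"A": 0.2, "B": 0.1}},
}
ranked = query_bag({"A", "B"}, {"guitar"}, catalog)
```

In this toy run, track "t1" ranks first because it is both acoustically close to the seeds and matches the requested tag, which is the behavior the QueryBag interface aims to expose visually.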
Finally, ISMIR attendees will be able to try this feature and, within the same application, compare it with other classic query types: query-by-browsing and query-by-text.