Papers by Kristian Nymoen
The paper reports on the development and activities in the recently established fourMs lab (Music, Mind, Motion, Machines) at the University of Oslo, Norway. As a meeting place for researchers in music and informatics, the fourMs lab is centred around studies of basic issues in music cognition, machine learning and robotics.
The paper presents the interactive music system SoloJam, which allows a group of participants with little or no musical training to play together effectively in a “band-like” setting. It allows the participants to take turns playing solos made up of rhythmic pattern sequences. We identify the central requirement for enabling such participation as the decentralised, coherent circulation of solo-playing, to be realised by some form of intelligence within the devices used for participation.
1. Background: Carrying out research on music-related body movements involves working with different types of data (e.g. motion capture and sensor data) and media (i.e. audio, video), each having its own size, dimensions, speed, etc. While each of these data types and media has its own analytical tools and representation techniques, we see the need for developing more tools that ...
Journal of New Music Research, 2016
People tend to perceive many salient similarities between musical sound and body motion in musical experience, as can be seen in countless situations of music performance or listening, and as has been documented by a number of studies over the past couple of decades. The so-called motor theory of perception claims that these similarity relationships are deeply rooted in human cognitive faculties, and that people perceive and make sense of what they hear by mentally simulating the body motion thought to be involved in the making of sound. In this paper, we survey some basic theories of sound-motion similarity in music, in particular the motor theory perspective. We also present findings regarding sound-motion similarity in musical performance, in dance, in so-called sound-tracing (the spontaneous body motions people produce in tandem with musical sound), and in sonification, all in view of providing a broad basis for understanding sound-motion similarity in music.
Submitted for the International Computer Music Conference (ICMC2008), Belfast, Northern Ireland, Aug 1, 2008
The paper presents some challenges faced in developing an experimental setup for studying coarticulation in music-related body movements. This has included solutions for storing and synchronising motion capture, biosensor and MIDI data, and related audio and video files. The implementation is based on a multilayered Gesture Description Interchange Format (GDIF) structure, written to Sound Description Interchange Format (SDIF) files using the graphical programming environment Max/MSP.
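The core problem the abstract names, synchronising streams recorded at different rates (motion capture, biosensors, MIDI) onto one timeline, can be sketched in a few lines. This is a minimal, hypothetical illustration in pure Python, not the paper's GDIF/SDIF implementation; the stream names, sample values and 100 Hz frame rate are assumptions for the example.

```python
# Hypothetical sketch: time-aligning multi-rate data streams before storage.
# GDIF/SDIF structures are not modelled here; stream contents are illustrative.

def align_streams(streams, frame_rate=100.0):
    """Resample several (timestamp, value) streams onto a common
    fixed-rate timeline using nearest-sample lookup."""
    end = max(t for samples in streams.values() for t, _ in samples)
    n_frames = int(end * frame_rate) + 1
    aligned = {name: [] for name in streams}
    for name, samples in streams.items():
        for i in range(n_frames):
            t = i / frame_rate
            # nearest-neighbour: pick the sample whose timestamp is closest
            nearest = min(samples, key=lambda s: abs(s[0] - t))
            aligned[name].append(nearest[1])
    return aligned

mocap = [(0.00, 1.0), (0.01, 1.1), (0.02, 1.3)]   # dense 100 Hz positions
midi  = [(0.00, 60), (0.02, 64)]                   # sparse note events
frames = align_streams({"mocap": mocap, "midi": midi}, frame_rate=100.0)
```

Nearest-neighbour lookup is the simplest alignment choice; interpolation would be more appropriate for continuous sensor data, while event streams such as MIDI are usually held at their last value instead.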
Our research on music-related actions is based on the conviction that sensations of sound and body motion are inseparable in the production and perception of music. The expression "music-related actions" is here used to refer to chunks of combined sound and body motion, typically in the duration range of approximately 0.5 to 5 seconds. We believe that chunk-level music-related actions are highly significant for the experience of music, and we are presently working on establishing a database of music-related actions ...
Acta Acustica united with Acustica, 2010
In our own and other research on music-related actions, findings suggest that perceived action and sound are broken down into a series of chunks in people's minds when they perceive or imagine music. Chunks are here understood as holistically conceived and perceived fragments of action and sound, typically with durations in the 0.5 to 5 seconds range. There is also evidence suggesting the occurrence of coarticulation within these chunks, meaning the fusion of small-scale actions and sounds into more superordinate ...
Lecture Notes in Computer Science, 2012
This paper presents an experiment on sound tracing, that is, an observation study of how people relate motion to sound. 38 people were presented with 18 short sound segments and instructed to move their hands in the air while pretending that the sound was created by their hand motion. An advanced motion capture system was used to record the positions of the participants' hands. We have identified several relationships between sound and motion which are present in the majority of the subjects. A clear distinction was found in onset acceleration for movements to sounds with an impulsive dynamic envelope compared to non-impulsive sounds. Furthermore, movement in the vertical direction has been shown to be related to sound frequency, both in terms of spectral centroid and pitch. Moreover, significantly higher acceleration was observed for non-pitched sounds compared to pitched sounds. The analysis of the recorded data is presented as statistical comparisons between different sounds, between different subjects, and between different sound classes.
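The onset-acceleration distinction described above rests on estimating acceleration from sampled hand positions. A standard way to do this is a second-order finite difference; the sketch below is a hypothetical illustration with made-up position traces and a 100 Hz rate, not the experiment's actual data or analysis code.

```python
# Hypothetical sketch: acceleration from motion-capture positions via
# second-order finite differences. Traces and rate are illustrative only.

def acceleration(positions, dt):
    """Second-difference acceleration estimate for a 1-D position series."""
    return [
        (positions[i + 1] - 2 * positions[i] + positions[i - 1]) / dt**2
        for i in range(1, len(positions) - 1)
    ]

dt = 0.01                                   # 100 Hz sampling interval
impulsive = [0.0, 0.0, 0.05, 0.15, 0.2]     # sudden onset of hand motion
smooth    = [0.0, 0.05, 0.1, 0.15, 0.2]     # near-constant-velocity motion

peak_impulsive = max(abs(a) for a in acceleration(impulsive, dt))
peak_smooth    = max(abs(a) for a in acceleration(smooth, dt))
```

On these toy traces the impulsive onset yields a far larger peak acceleration than the smooth one, which is the kind of feature contrast the study's statistical comparisons build on. In practice one would low-pass filter the positions first, since double differentiation amplifies measurement noise.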
8th International Gesture Workshop, Bielefeld, Feb 1, 2009
A central issue in the study of music-related gestures, both those of performers and those of listeners, is how we segment the stream of human movement and of sounds into somehow perceptually meaningful chunks. In this paper, we shall, against the background of our ongoing work in the Sensing Music-Related Actions project, present a model for chunking music-related gestures based on coarticulation. We can define coarticulation as the fusion of otherwise distinct events, meaning both action events and sound events, into larger and ...
ACM Transactions on Applied Perception, 2013
The Journal of the Acoustical Society of America, 2008
From our studies of sound-related movement (http://musicalgestures.uio.no), we have reason to believe that both sound-producing and sound-accompanying movements are centred around what we call goal-points, meaning certain salient events in the music such as downbeats, various accent types, or melodic peaks. In music performance, these goal-points are reflected in the positions and shapes of the performers' effectors (fingers, hands, arms, torso, etc.) at certain moments in time, similar to what is known as keyframes in ...
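A simple computational reading of goal-points is that they appear as local extrema in a movement or sound feature, for instance vertical hand height peaking at a melodic peak. The sketch below is a hypothetical illustration of that idea only; the signal is invented and this is not the authors' detection method.

```python
# Hypothetical sketch: candidate "goal-points" as local maxima in a 1-D
# movement feature (e.g. vertical hand height). Signal is illustrative.

def local_maxima(signal):
    """Indices where the signal is strictly higher than both neighbours."""
    return [
        i for i in range(1, len(signal) - 1)
        if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]
    ]

height = [0.1, 0.4, 0.9, 0.5, 0.3, 0.7, 0.2]   # sampled hand height
goal_points = local_maxima(height)              # indices of salient peaks
```

Real goal-point analysis would combine several features (accents, downbeats, effector shape) rather than a single peak-picking pass, but the keyframe analogy in the abstract maps naturally onto extrema like these.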
Recording music-related motions in ecologically valid situations can be challenging. We investigate the performance of three devices providing 3D acceleration data, namely the Axivity AX3, the iPhone 4s and a Wii controller, for tracking rhythmic motions. The devices are benchmarked against an infrared motion capture system, tested on both simple and complex music-related body motions, and evaluations are presented of the data quality and suitability for tracking music-related motions in real-world situations. The various systems represent different trade-offs with respect to data quality, user interface and physical attributes.
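Benchmarking a consumer accelerometer against a motion-capture reference typically reduces to comparing two time-aligned acceleration series, for example by root-mean-square error. The sketch below is an assumed, minimal illustration of that comparison; the sample values are placeholders, not measurements from the devices named in the abstract.

```python
# Hypothetical sketch: RMS error between a candidate sensor and a
# motion-capture reference on time-aligned samples. Values are placeholders.
import math

def rms_error(device, reference):
    """RMS deviation between two equal-length acceleration series."""
    assert len(device) == len(reference)
    return math.sqrt(
        sum((d - r) ** 2 for d, r in zip(device, reference)) / len(device)
    )

reference = [0.0, 1.0, 2.0, 1.0, 0.0]    # mocap-derived acceleration (m/s^2)
device    = [0.1, 0.9, 2.2, 1.1, -0.1]   # candidate sensor, same timeline

error = rms_error(device, reference)
```

In a real benchmark the two streams must first be resampled to a common rate and cross-correlated to remove clock offset, since the devices compared in the paper do not share a hardware clock with the mocap system.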
Proceedings of the 1st …, Jan 1, 2011