The ability to perceive noise and sound is of great importance for our everyday interaction with the environment. For example, auditory perception helps us to estimate distances and speeds, recognize obstacles and materials, and determine our own position in space [1–4]. In sports, acoustic signals, sounds, verbal agreements, and music are often used to synchronize and modulate movements. Sounds are produced by movement itself, e.g. when bouncing off a spring floor, hitting a ball, or when arms and legs strike the water, or they are generated deliberately, e.g. by the starting shot or by shouting in team sports. The volume of a sound is often causally related to the intensity of the movement: greater mechanical energy implies higher forces and stronger acceleration or deceleration, which in turn results in a louder sound.
Possibly due to this physical correlation between movement and sound, neurophysiological findings suggest a close relationship between the movement system and auditory brain areas. Several imaging studies have shown that noises or sounds produced by a known movement induce neuronal activation in the human brain that resembles the activation observed during execution of the action. This simulation can be observed especially in the mirror neuron system and has become known in recent years under the term "action listening" [5–8]. Furthermore, the authors of [9] showed in two experiments that rhythmic sounds generally activate the motor cortex in humans. The participants in experiment 1 knew in advance that they should clap according to a predetermined rhythm, whereas in experiment 2 they only heard the rhythmic sound without being asked to perform a movement (no action instruction). Under both conditions, the supplementary motor area, mid-premotor cortex (PMC), and the cerebellum were activated. It has also become clear that people are better able to recognize a sound pattern generated by their own actions than one generated by other persons' actions, and to assign it to themselves [10–13]. For auditory perception, therefore, a close perception-action link can be assumed in humans. Due to the intrinsic connection between sound and movement in space and time [14–17] and the neural connectivity described above, it seems reasonable to use auditory information to provide targeted and effective feedback for sports training and motor (re-)learning.
In research on motor behavior, many different approaches exist for artificially generating augmented auditory feedback (AAF). The following AAF methods have mainly been considered: natural movement sounds [18–21], error feedback [22–24], rhythmic auditory stimulation [25–28], sonification [29–35], and musical movement feedback [36–38]. AAF has been shown to be effective in a wide variety of application areas. There is evidence of efficacy in sports, e.g. rowing [39, 40], skiing [41], golf [42], cycling [43], and swimming [44], and also in movement rehabilitation, particularly for Parkinson's disease [45, 46] and stroke patients [47, 48].
So far, the choice among the aforementioned AAF methods and the mapping of acoustic parameters to specific movements seem to be based primarily on the characteristics of the movement or disease under investigation. For example, in gait rehabilitation for Parkinson's patients [49], rhythmic auditory stimulation has been investigated above all, since walking is an intrinsically rhythmic and repetitive movement. For movements with more degrees of freedom, such as attack-and-release actions (e.g., grasping), real-time movement sonification or musical sonification has been used more frequently [44, 45].
Movement sonification means the transformation of kinematic human motion data into sound, resulting in multidimensional motion acoustics. So far, research on gait sonification has mainly considered timbre and pitch [50–53], rhythm [54–58], and tempo [59, 60]. To the best of our knowledge, although relationships between volume and distance [61], object size [2, 62, 63], direction and speed of movement [64, 65], and articulatory kinematics [66] are known from other research areas, volume has hardly been taken into account in (gait) sonification. Given these known correlations, however, volume could be an easy-to-use parameter, for example, to provide targeted, well-shaped auditory feedback to rehabilitation patients with an asymmetrical gait (stroke patients, patients after unilateral arthroplasty).
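To make the principle of movement sonification concrete, the following Python snippet is a minimal sketch (not the system used in this study) of how a single ground-contact event with a measured kinematic intensity could be mapped to the volume and pitch of a short tone; the function name, input normalization, and mapping ranges are hypothetical illustrations.

```python
import numpy as np

SAMPLE_RATE = 44100  # audio sampling rate in Hz


def sonify_ground_contact(step_intensity, base_freq=150.0, duration=0.15):
    """Map one ground-contact event to a short stereo tone (sketch).

    step_intensity: kinematic intensity of the step, normalized to 0..1
                    (hypothetical input, e.g. peak foot-strike acceleration).
    base_freq:      base pitch of the sonification in Hz.
    duration:       tone length in seconds.
    """
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    # More intense steps -> higher amplitude (one simple linear mapping of many).
    amplitude = 0.2 + 0.8 * step_intensity
    # Optional pitch mapping: more intense steps raise the pitch slightly.
    freq = base_freq * (1.0 + 0.5 * step_intensity)
    # Exponential decay gives a percussive, footstep-like envelope.
    envelope = np.exp(-t / (duration / 4.0))
    tone = amplitude * envelope * np.sin(2.0 * np.pi * freq * t)
    # Return identical left/right channels; per-channel gains can be applied later.
    return np.stack([tone, tone], axis=1)
```

Other mappings (e.g., continuous sonification of joint angles rather than discrete contact events) are equally possible; the sketch only illustrates the general data-to-sound transformation.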
In a recent review paper, the authors of [67] point out that the question of "what auditory components and amount of information are most relevant for motor training and rehabilitation" has not yet been sufficiently investigated. Among other things, it is unclear what effect individual sound parameters (e.g., pitch, volume, timbre, tempo, rhythm) have on movement execution and motor control (cf. also [68]). However, knowledge of the concrete impact of the various sound parameters in AAF for different target groups would make the use of auditory feedback more purposeful and efficient in the future. This work aims to contribute to the clarification of the sound-parameter-motion relationship in AAF. For this purpose, we consider the parameters volume and pitch and their possible influence on the gait pattern of healthy young persons. These two parameters are taken into account since pitch and loudness perception are correlated due to the perceptual range of the human auditory system: we hear sounds loudest at frequencies between 2000 and 4000 Hz, and sounds below or above this range are perceived as quieter at the same sound pressure level [69]. Furthermore, correlations between pitch and the range and direction of motion [14, 70–73] are well documented in the literature: a higher pitch is usually associated with greater height and velocity, which parallels the relationship described above for volume.
This study investigates how the volume of real-time sonification of ground contact, and its interaction with pitch, influences the gait pattern of healthy persons.
First, the overall volume was varied in 6 dB steps across three levels (loud: 0 dB, normal: −6 dB, quiet: −12 dB) to determine its influence on the participants' gait pattern (stride width, stride length, gait speed). Second, we hypothesized that an asymmetric loudness of the sonification influences the participants' gait symmetry; to test this, the volume difference between the right and left channels of the headphones was varied. Furthermore, to investigate whether pitch interacts with volume, the volume changes were applied in two groups (G1: n = 16, G2: n = 16) with different sonification pitches: G1 received a sound with a base frequency of 150–250 Hz, and G2 received a sound with a base frequency of 95–112 Hz.
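For illustration only, the sketch below shows how such volume conditions could be realized as channel gains applied to a footstep tone (e.g., one produced by the earlier `sonify_ground_contact` sketch). The dB-to-amplitude conversion follows the standard 20·log10 convention (a −6 dB step roughly halves the sound pressure); the condition values match the text, but the helper names and the magnitude of the left/right asymmetry are hypothetical assumptions, as the study text does not specify them here.

```python
import numpy as np


def db_to_gain(db):
    """Convert a level change in dB to a linear amplitude factor (20*log10 convention)."""
    return 10.0 ** (db / 20.0)


# Overall volume conditions as described in the text.
VOLUME_CONDITIONS_DB = {"loud": 0.0, "normal": -6.0, "quiet": -12.0}


def apply_condition(stereo_tone, overall_db=0.0, left_db=0.0, right_db=0.0):
    """Scale a stereo footstep tone (array of shape: samples x 2).

    overall_db:         attenuation applied to both channels (condition level).
    left_db / right_db: additional per-channel attenuation used to create the
                        asymmetric-loudness conditions (values hypothetical).
    """
    gains = np.array([db_to_gain(overall_db + left_db),
                      db_to_gain(overall_db + right_db)])
    return stereo_tone * gains  # broadcasts over the channel axis

# Example (assuming the earlier sketch): 'quiet' condition, left channel 6 dB softer.
# tone = sonify_ground_contact(0.7, base_freq=150.0)  # G1 base-frequency range starts at 150 Hz
# out = apply_condition(tone, overall_db=VOLUME_CONDITIONS_DB["quiet"], left_db=-6.0)
```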